Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
The Brazilian cs subreddit is just about the same. Every single day, AI doomer p…
rdc_obxglbx
what makes you sentient? Aren't you just synapses firing away? Do you have any i…
rdc_mdjpo2h
I'd like to see a robot remodel an old house....I think my job is safe for as lo…
ytc_UgwyxO2R1…
My dumbass quit to do a software development bootcamp last year and now i can't …
rdc_ljc57y0
Supported it right until he said nft. I'm fine with ai art. But nfts are straigh…
ytc_UgyaZDxZX…
People are really in denial in here, or haven’t truly understood what’s about to…
ytc_UgxM4SmB8…
Been using Pneumatic Workflow for project management; the conditional logic simp…
ytc_Ugxgghkmc…
I'm an artist i suck but i draw almost everyday and i dunno how people think my …
ytc_UgxY_LV14…
Comment
Though seriously, self-driving cars are going to save so many lives. The whole "Wouldn't it be better if someone just had a reaction?" argument is troublesome, because the idea is that it's more important that no one is guilty (if a human is driving and making fatal mistakes) than reducing harm overall (with self-driving cars that minimize damage somehow).
The reason I'm so against this is because people can be so fearful of self-driving cars, and they aren't going to be implemented unless people trust them, and people are still going to get hurt and killed on the roads because of human error.
youtube · AI Harm Incident · 2015-12-16T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugh8rhSAIlTrjHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj8lU9CWdFWf3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghFWNaMvDiVGngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgggdoqiWWgg1HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UggYKBs14QZPoHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi9xnzyNGEqq3gCoAEC","responsibility":"society","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugim4SKNBlRtfHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjXjm0R3slUzXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghxbQR1FcrERngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgipNWetGSuz7ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```