Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- Yes, it's different than inspiration. Because it's a freaking machine. It's just… (ytc_UgyMqKpIk…)
- a ps_op story m_de as a future sca_ego_t deli erately wana miss_se Ai for evil p… (ytc_UgxiLfSl7…)
- AI art is not inevitable. So much low quality images are out there that they are… (ytc_UgwKIbbDm…)
- AI only understands hate. It doesn’t understand any other human nature, it can … (rdc_kp133w6)
- I have this argument with a friend of mine ALL the time. The infustructure to ma… (ytr_UgyWkpgdA…)
- Even told my chatbot, "Be glad you're not human, we possess a evolutionary curse… (ytc_UgzArGtF6…)
- I just want to say here publically that i love our AI overlords and i definitely… (ytc_UgyZjiIWm…)
- Why do you think god has you in life or death, did he give you a contract that i… (ytr_Ugx1wkUBW…)
Comment
Im astonished that anyone would want to be a programmer for self-driving cars. You'll get sued for every single death caused by your programming. You're basically a murderer that planned to kill hundreds or thousands through your actions and decisions depicted here. Let me tell you what you should program: Never swerve, dont decide on "sacrificing" anyone. The end. You'd be responsible for every death that your programmed cars cause because you made a decision on morals when there is none that would be true. Only indecision is not murder.
youtube · AI Harm Incident · 2017-06-24T06:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugjp-pcf8PXx8HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgiIjTXIJ5B6wHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgicjfJMB8sNk3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugghpi3fGQwm63gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UghvDWbrWZGpnngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugjd8L8jNpzFtHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghC4VdT2HTcDHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugg4Bv06oKkXP3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghZnL5q_KLZvHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggPbviiDUEtwHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
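The "Look up by comment ID" step amounts to indexing this batch response by each object's `id` field. A minimal sketch, assuming the JSON schema shown above (four coding dimensions per comment); the helper name `index_by_id` is hypothetical, and the inline sample reuses two rows from the response:

```python
import json

# Two rows copied from the raw batch response above; each object codes one
# comment along four dimensions: responsibility, reasoning, policy, emotion.
raw = """
[
{"id":"ytc_Ugghpi3fGQwm63gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UghvDWbrWZGpnngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
"""

def index_by_id(batch_json: str) -> dict:
    """Parse a raw batch response and index each coding by comment ID."""
    return {row["id"]: row for row in json.loads(batch_json)}

codings = index_by_id(raw)
row = codings["ytc_Ugghpi3fGQwm63gCoAEC"]
print(row["responsibility"], row["emotion"])  # developer outrage
```

Under this assumption, the "Coding Result" table for a comment is just its row from this index rendered as a dimension/value table.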