Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I have a CS background and have studied the current state of AI a good amount an…" (ytc_UgzmtROuJ…)
- "I only used AI to make dumb stuff, like "Hello Kitty's gets execued by crucifixi…" (ytc_Ugx1GUcJ5…)
- "Google pretty much pushed them to be garbage then AI just finished them off. I d…" (rdc_nu71jt7)
- "They need to investigate the Halo series approach. AI based on an actual human's…" (ytc_Ugx8WoSZB…)
- "It's Anya's birthday next week and ChatGPT gave me some great ideas for a surpri…" (ytc_Ugwv6sxQM…)
- "as a sorta AI guy, said it before and I'll say it again, AI is a tool, and shoul…" (ytc_UgzOl_eb4…)
- "@twistedreality997 an impressional teenager would though and that almost did hap…" (ytr_UgyRq9_MI…)
- "Feel like the only reason Elon is trying to do this because someone else built a…" (ytc_UgwnViiY3…)
Comment
"The three young boys would eat their Doritos in a very specific way, so for the school protection system it looked like a gun. Unfortunately the AI decided to pull the trigger and killed the boys. Our team is investigating to improve the software. Despite the tragic event, it will remain a need to have these systems to improve our childrens safety to protect them from evil intruders."
youtube · 2026-01-03T21:2… · ♥ 36
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwak70_atFcKzOS7a94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzzmj-cfV0gRSZLFVB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwjqETRCJ-Orerg4bZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzBcr7HXE4bzy4NEXt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw40aSOE5_csVKBsU54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyTPo7wbrZ5MiE0FqV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxSNAxRs-OVl1MAztJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgxeKg0WasOM_7jxrM14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzfew05-5HQOrzGw3V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzHqjeApnIGvniaSJB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
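A raw response like the one above can be parsed and validated before the codings are stored. The sketch below is a minimal, hypothetical validator: the allowed category values per dimension are inferred from the samples visible on this page, and the actual codebook may define more categories.

```python
import json

# Allowed values per dimension — inferred from the visible samples on this
# page; the real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self"},
    "emotion": {"outrage", "indifference", "resignation", "fear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Skip anything that is not a dict with a comment ID.
        if not isinstance(row, dict) or "id" not in row:
            continue
        # Keep the row only if every dimension holds an allowed value.
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid
```

Dropping malformed rows (rather than raising) keeps one off-codebook label from discarding an entire batch; rejected IDs could instead be queued for re-coding.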