Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Very creepy episode. First you imagine AI would be fully developed in maybe 30 y…" (ytc_Ugyb2h4bn…)
- "Ai 'artists' need us, if we quit then they cant do what they think we do…" (ytc_UgzNXhJuI…)
- "If you prevent humanity from producing food or hunting you dont need no more mon…" (ytc_UgxDRb7Nc…)
- "What most people in modern western civilization will have to grapple with is tha…" (ytc_UgwEZChmR…)
- "\"They're a lot safer than human drivers.\" And you know what's a lot safer than …" (ytc_UgxTpWEkz…)
- "@zandrrlife the default for the top AI companies is increasingly large, expensiv…" (ytr_Ugz-xaGPm…)
- "Ai art looks ugly and generic as hell to me. And most importantly, it doesn't st…" (ytc_UgzFNwtoz…)
- "Because nobody's going to do anything about it. The government needs Boeing, Ama…" (ytr_UgxZmeNu4…)
Comment
As a 20 year old who’s more familiar with AI. It’s so easy for the mom and parents to blame AI, but never ask themselves “what behavior did I cause my child to seek love and comfort from an AI bot”. These older millennials wanna put them blame on everyone but themselves. I use the AI chat bots from time to time for entertainment and I started at 15, never once did I get attached like this boy did. Investigate the parental family. 😊
Source: youtube · AI Harm Incident · 2026-03-30T17:4… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxun81mUuFIcFVvc4V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxd9tBlAjHWGa0y96R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzdhQp8FPdYDVI898d4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy7-nHsJAAhxCQ5eiB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxS9JM-kYTl72glGHl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzWEFtxgpw0G5ctNPh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxDMrlfpmCi7K8_ORx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz8T-1w87gBfXHCWlx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy70gc-M9CouO0cQxB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyy_cI6u1KJn6kLFX14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
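Since the model returns one JSON array covering a whole batch, recovering the coding for a single comment is a matter of parsing the array and indexing it by `id`. The sketch below shows one way to do that, using two records copied from the raw response above; the variable names are illustrative, not part of the tool itself.

```python
import json

# Raw LLM response for a batch (two records excerpted from the response above).
raw_response = """
[
 {"id":"ytc_Ugxun81mUuFIcFVvc4V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugxd9tBlAjHWGa0y96R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
"""

# Build an index keyed on comment ID so any coded comment can be looked up directly.
records = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for the comment shown in the "Coding Result" table above.
code = records["ytc_Ugxun81mUuFIcFVvc4V4AaABAg"]
print(code["responsibility"], code["emotion"])  # user indifference
```

A real pipeline would also validate that each record carries all four dimensions (responsibility, reasoning, policy, emotion) before accepting the batch, since a malformed model response is easiest to catch at parse time.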