Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Parents: "The company should have known of the dangers of the AI chat bot". Yeah, and YOU should have known what you're kid was doing on the internet. That said, a suicide over a "virtual" relationship. Can you imagine what people in the Middle Ages (people who had tangible, unimaginable daily struggles) would think of this (insert appropriate word)? Well, I'm done. I have to go find someone to blame for my actions, seems to be contagious.
youtube AI Harm Incident 2025-11-28T18:3…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        deontological
Policy           none
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgxsWZfT1Y3LWjxG61B4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzkmM90ZS0O92qaSI94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugzxrd7ePKB8fdWQZ6Z4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwCBkSLmOPZR_S_Tj14AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw3ZGi9b0mP89JvNvF4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwCz5fJPInuA8iLI8J4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwjxKEsEAQCiN_FNzx4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwSl5xb9Z7Bk9gc8aJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwqYw028YeCXT5a3dx4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy2gnyWyuvEyps6kVx4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
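A raw response like the one above can be parsed and sanity-checked before its rows are stored as coding results. The sketch below is a minimal example, not the pipeline's actual code; the allowed value sets for each dimension are inferred only from the values visible in this response, so the real codebook may permit more.

```python
import json

# Vocabularies inferred from the response shown here (assumption: the
# real codebook may define additional allowed values per dimension).
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"liability", "ban", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject out-of-vocabulary values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}"
                )
    return records


# Usage with one record from the response above:
raw = (
    '[{"id":"ytc_UgwCz5fJPInuA8iLI8J4AaABAg","responsibility":"user",'
    '"reasoning":"deontological","policy":"none","emotion":"outrage"}]'
)
records = parse_coding_response(raw)
print(records[0]["responsibility"])  # → user
```

Validating against a fixed vocabulary catches the most common failure mode with LLM coders: a value outside the codebook (a typo, a paraphrase, or an invented label) surfaces as an error instead of silently entering the dataset.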