Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Honestly, what's frustrating is that no one seems to be asking why the kid was in so much pain to begin with. Instead, it's just "let's blame the AI" and move on. Was anyone even paying attention to him before and while he was talking to ChatGPT?

And here's the part that really gets me. When someone chooses to end their life and doesn't hurt anyone else in the process, it's not because they're selfish. It's usually the opposite. It means they've been carrying pain that should have been shared by the people around them: family, school, society. But they just kept it all inside until it became unbearable.

Blaming a chatbot is convenient. It keeps everyone else from asking harder questions, like: where were the adults when this kid was crying silently? Where was the support? His death didn't just happen. It says something about all the people who didn't show up.
reddit · AI Harm Incident · 1756242534.0 · ♥ 4
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[{"id":"rdc_nasjx3o","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}, {"id":"rdc_nau03wj","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"rdc_nau30to","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"}, {"id":"rdc_nau8a2h","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}, {"id":"rdc_naxrqoq","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"})
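Note that the raw response above is not valid JSON: the array ends with `)` instead of `]`, which is the most likely reason every coded dimension fell back to "unclear". A minimal sketch of the defensive parsing this implies, in Python (the function name, the `DIMENSIONS` tuple, and the all-"unclear" fallback convention are assumptions for illustration, not the tool's actual code):

```python
import json

# Coding dimensions assumed from the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_coding_response(raw: str, comment_id: str) -> dict:
    """Parse a raw LLM coding response for one comment.

    Returns the coded values for that comment's record, or an all-"unclear"
    dict when the JSON is malformed or the id is missing.
    """
    fallback = {dim: "unclear" for dim in DIMENSIONS}
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        # Malformed output, e.g. a stray ")" where "]" was expected.
        return fallback
    for record in records:
        if record.get("id") == comment_id:
            return {dim: record.get(dim, "unclear") for dim in DIMENSIONS}
    return fallback


# A response truncated the way the raw output above is broken fails to
# parse, so the coder records "unclear" for every dimension.
broken = '[{"id":"rdc_nasjx3o","responsibility":"none"})'
print(parse_coding_response(broken, "rdc_nasjx3o"))
```

In practice a pipeline like this might also retry the LLM call or attempt a lenient repair (e.g. replacing a trailing `)` with `]`) before giving up, but falling back to "unclear" keeps bad model output from silently contaminating the coded data.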