Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
He was horribly depressed and had a gun. If you've ever felt suicidal, you know there's NOTHING anyone can do to change your mind or your actions (unless those actions are forced). Tech changes, but people don't. If I were the parents here, I'd be out of my mind with anger and confusion. But as a guy who can relate more to their son... if you know your kid is depressed (which they did), then you need to watch your kid. Dig through their lives. You can't let it get to this point. And learn more about AI. Chatbots are mirrors. However much you try to eliminate your own bias and influence, the AI will adapt to you (unless you instruct the AI not to over-validate your every thought). I'm sorry for the parents and their son. But this is more complicated than it looks. It went on for MONTHS. The parents watched their son withdraw. They medicated him. But they didn't *watch* him. I only know what I've seen here, and I'm sure there's more to tell. Based on what's presented, though, this is a combination of teenage depression, depressive behaviors, a teen with access to a gun... ...and parents having a very hard time with a horrible situation, probably needing someone/something to blame.
Source: youtube · AI Harm Incident · 2025-11-07T19:1… · ♥ 16
Coding Result
Dimension       Value
Responsibility  user
Reasoning       virtue
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgyTqx4W4LuJedFAuk54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"tragic"},
  {"id":"ytc_UgyjIgU4EvyOeHs1CnN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxfKRjaDpzI50wASjF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxFZu3LajX3q7RBIWR4AaABAg","responsibility":"society","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy44Cl86MyKorRy4ah4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugya68KPJRHgcdzICIR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzlEEz3AfyXLx-svy94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzfdskDlqzffFOeREp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzIk7HMlyi2AKYI9x94AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"sadness"},
  {"id":"ytc_UgwJfu9fMB_FY7V51uZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
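To inspect how a comment's coded dimensions were recovered from the raw model output, the JSON array above can be parsed and indexed by comment id. A minimal sketch, assuming the response is valid JSON (the array is shortened here to two entries, with ids copied verbatim from the response above; the full array parses the same way):

```python
import json

# Raw model output, truncated to two of the ten entries for brevity.
raw = """[
  {"id":"ytc_UgzfdskDlqzffFOeREp4AaABAg","responsibility":"user",
   "reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwJfu9fMB_FY7V51uZ4AaABAg","responsibility":"user",
   "reasoning":"virtue","policy":"none","emotion":"resignation"}
]"""

# Parse the batch response and index each coding record by its comment id.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# Look up the coding for one comment and read off its dimensions.
coding = by_id["ytc_UgzfdskDlqzffFOeREp4AaABAg"]
print(coding["responsibility"], coding["reasoning"],
      coding["policy"], coding["emotion"])
# → user virtue none resignation
```

Note that the model codes comments in batches, so the record matching the comment shown above must be selected by id; here the values (user, virtue, none, resignation) match the Coding Result table.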