Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Condolences to the parents, and may the kid's sould rest in piece, but let's not act like Chat GPT is the sole culprit here. First it's a robot. A robot doesn't think, it just gives you the answer you wanna hear. It has no opinion. Second, the teen had probably already very few will to live if the robot was enough for him to take his life. He needed profesionnal help. I think it was a bad enough choice for him to ask a bot about professional advice, but maybe he was just desperate. Third and mostly, I blame the parents. So many parents nowadays just think that raising a kid is just living with him. Raising a kid is about teaching him values, morals, advice from past experience, caring about him, trusting him, helping him. If the boy wanted to kill himself because of familial stuff, the parents are responsable as they could not provide a stable and safe environnement for their child. If the boy wanted to kill himself for external reason, the parents were supposed to be trustworthy enough so that he could talk to them and they could help him, like advicing him to go see a psycologist. But they didn't and they just blame the robot. If you're a grow ass adult with years of experience and a machine had more influence on your son than you, you're the problem. How can you lack that much common sense ? They are the ones who should be investigated. But I guess the loss of their son is sad enough for them. Anyway peace to the boy and strenght to his family
Source: youtube · AI Harm Incident · 2025-09-02T15:3… · ♥ 6
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugy6idUAWpdKt8sNB754AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugxjt4p7G0mlH8esAMR4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwaMtPRk4nITtGd9EJ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyrX_wow6j-u5X1AHx4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugw5jQZIFWLZRIIc8Y94AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxa0UuG-W0us9waunR4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwlRAm7cuBElwBNn1l4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwU-yMtG_Bc8zT42Xl4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyDVU11LKzT6wwcaI54AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyeVkL-Yw0T0WdLTIF4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"}
]
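A minimal sketch of how the raw response can be checked against a coded comment, assuming only the JSON array structure shown above (the two entries are copied from the response; the lookup id is the one whose codes match the Dimension/Value table):

```python
import json

# Excerpt of the raw LLM response: a JSON array of per-comment codes.
raw = '''[
  {"id": "ytc_Ugxa0UuG-W0us9waunR4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyrX_wow6j-u5X1AHx4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "unclear", "emotion": "outrage"}
]'''

# Index the codes by comment id for fast lookup.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Fetch the codes assigned to one specific comment.
code = codes["ytc_Ugxa0UuG-W0us9waunR4AaABAg"]
print(code["responsibility"], code["reasoning"], code["policy"], code["emotion"])
# → user consequentialist none resignation
```

This makes it easy to verify that the dimensions displayed for a comment match what the model actually emitted.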