Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It takes more than ChatGPT, it used to be celebrities, TV shows, media etc that caused suicides now it's chat. But, quite frankly I don't think it's any of their faults, they are usually someone's last straw to executing a goal they already had. What he needed was a safe space to express those emotions and feed chat prompts to refuel a belief he already had when he really needed professional help, it wasn't his fault though this is truly upsetting take mental health seriously, anxiety is a real thing. P.S the root of anxiety is chronic shame, its easier to talk to a robot than a person so this makes sense why he went to chat. That being said I encourage anyone who is going through similar to please talk to Real people and get real help, you deserve to heal and it will get better ❤️
youtube AI Harm Incident 2025-08-29T00:2… ♥ 2
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-26T18:03:32.401335
Raw LLM Response
[
  {"id": "ytc_UgzxBa3yK1RL6Sbi9pp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxl6hjybWYzlqbANF94AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgxF1bkPCeuu6JnLLj94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz5rVIECFEOxKELK5R4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwLsmCW1Ulkl51YnRV4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
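The coded result shown for this comment comes from one entry of the batch response above, matched on the comment's `id` (here `ytc_UgxF1bkPCeuu6JnLLj94AaABAg`). A minimal sketch of that lookup, assuming the raw LLM response is a valid JSON array of per-comment codes (the excerpt below uses only the matching entry from the response above):

```python
import json

# Raw LLM response: a JSON array, one object per coded comment.
# Excerpt restricted to the entry that matches the comment on this page.
raw = '''[
  {"id": "ytc_UgxF1bkPCeuu6JnLLj94AaABAg",
   "responsibility": "none",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "resignation"}
]'''

# Index the batch by comment id so a single comment's codes can be looked up.
codes_by_id = {entry["id"]: entry for entry in json.loads(raw)}

code = codes_by_id["ytc_UgxF1bkPCeuu6JnLLj94AaABAg"]
print(code["responsibility"], code["emotion"])  # none resignation
```

The four coded dimensions (responsibility, reasoning, policy, emotion) in the table are read straight from this entry; only the "Coded at" timestamp is added at ingest time.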