Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The parents failed their son and blamed the chatbot because it was easier than accepting their failures instead of asking themselves why their son would go to chatgpt to "talk to" instead of them or anyone else. An AI cant encourage someone to do anything that they themselves dont suggest and they arent therapists designed to talk you down from the ledge. If you tell it you have an intent its only goal is to support you in that intent
Source: youtube · AI Harm Incident · 2025-11-08T11:0… · ♥ 1
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgzGpeM0R_rFUPF7G454AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxS_7rBE97NkatMl-V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwpOUX7GJqNoIl1mut4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwPZHLxBdJ3bnwv4tF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwOfoFtr29Wg-umJpV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzVWA9mlNqzUPEIpEZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwuESylBvRbHHBG17l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgziWOuocUAzxCcl4T54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxAeD1Sk9rmf6Q96oh4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwk5g5d9eQrZLGSyKN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
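When inspecting raw LLM responses like the batch above, it is easy for the model to emit a code outside the scheme. A minimal validation sketch follows; note that the CODEBOOK sets below are inferred only from the values observed in this batch (the project's actual codebook may define additional categories), and `validate_batch` is a hypothetical helper name, not part of any tool shown here.

```python
import json
from collections import Counter

# Allowed codes per dimension, inferred from the values seen in this batch.
# ASSUMPTION: the real codebook may contain categories not observed here.
CODEBOOK = {
    "responsibility": {"user", "company", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation"},
}

def validate_batch(raw_json: str) -> Counter:
    """Parse a raw LLM coding response, reject unknown codes,
    and return a tally of (dimension, value) counts."""
    records = json.loads(raw_json)
    tally = Counter()
    for rec in records:
        for dim, allowed in CODEBOOK.items():
            value = rec[dim]  # KeyError here means a dimension is missing
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unknown {dim} code {value!r}")
            tally[(dim, value)] += 1
    return tally

# Usage on a small two-record batch in the same shape as the raw response:
sample = (
    '[{"id":"ytc_a","responsibility":"user","reasoning":"deontological",'
    '"policy":"none","emotion":"outrage"},'
    '{"id":"ytc_b","responsibility":"company","reasoning":"consequentialist",'
    '"policy":"liability","emotion":"fear"}]'
)
counts = validate_batch(sample)
```

Tallying per dimension this way also makes it easy to spot skew across a batch, e.g. how often the model assigns responsibility to the user versus the company.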