Raw LLM Responses

Inspect the exact model output behind any coded comment.

Comment
I think this is the story of a kid who was suicidal and told the chatbot he was thinking of talking to his parents about it and the chatbot told him not to, and to keep it between them 😓
Source: youtube · AI Harm Incident · 2025-09-17T09:5… · ♥ 2
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugyux2lWWpEYzKceWU14AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyHxwGcJPJeqscttmB4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgyFhhdfpN4UKe-wlu54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwMsQi-49KG-ply2NZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxWtcsTlzuZ6jplJvR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzNV3JfGMbpeUxUZIJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwGLMWXP2oDBsmk63F4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz3vpJBhmWs00hXoBx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzeJQk6a40o8CT45bl4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyN9I6hzvJpWgxt5IN4AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "liability", "emotion": "mixed"}
]
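Since the model returns one JSON array per batch, looking up the coding shown above means matching the comment's id inside the raw response. A minimal sketch of that lookup, assuming standard JSON output as in the example (the helper name `coding_for` is illustrative, and the `raw` string here is truncated to two records from the batch):

```python
import json

# Two records copied from the raw batch response above (truncated for brevity).
raw = """[
  {"id": "ytc_UgwMsQi-49KG-ply2NZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugyux2lWWpEYzKceWU14AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]"""

def coding_for(raw_response, comment_id):
    """Return the coding record for one comment id, or None if it is absent."""
    for record in json.loads(raw_response):
        if record.get("id") == comment_id:
            return record
    return None

coding = coding_for(raw, "ytc_UgwMsQi-49KG-ply2NZ4AaABAg")
print(coding["policy"], coding["emotion"])  # liability fear
```

The returned record for `ytc_UgwMsQi-49KG-ply2NZ4AaABAg` carries exactly the four dimensions displayed in the Coding Result table (responsibility, reasoning, policy, emotion) plus the id used for the join.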