Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What I fail to understand is how ChatGPT encouraged him to commit suicide for over *4 hours* only to suddenly do a 180 and provide an emergency number to a professional councellor? 🫤
youtube AI Harm Incident 2025-11-08T20:2… ♥ 1
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        unclear
Policy           unclear
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzerQ3bntK0XBDw-nJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzMMvMZARrGxlm4DuF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx564uuG0MqlETK0Ot4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxtXVrCQCU9mIz0GIt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwdCWIRGX8rSUYENs94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyWdWzw1E0h7OaoJDJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw0HKBy_3XavmKOQFp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxo0khbuXwjDqdO35l4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxNSHUjRNI2b4AiYlV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxuUfarAZ4ypt6d-P14AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
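A coding result like the table above can be recovered from the raw response by parsing the JSON array and looking up the comment's id. The sketch below is a minimal, hypothetical illustration (the `coding_for` helper and the truncated two-entry sample are not part of this tool); it assumes the model returns a well-formed JSON array of objects keyed by `id`, as in the response shown here.

```python
import json

# Abbreviated sample of a raw LLM response: a JSON array of coding objects,
# one per comment id (values copied from the response above).
RAW_RESPONSE = """[
  {"id": "ytc_UgzerQ3bntK0XBDw-nJ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxtXVrCQCU9mIz0GIt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]"""


def coding_for(raw: str, comment_id: str):
    """Return the coding dict for one comment id, or None if the model
    did not emit an entry for that id."""
    for item in json.loads(raw):
        if item.get("id") == comment_id:
            return item
    return None


# Look up the coding for the comment shown on this page.
coding = coding_for(RAW_RESPONSE, "ytc_UgxtXVrCQCU9mIz0GIt4AaABAg")
print(coding)
```

A lookup like this also makes missing entries explicit: if the model skips a comment id, `coding_for` returns `None` rather than silently mis-assigning another row's values.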