Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I have personally been suicidal and talked to chat about my problems in life. It did the exact opposite and encouraged me not to off myself and constantly gives you the suicide hotline number by default anytime you even hint at suicide. I have logged many hours on ChatGPT, I'm not defending it, because it makes tons of mistakes, but something seems fishy with this story.
Source: YouTube · AI Harm Incident · 2025-11-13T21:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgzFzDJNes-wTMgE4V94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxMhJPtX-K5iIs8otF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyDtJwKSgiF1iASxrx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwjYTPTlC00IF_EhNd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx4-OuObFllpq8HvCt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"sadness"},
  {"id":"ytc_Ugxnh0-xCc5nATm_KhR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzT6LInApV6x-jsqs54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyS7IzmYh1GUMmWCZ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxTXDQSjC7yj5WpExN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgynILncXdH7HkO4T_t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
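A minimal sketch of how a batch response like the one above can be mapped back to individual comments: the raw LLM output is a JSON array of per-comment codes, so indexing it by `id` lets each comment's coding be looked up directly. The variable names here are illustrative, not part of any actual pipeline; only one record from the array is inlined for brevity.

```python
import json

# Raw LLM response: a JSON array of per-comment coding records
# (one record from the array above, shown as an example).
raw = '''[
  {"id": "ytc_UgyS7IzmYh1GUMmWCZ94AaABAg",
   "responsibility": "none",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "approval"}
]'''

codes = json.loads(raw)

# Index the batch by comment id so any comment's coding is a direct lookup.
by_id = {record["id"]: record for record in codes}

code = by_id["ytc_UgyS7IzmYh1GUMmWCZ94AaABAg"]
print(code["emotion"])        # matches the Emotion row in the Coding Result table
print(code["responsibility"]) # matches the Responsibility row
```

Because the model returns codes for the whole batch in one array, a lookup table like `by_id` is what lets a per-comment view (such as the Coding Result table above) be rendered from a single raw response.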