Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
They just want someone to blame. As someone who has had the same conversations with ChatGPT and thoughts for years, it does not tell you to commit suicide. Just like you can find resources and aide through a google search by bypassing its suicide message. ChatGPT just validates your feelings.
youtube AI Harm Incident 2025-08-26T22:2… ♥ 1
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T18:03:32.401335
Raw LLM Response
[
  {"id":"ytc_Ugy-BThO01oNuJxVliF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxZKdi8wrCFUMITRP14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgzZKWZbetO1FntzHqF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgylxKTDMGwT-GcusR54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxxtviPWb51-1GYDDd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
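The coding result shown above is one entry from this batched JSON array, keyed by comment id. A minimal sketch of how such a response could be parsed and looked up (the field names come from the response above; the parsing approach itself is an assumption, not the tool's actual implementation):

```python
import json

# Raw LLM response, truncated here to two of the entries shown above.
raw = """[
  {"id":"ytc_Ugy-BThO01oNuJxVliF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzZKWZbetO1FntzHqF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

# Index the batch by comment id so each comment's coding can be retrieved.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up the coding for the comment displayed above.
coding = codes["ytc_UgzZKWZbetO1FntzHqF4AaABAg"]
print(coding["responsibility"], coding["reasoning"], coding["emotion"])
```

This yields the same Responsibility/Reasoning/Emotion values reported in the Coding Result table (user, consequentialist, indifference).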