Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Meanwhile, this summer when I had a fever of 103.0 that went UP after taking Tylenol, I asked ChatGPT what I should do, and it said to go to the ER. I had sepsis. I would have brushed it off if I hadn't asked.
Source: YouTube · AI Harm Incident · 2025-11-25T04:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyH0VeJx3MQOc5yrol4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwkrJxjoeLZPhP91d94AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx1OcuAQtnJmo6Rmyd4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwXiarkpCghR9Vs1aF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx8H2OuuogQxsboMA14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwpy-oKXhr62Xps2y94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyMO9oHbNBJQw2XfMR4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwAiDJmJF_Gp2qOpd14AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyKkjPEFCvCfB2jJp54AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzmCjqgyGZ8YUQRWM94AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "regulate", "emotion": "fear"}
]
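The raw response is a JSON array of per-comment coding records, each carrying the four dimensions shown in the result table. A minimal sketch of how such a response could be parsed and matched back to a comment ID — the field names come from the array above; the `index_codings` helper is a hypothetical name, not part of the tool:

```python
import json

# A two-record excerpt of the raw model output above (same field layout).
RAW_RESPONSE = """[
  {"id": "ytc_UgwXiarkpCghR9Vs1aF4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzmCjqgyGZ8YUQRWM94AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "regulate", "emotion": "fear"}
]"""


def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and index coding records by comment id."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}


codings = index_codings(RAW_RESPONSE)
# Look up the coding for the comment displayed on this page.
rec = codings["ytc_UgwXiarkpCghR9Vs1aF4AaABAg"]
print(rec["emotion"])  # → approval
```

Indexing by `id` makes the lookup from a displayed comment to its coding a single dictionary access, which is how the result table for one comment can be rendered out of a batch response covering many.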