Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
In every single one of these cases, there was still a human driving, who could at any time steer, brake, or accelerate, overriding the automation system. Used properly, it should be at minimum no worse than no system at all.
Source: youtube · AI Harm Incident · 2024-12-15T00:2…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgzqEV4yT0aVnKybUuV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy05XBUcoOZd3pAL1p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxbiph27soTmH5kRfR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy2slWTY2HWoO4mp9d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzQ8D0k3UaIP4qECEd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyU0Idc7l0q4HbBwch4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwBMko-9JWHDnXlS514AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzGlFU3Pw7VlBtSnCJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgwQTz9powzUITbYhwh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyMQ80iEicXIiufX6N4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
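A raw response like the one above can be checked before it is stored: parse the JSON, then keep only records whose values fall inside the codebook. The sketch below is a minimal illustration, not the tool's actual pipeline; the allowed values in `CODEBOOK` and the helper `validate_codings` are assumptions inferred from the labels visible in this log, and the real codebook may contain additional categories.

```python
import json

# Allowed values per coding dimension (inferred from this log; assumed).
CODEBOOK = {
    "responsibility": {"user", "company", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"fear", "indifference", "outrage", "approval",
                "resignation", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in this log all carry a "ytc_" prefix.
        if not str(rec.get("id", "")).startswith("ytc_"):
            continue
        # Every dimension must be present with a codebook-approved value.
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example1","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"},'
       '{"id":"ytc_example2","responsibility":"robot",'
       '"reasoning":"unclear","policy":"unclear","emotion":"mixed"}]')
print(len(validate_codings(raw)))  # the second record fails validation
```

Records that fail validation (e.g. an off-codebook value such as "robot") are dropped rather than coerced, which keeps downstream tallies of the coded dimensions clean.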