Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
So, the best they could come up with was a guy who was told to pay attention 19 times because the system itself is not meant to be autonomous self driving and the guy didn't and died when you can see he had buckets of time to react. He didn't until he was right on top of the truck.
Source: YouTube · AI Harm Incident · 2025-09-04T19:0…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
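Each coded result has the same shape as one entry in the raw batch response below. The following is a minimal sketch of that record as a Python TypedDict, assuming string-typed fields; the values noted in the comments are only those observed in this batch and may not cover the full codebook.

from typing import TypedDict

class CodedComment(TypedDict):
    # Shape of one entry in the raw batch response shown below;
    # field names are taken directly from the model output.
    id: str              # comment identifier, e.g. "ytc_..."
    responsibility: str  # observed: user, company, ai_itself, distributed, none, unclear
    reasoning: str       # observed: deontological, consequentialist, unclear
    policy: str          # observed: liability, none, unclear
    emotion: str         # observed: outrage, fear, mixed, indifference, approval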
Raw LLM Response
[ {"id":"ytc_UgwuQRD2kupySFWdpTh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyfovMhdm3eLeQV9G14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgzXhs9F4VXJJur7V4d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_Ugxm7nng80gPtO_qWEx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgznIuALORolMMREhxh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgzRAWB1j9sIV36PwAB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugw4a-Xy9TqWSUF5xWB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgwnoKYJtEYMF6GUwbF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgwvKXyJ3tTKFOnqNpd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_Ugw1gx3jQu_InJLPGqp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"} ]