Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
How is it that in this hypothetical situation the car has not enough time to brake but enough time to accelerate into the back of the truck? Also I'm pretty certain self driving vehicles would be programmed to have strict following distances varying on what sort of vehicle you're behind.
youtube AI Harm Incident 2015-12-10T08:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugi6QnAsmjUnJ3gCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UghFvsP8eJwde3gCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UggeP-aEJVCsH3gCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgiCtG8B4yTOPXgCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UghRaFbj5Dr8UXgCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgifStBUPZbFQXgCoAEC", "responsibility": "user",        "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgjgWMosOf9y_3gCoAEC", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgjA1lZE8jvePngCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugj5Kvaeg5HsAngCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "regulate",  "emotion": "approval"},
  {"id": "ytc_UgibjtNUDEehjngCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "regulate",  "emotion": "approval"}
]
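A batch response like the one above can be matched back to individual comments by indexing the parsed records on their `id` field. The sketch below is illustrative only: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown, but the function and variable names are hypothetical and not part of any particular tool.

```python
import json

# Abbreviated raw batch response (two records from the array above).
raw_response = """
[
  {"id": "ytc_Ugi6QnAsmjUnJ3gCoAEC", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UghFvsP8eJwde3gCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and map each comment id to its coding."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
# Look up the coding for one comment by id.
print(codings["ytc_UghFvsP8eJwde3gCoAEC"]["emotion"])  # -> indifference
```

In practice the raw string may also need cleanup (stripping code fences or trailing text the model adds) before `json.loads` succeeds.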