Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Think of it in reverse. If we started out with robot cars, but then wanted to allow humans to drive them, the argument would be against all the deaths caused by human error.
youtube · AI Harm Incident · 2014-05-25T16:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference

Coded at: 2026-04-27T06:26:44.938723
Raw LLM Response
[ {"id":"ytc_UgjmT9M6pRPF63gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_Ugjb5mbCYWFOZngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugh3t6ctXcIqLngCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"}, {"id":"ytc_UgjhRXM2999pMXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgjWeooRvjbb43gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgiQxrJhhXFfdHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}, {"id":"ytc_UgjQXLSSJdVGungCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UggumOOE0wMPhXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgjQ6QxetkdMEHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugg5W5f566W6tngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"} ]