Raw LLM Responses

This view shows the exact model output behind each coded comment.

Comment
"What if a truck has an unstable load that falls on the road in front of you? Should the self-driving car prioritize your own life or the common good?" "A self driving car would probably keep itself a good distance away from such a truck as to avoid this scenario." "Ok, but what if a child was crossing the road? Would it save you or the child?" "It would probably have a better reaction time than I would, so regardless the chances of any fatalities are minimal." "What if a tree fell on the road? What if a meteor was about to destroy the highway? What if your mom is so fat that the road caves in?" People keep attempting to put self-driving/operating "whatevers" into increasingly ridiculous and improbable situations. Regardless of the situation, a self-driving car will almost always save more lives than it would ever endanger. Its the same thing with vaccines; it will save millions if not billions of people from needlessly dying, but one person in 10 million will receive fatal complications due to an allergy or something else.
Source: youtube · AI Harm Incident · 2015-12-08T18:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
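Each dimension takes a categorical label. The sketch below is a minimal sanity check built only from the label values that appear in this batch; the actual codebooks likely define additional values, so treat `OBSERVED_LABELS` as an illustrative assumption rather than the real schema.

```python
# Label sets observed in this batch for each coding dimension; the full
# codebooks may define additional values (assumption, not confirmed).
OBSERVED_LABELS = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "mixed", "approval", "fear", "outrage"},
}

def off_schema(record: dict) -> list[str]:
    """Return the dimensions whose value was not seen in this batch."""
    return [dim for dim, seen in OBSERVED_LABELS.items()
            if record.get(dim) not in seen]
```

A record like the one tabulated above would return an empty list; a typo such as "aproval" in the emotion field would be flagged.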
Raw LLM Response
[{"id":"ytc_UgizunohajILCHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgjjWOUDi8MzcHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgjQvTuYsrqOtngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgghF14lWrWg93gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UghSobsLJzKwTngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UghTsPIeRMcNT3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgiGpAhmNNMkf3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgjlPAxVCSrTmHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugju10Xr0tXdF3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugjc3KGPZNZyqngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}]