Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
+Xezarious42 This could be regulated by holding the driver responsible for the decisions the self driving car makes. Company B's cars would inherently be more prone to cause harm to others, even if they're slightly safer for the occupants. Occupants of company B's cars would thus suffer more liability on account of their car's decisions in the long run. The argument only works in a scenario where you could be severely hurt and would rather sacrifice someone else to minimize your harm. But you could never predict that if you buy company A's car it would save your life and company B's car will get you killed. Both companies would have good safety track records if self driving cars become common. Ultimately people will choose the car that is least likely to hurt someone or cause damage regardless of if it's self driving or not, because they're responsible for that damage.
Source: youtube · AI Harm Incident · 2015-12-08T23:0… · ♥ 3
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          liability
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgggitcG_CbrUXgCoAEC.87WDKCb8uB_87Wc8Git_dg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087ZUsAi0XrE","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgiwrXyY1rZ69XgCoAEC.87WAOkvgKe087Zk-pe2Tvs","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgjFbGyekK77fngCoAEC.87VyK9Y8dlO87WlJMEeWoL","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgiiYSCGtUOQQ3gCoAEC.87VxmXkagiW87W5Qy0QLaG","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UghlFJ0pJ4lt_ngCoAEC.87Vxapt4mjd87W8NWV8_40","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UghfG4rKbyLwlXgCoAEC.87VxK1IcVdT87W9BZPSPbn","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"mixed"},
  {"id":"ytr_UghiyJc91JDD0XgCoAEC.87Vw0Yij8ek87VyZ_yCQtY","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytr_UghiyJc91JDD0XgCoAEC.87Vw0Yij8ek87VzogUdvsL","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugg-LAtK9Y2urngCoAEC.87VuXpLFzck87VxWEbSmkv","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
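The raw response is a JSON array with one record per comment, each carrying the four coding dimensions plus a comment id. A minimal sketch of how such a response could be parsed and validated, assuming allowed value sets inferred from the records shown above (this is not a documented schema, and `parse_codes` is an illustrative helper, not part of the tool):

```python
import json

# Allowed values per dimension, inferred from the visible records
# above -- an assumption, not an authoritative codebook.
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "ban", "none"},
    "emotion": {"indifference", "approval", "resignation", "mixed", "fear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Drop anything that is not a dict with an id.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Drop records with a missing or out-of-vocabulary value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytr_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"liability",'
       '"emotion":"indifference"}]')
print(len(parse_codes(raw)))  # 1 valid record
```

Validating against a closed vocabulary like this catches the common failure mode where the model invents a label outside the coding scheme; such records are silently dropped here, though logging them for re-coding would be the more cautious choice.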