Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is a wrong exemple. This situation should not exist. The self driving car should not drive so close to the truck. If the car were driving further away from the truck, it would have had time to stop and no one would have been injured.
Source: youtube · AI Harm Incident · 2017-02-19T12:2… · ♥ 18
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UggPlXqhTyqn-HgCoAEC", "responsibility": "user", "reasoning": "contractualist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgijP7n1AYDAFHgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgjSelYS_yNxMXgCoAEC", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UghtfnAXloUXangCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ughv4M1zM_ZhFHgCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "fear"},
  {"id": "ytc_UggWc282B73l5ngCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugir1uoAgHGQ63gCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UghiAb5OOQ50H3gCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UghKAohdhKOGKHgCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgjB5UYNyemZAngCoAEC", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"}
]
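A raw response like the one above can be checked against the coded result by parsing the JSON batch and keying each record by its comment id. This is a minimal sketch, assuming the model output parses as the JSON array shown; the helper name `index_by_id` and the two-record excerpt are illustrative, not part of the tool.

```python
import json

# Excerpt of the raw batch response above (two of the ten records).
raw = '''[
  {"id": "ytc_UggPlXqhTyqn-HgCoAEC", "responsibility": "user",
   "reasoning": "contractualist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgjSelYS_yNxMXgCoAEC", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]'''

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(payload: str) -> dict:
    """Parse the JSON array and key each record's dimensions by comment id."""
    records = json.loads(payload)
    return {r["id"]: {d: r[d] for d in DIMENSIONS} for r in records}

coded = index_by_id(raw)
print(coded["ytc_UgjSelYS_yNxMXgCoAEC"]["responsibility"])  # ai_itself
```

Looking up a single id this way makes it easy to confirm that the row displayed in the coding-result table matches the value the model actually emitted for that comment.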