Raw LLM Responses

Inspect the exact model output behind each coded comment.

Comment
Ok. I think that the Self Driving car should be programmed not to get its self in that situation in the first place. For example, the car should be programmed to never get boxed in unless it's parking.
youtube AI Harm Incident 2015-12-29T19:3…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        deontological
Policy           regulate
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugj_f2_hIfbFIngCoAEC", "responsibility": "ai_itself",  "reasoning": "deontological",    "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgjuSAOvpXKjoXgCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgjXkfuodsaTaXgCoAEC", "responsibility": "user",       "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgiH4bJgUd72t3gCoAEC", "responsibility": "none",       "reasoning": "deontological",    "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugis_iWcr_zaLHgCoAEC", "responsibility": "developer",  "reasoning": "deontological",    "policy": "regulate",  "emotion": "approval"},
  {"id": "ytc_Ugg1rrdyzbR2AXgCoAEC", "responsibility": "ai_itself",  "reasoning": "deontological",    "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UghCPalsjYnrLHgCoAEC", "responsibility": "none",       "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgjHJF2WYdJEkngCoAEC", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgiXr1C50oWCgXgCoAEC", "responsibility": "developer",  "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugh_wlHO5sE7gngCoAEC", "responsibility": "none",       "reasoning": "unclear",          "policy": "none",      "emotion": "indifference"}
]
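Extracting one comment's coding from a raw batch response like the one above amounts to parsing the JSON array and indexing it by `id`. A minimal sketch, assuming the response is valid JSON in exactly this shape; `coding_for` is a hypothetical helper, not part of the tool:

```python
import json

# Excerpt of a raw batch response in the format shown above (assumed shape).
raw = """[
  {"id": "ytc_Ugis_iWcr_zaLHgCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugj_f2_hIfbFIngCoAEC", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(comment_id: str, raw_response: str) -> dict:
    """Return the coding record for one comment id from a raw batch response."""
    records = {rec["id"]: rec for rec in json.loads(raw_response)}
    rec = records[comment_id]
    # Keep only the coding dimensions, dropping the id key.
    return {dim: rec[dim] for dim in DIMENSIONS}

print(coding_for("ytc_Ugis_iWcr_zaLHgCoAEC", raw))
# {'responsibility': 'developer', 'reasoning': 'deontological', 'policy': 'regulate', 'emotion': 'approval'}
```

Note that this lookup reproduces the Coding Result table above: the record for `ytc_Ugis_iWcr_zaLHgCoAEC` carries the developer/deontological/regulate/approval coding.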