Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
this entire dilemma is only relevant during there transitional period into self driving cars due to the fact that it can use a hive mind to move the cars around it (as they would also be self driving) to make room for everyone to avoid the accident. until then you could stop using Kantian ethics to feel better about yourself and accept the reality that people are going to get hurt and try to do as little damage as possible be choosing the safest crash
YouTube AI Harm Incident 2015-12-15T10:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgjaTrm3xjVlsXgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgiVJWa_Y6bmRXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghRt6TFpVDC0HgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgjWo4JZkIB25ngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggQIWXW0Sjhu3gCoAEC", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugh4kvsWRmbvVXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghHt-JHGMzZ1XgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgiJ_L1RWjzFSHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugiygo6Qdg1iq3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggxWJ27f-_UIngCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]
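The raw response is a JSON array of per-comment codes, so it can be inspected programmatically as well as by eye. A minimal Python sketch (field names are taken from the JSON above; variable names and the two-entry excerpt are illustrative):

```python
import json
from collections import Counter

# Excerpt of the raw LLM response above: a JSON array of per-comment codes.
raw_response = """
[
  {"id": "ytc_UgjaTrm3xjVlsXgCoAEC", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UggQIWXW0Sjhu3gCoAEC", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

codes = json.loads(raw_response)

# Index by comment id so one comment's coding can be looked up directly.
by_id = {row["id"]: row for row in codes}
print(by_id["ytc_UgjaTrm3xjVlsXgCoAEC"]["reasoning"])  # deontological

# Tally a dimension across the batch, e.g. reasoning style.
reasoning_counts = Counter(row["reasoning"] for row in codes)
print(reasoning_counts)
```

On the full ten-entry response above, the same tally would show consequentialist reasoning dominating (8 of 10) with two deontological codings.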