Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I've heard this before, the car could ask questions (through computer) such as this. For example if the person in the first dilemma set his car to hit the SUV the car would hit the SUV in a situation like that. I don't think self driving cars should exist either way
YouTube · AI Harm Incident · 2017-05-21T03:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          ban
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UghMzFQ5uciXyHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghSVao5v-7LzHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghWZB_DNXhaTXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj7Q3CElinFQ3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgjTiwroBtb2T3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgjIFRBxgjA2tXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UghlT0jEO-duZ3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugh8Tr7F8wrmeX3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugi0hd2FnlV7Z3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugil9BPZ0b0LongCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
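A minimal sketch of how the raw batch response above can be cross-checked against a single coding result. The function name `index_codings` is illustrative, not part of the tool; the snippet abbreviates the response to two entries for brevity.

```python
import json

# Abbreviated copy of the raw batch response shown above (two of ten entries).
raw = '''[
 {"id":"ytc_Ugj7Q3CElinFQ3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgjTiwroBtb2T3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]'''

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM batch response and index each coding by comment id."""
    return {item["id"]: item for item in json.loads(raw_response)}

codings = index_codings(raw)
# Look up the comment whose coding result is shown above and confirm
# the dimensions match the rendered table.
coded = codings["ytc_Ugj7Q3CElinFQ3gCoAEC"]
print(coded["reasoning"], coded["policy"], coded["emotion"])
# → deontological ban outrage
```

Indexing by `id` makes it cheap to spot mismatches between the stored coding result and the raw model output, e.g. when a response was re-coded or truncated.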