Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I would never feel safe being a passenger in a self-driving car if the car doesn't prioritize your safety. Car: "Now throwing you off a bridge because of someone in the car ahead has more passengers but the driver in their car was stupid, sorry for your luck." While it would be nice to minimize the damage, sacrificing the passenger should never be an option no matter how many people are at risk. If they prioritize the most people to survive I can say for certain that self-driving cars won't catch on because passengers would be too scared that some idiot on the road may make a mistake and just because they have more people in their car the self-driving car will sacrifice the passenger by swerving off a road. Humans are programmed to be self-preserving first, then considerate of others second. If self-driving cars were to reverse this logic then it wouldn't make sense.
Source: youtube · AI Harm Incident · 2016-06-25T18:4…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        deontological
Policy           none
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_Ugi0oNCeHP92AHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgjQqqQ8pvsVC3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UggUueruHXVu1ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgijXoYPKjY_1HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UghGnVVF0cNqSHgCoAEC","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_Uggfmpuz0HRxeHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UggRgo7ALDJJCHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UghT-lpLHZCE-HgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
 {"id":"ytc_UghO2h5e1TxTNXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UghNM3jgeKUHEngCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}]
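The raw LLM response is a JSON array with one object per coded comment, batched across several comments. A minimal sketch of mapping that batch back to an individual comment's coding result (using an excerpt of the array above; the indexing helper is an illustration, not part of the tool):

```python
import json

# Excerpt of the raw LLM response: one object per coded comment.
raw = '''[
  {"id": "ytc_UghT-lpLHZCE-HgCoAEC", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UghO2h5e1TxTNXgCoAEC", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]'''

# Index the batch by comment id so each comment's codes can be looked up.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# The entry for the comment shown above matches its Coding Result table.
record = codes["ytc_UghT-lpLHZCE-HgCoAEC"]
print(record["responsibility"], record["emotion"])  # ai_itself fear
```

Looking entries up by id rather than by position keeps the mapping stable even if the model returns the batch in a different order than the comments were sent.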