Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There are definitely still some tough choices; we might not get the trolley problem exactly, but close enough. If the car is going too fast to stop safely, who takes priority, pedestrian or passenger? If the car is autonomous it almost certainly did not make a mistake, so maybe there the passenger's survival takes priority and, instead of driving off and killing them to save the passerby, it reduces the damage caused to the lowest possible degree. Most of the time there will be a way to do no harm to people, but this still matters because there will be other instances. There doesn't need to be machine error for these situations to happen; humans are dumb and might run into the street. What if the person running is a child, does that change who takes priority even if the child is definitely the one making the mistake? Don't get me wrong, automated cars will eliminate almost all traffic incidents and are already much better than human drivers under the good conditions they are trained for, but that doesn't mean we shouldn't care.
reddit AI Responsibility 1648691618.0
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           liability
Emotion          fear
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_i2s8j5h", "responsibility": "distributed", "reasoning": "mixed", "policy": "industry_self", "emotion": "mixed"},
  {"id": "rdc_i2smx2p", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "rdc_i2sjcg5", "responsibility": "user", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "rdc_i2s4sm4", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "rdc_i2s8p86", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]
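A batch response like the one above can be parsed and joined back to individual comments by id. A minimal Python sketch, assuming the raw response is valid JSON; the ids and field names are taken verbatim from the response above:

```python
import json

# Raw batch response, copied verbatim from the source above:
# one record per coded comment.
raw = (
    '[ {"id":"rdc_i2s8j5h","responsibility":"distributed","reasoning":"mixed",'
    '"policy":"industry_self","emotion":"mixed"},'
    ' {"id":"rdc_i2smx2p","responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"liability","emotion":"fear"},'
    ' {"id":"rdc_i2sjcg5","responsibility":"user","reasoning":"deontological",'
    '"policy":"regulate","emotion":"outrage"},'
    ' {"id":"rdc_i2s4sm4","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"approval"},'
    ' {"id":"rdc_i2s8p86","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"mixed"} ]'
)

records = json.loads(raw)

# Index the batch by comment id so each record can be matched
# back to the comment it codes.
by_id = {r["id"]: r for r in records}

# Pull out the record that corresponds to the coding result shown above.
coded = by_id["rdc_i2smx2p"]
print(coded["responsibility"], coded["reasoning"], coded["policy"], coded["emotion"])
# → ai_itself consequentialist liability fear
```

The four printed values match the Dimension/Value table above, which is how the "Coding Result" section can be reconciled against the raw model output.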