Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is a fucking stupid argument against self driving cars I am tired of seeing. Situations where it will literally be unavoidable to kill someone will be extremely rare given the improved reaction times/control self driving cars will have. You know what would probably happen? It would do whatever it could to cause the best chance of reduced injury. But these sort of hypothetical questions pointless: "What if aliens came down and said they would kill all humans if you didn't murder your child, would you do it!?!?!?!?!?" A real person in that situation wouldn't even have time to react before hitting and killing both people. But no, lets keep human drivers because the trolley problem is hard.
Source: reddit · AI Harm Incident · timestamp 1504815035.0 · ♥ 2
Coding Result
Dimension        Value
Responsibility   none
Reasoning        utilitarian
Policy           none
Emotion          outrage
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_dmp2zd0", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "unclear"},
  {"id": "rdc_dmp6sw7", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_dmp9o7f", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_looxc0k", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "rdc_loq5txa", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
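A minimal sketch of how the raw model output above could be inspected programmatically, assuming it is always a JSON array of per-record codes keyed by `id` (the indexing helper `codes_by_id` is illustrative, not part of the tool). Note that the record shown here says `consequentialist` while the coding-result table displays `utilitarian`, so labels may be normalized downstream:

```python
import json

# Raw LLM response copied verbatim from above: one coding record per comment id.
raw_response = """[
 {"id":"rdc_dmp2zd0","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
 {"id":"rdc_dmp6sw7","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"rdc_dmp9o7f","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"rdc_looxc0k","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
 {"id":"rdc_loq5txa","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]"""

def codes_by_id(raw: str) -> dict:
    """Parse the model output and index each coding record by its id."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = codes_by_id(raw_response)
# The record matching the coded comment above:
print(codes["rdc_dmp6sw7"]["emotion"])  # -> outrage
```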