Raw LLM Responses

Inspect the exact model output that produced the coding for any comment.

Comment
No it's not. It's only an issue here because this was a prototype that still needed a safety driver. Fully self-driving cars won't need someone to suddenly handle emergencies, at worst they may need guidance in socially complicated edge cases.
reddit · AI Harm Incident · 1529672803.0 · ♥ 5
Coding Result
Dimension      | Value
Responsibility | none
Reasoning      | consequentialist
Policy         | none
Emotion        | indifference
Coded at       | 2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_e156cxs", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"},
  {"id": "rdc_e157sop", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "rdc_e145vca", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_e14mr65", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "approval"},
  {"id": "rdc_e13ta08", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
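Because the raw response is a JSON array keyed by comment id, matching a coded comment back to its raw entry is a simple lookup. A minimal sketch in Python (assuming this comment's id is rdc_e13ta08, the only entry whose values match the Coding Result above):

```python
import json

# Raw batch response, copied verbatim from the model output above.
raw_response = """[
 {"id":"rdc_e156cxs","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
 {"id":"rdc_e157sop","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"rdc_e145vca","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"rdc_e14mr65","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"approval"},
 {"id":"rdc_e13ta08","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

entries = json.loads(raw_response)

def coding_for(comment_id):
    """Return the coding dict for one comment id, or None if the id is absent."""
    return next((e for e in entries if e["id"] == comment_id), None)

# rdc_e13ta08 carries the values shown in the Coding Result table.
coding = coding_for("rdc_e13ta08")
print(coding["responsibility"], coding["reasoning"], coding["policy"], coding["emotion"])
# → none consequentialist none indifference
```

The lookup returns None rather than raising when an id is missing from the batch, which is how a coding pipeline would detect comments the model silently dropped.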