Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's really disturbing how people put faith in AI. Haven't they ever had a problem with a computer system? Think about how well the self-checkout at a grocery store works. It can't even tell when you have the right amount of stuff in the checkout area. Should that kind of technology be used to jail people and ruin their lives?
Source: reddit · AI Harm Incident · 1773458856.0 · ♥ 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_oa94vj2", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_oa9ctsj", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "rdc_oa9oqdn", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "rdc_oab4slp", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "rdc_oacf306", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
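A batch response like the one above has to be parsed back into per-comment coding results, matched by the "id" field. A minimal sketch of that step, assuming Python; `parse_codings` and the dimension list are illustrative names, not necessarily the project's actual code:

```python
import json

# Raw batch output from the coder model: a JSON array with one object
# per comment, carrying the comment id plus the four coding dimensions.
raw_response = '''[
  {"id": "rdc_oa94vj2", "responsibility": "government",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_oacf306", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict[str, dict[str, str]]:
    """Index the model's codings by comment id.

    Raises ValueError on a missing id or dimension, so malformed model
    output is surfaced instead of being stored as a partial coding.
    """
    codings: dict[str, dict[str, str]] = {}
    for entry in json.loads(raw):
        comment_id = entry.get("id")
        if not comment_id:
            raise ValueError(f"entry without id: {entry}")
        missing = [d for d in DIMENSIONS if d not in entry]
        if missing:
            raise ValueError(f"{comment_id} missing dimensions: {missing}")
        codings[comment_id] = {d: entry[d] for d in DIMENSIONS}
    return codings

coded = parse_codings(raw_response)
print(coded["rdc_oacf306"]["emotion"])  # fear
```

Validating every dimension up front is what makes a per-comment table like the one shown (Responsibility: user, Emotion: fear, etc.) safe to render: a comment either has a complete coding or an explicit error.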