Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I honestly think there needs to be some form of compensation for these sorts of mistakes. 10 days isn’t terrible but we’ve heard of people going to prisons for years, even decades. The least they could do is set them up for a good future since they took irreplaceable time out of their life for NOTHING. Facial recognition doesn’t need to be implemented if it’s going to be used improperly, and I don’t know if I fully trust the U.S. justice system to get it right,
Source: reddit · AI Harm Incident · 1609210427.0 · ♥ 2
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          liability
Emotion         outrage
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_ghcsqjc", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_ghbq5f2", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "rdc_ghc13ao", "responsibility": "none", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "rdc_ghc13u0", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_ghcrzo7", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
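The raw response is a JSON array coding one batch of comments, keyed by an `id` field. A minimal sketch of inspecting it, in Python: parse the array, index the records by id, and look up the record whose values match the coding result shown above (`rdc_ghcrzo7`). The string literal below is the response as displayed; nothing else is assumed about the pipeline.

```python
import json

# Raw model output, copied verbatim from the response above.
raw = """[
  {"id":"rdc_ghcsqjc","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"rdc_ghbq5f2","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"rdc_ghc13ao","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"rdc_ghc13u0","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"rdc_ghcrzo7","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]"""

# Parse the batch and index records by comment id for quick lookup.
records = json.loads(raw)
coded = {r["id"]: r for r in records}

# The record whose dimensions match the coding result table above.
result = coded["rdc_ghcrzo7"]
print(result["responsibility"], result["policy"], result["emotion"])
# prints: company liability outrage
```

Indexing by id rather than scanning the list makes it easy to cross-check any coded comment against the model's raw output for that batch.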