Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
When you use an image from a camera and compare it to the images in a very large database, the risk of "false positives" skyrockets. People in the criminal justice system don't seem to understand this.
reddit · AI Harm Incident · 1773369211.0 · ♥ 5
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        consequentialist
Policy           unclear
Emotion          indifference
Coded at         2026-04-25T08:13:13.233606
Raw LLM Response
[
  {"id": "rdc_o9zy0oe", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "rdc_oa4o8uc", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_oa5nomh", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_oa5r1d2", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "rdc_oa8dplb", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
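The raw response is a JSON array of coding records keyed by comment id. A minimal sketch of how a single comment's coding might be recovered from such a response — the helper name and the trimmed sample payload are illustrative, not part of the actual pipeline:

```python
import json

# Sample payload in the shape of the batch above (trimmed to one record);
# in practice this would be the exact string the model returned.
raw_response = """[
  {"id": "rdc_oa5nomh", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "unclear",
   "emotion": "indifference"}
]"""

def extract_coding(raw, comment_id):
    """Parse a raw LLM batch response; return the record for one comment id,
    or None if the JSON is malformed or the id is absent."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return None  # model emitted malformed JSON; inspect by hand
    return next((r for r in records if r.get("id") == comment_id), None)

coding = extract_coding(raw_response, "rdc_oa5nomh")
print(coding["emotion"])  # → indifference
```

Guarding the `json.loads` call matters here: a model can emit trailing prose or truncated JSON, and returning `None` keeps one bad batch from crashing the whole coding run.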