Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Computer engineer here. One thing I've seen some claim is that this is intentional. That police use tech that's less accurate on black faces or that the tech companies intentionally design their software this way. That is not true. AI can only use pictures scanned in to understand the faces it's seeing. The shadows in your face are important for identifying contour and seeing what your face actually looks like. Darker skin naturally blends in with shadows more and when many of the pictures scanned into the machine are blurry, it's not surprising that it'd make these mistakes. **However,** this software should *never* be used as a justification to make an arrest. That is absolutely insane. At best it could be used to create a short list of potential suspects to look into, but at no point should the tech be used the way it was here. The police are completely at fault and this woman should get one hell of a payout (god I wish it was from the pension fund). Fuck all of the cops involved in this stupid decision and I hope we can get some legislation in place to block the use of this tech as "evidence" going forward.
Source: reddit · AI Harm Incident · 1691419794.0 · ♥ 26
Coding Result
| Dimension      | Value                      |
|----------------|----------------------------|
| Responsibility | none                       |
| Reasoning      | unclear                    |
| Policy         | none                       |
| Emotion        | indifference               |
| Coded at       | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id": "rdc_jv5vcw3", "responsibility": "company",   "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "rdc_jv5wncf", "responsibility": "government", "reasoning": "deontological",   "policy": "regulate",  "emotion": "outrage"},
  {"id": "rdc_jv5wf34", "responsibility": "developer",  "reasoning": "deontological",   "policy": "ban",       "emotion": "outrage"},
  {"id": "rdc_jv5zk4y", "responsibility": "none",       "reasoning": "unclear",         "policy": "none",      "emotion": "indifference"},
  {"id": "rdc_jv6vtkz", "responsibility": "developer",  "reasoning": "deontological",   "policy": "regulate",  "emotion": "outrage"}
]
```
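The coded result shown above matches the entry with id `rdc_jv5zk4y` in the raw response. A minimal sketch of how such a response can be parsed and a single comment's coding looked up by id (plain Python with the standard `json` module; the `raw` string below is abridged to two entries from the full response, and the variable names are illustrative, not part of the tool):

```python
import json

# Abridged copy of the raw LLM response: a JSON array of per-comment codings.
raw = (
    '[{"id":"rdc_jv5vcw3","responsibility":"company","reasoning":"consequentialist",'
    '"policy":"liability","emotion":"resignation"},'
    '{"id":"rdc_jv5zk4y","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"indifference"}]'
)

codings = json.loads(raw)

# Index the array by id so any one comment's coding can be fetched directly.
by_id = {entry["id"]: entry for entry in codings}

coding = by_id["rdc_jv5zk4y"]
print(coding["responsibility"], coding["reasoning"], coding["policy"], coding["emotion"])
# → none unclear none indifference
```

Indexing by id rather than scanning the list makes it easy to reconcile each comment on the page with its row in the coding-result table.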