Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The software company had nothing to do with this. That'd be like suing the gun manufacturer when a cop shoots someone's dog. The police are to blame here for using the technology to make an illegal arrest. The tech is fine for identifying potential suspects, but to blame the company when their tech is used out of scope is ridiculous. Additionally, the program almost certainly doesn't "discriminate based on race". It's just objectively more difficult for an AI to identify black faces than white ones since shadows blend in more, making contours more difficult to "see". Excessive darkness is more common than excessive bloom, so white faces are easier to accurately identify than black ones in most cases. Either way, no one should ever use this software as a justification for arrest. At best you should be able to get a list of potential people to look into. Blaming the software is exactly what the cops want.
Source: reddit · Dataset: AI Harm Incident · Posted: 1691419307.0 (Unix timestamp) · Score: -1
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         outrage
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id":"rdc_jv68y4n","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"rdc_jv6d7de","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"rdc_jv5p3yb","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_jv6b6es","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"rdc_jv5ychu","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
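A minimal sketch of how a batch response like this could be parsed back into per-comment codings. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response above; the helper name and the shortened two-entry payload are illustrative, not part of the tool.

```python
import json

# Shortened copy of the raw model response above: a JSON array with one
# coding object per comment, keyed by a comment "id".
raw_response = """
[ {"id":"rdc_jv68y4n","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"rdc_jv5ychu","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"} ]
"""

def coding_for(raw: str, comment_id: str) -> dict:
    """Return the coding object for one comment id (KeyError if absent)."""
    by_id = {entry["id"]: entry for entry in json.loads(raw)}
    return by_id[comment_id]

print(coding_for(raw_response, "rdc_jv5ychu")["emotion"])  # outrage
```

Indexing by `id` rather than by list position guards against the model reordering or dropping entries in a batch response.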