Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Facial recognition technology these days is built using a massive amount of training data for the algorithm to practice getting really good at recognizing people. The problem is that much of that training data is of white people. There is less training data of minorities, so the algorithm isn't quite as good at recognizing minorities as it is others. Add in additional technical issues like lighting on darker skin and such, and you have a piece of technology that is very fallible, particularly towards minorities. Facial recognition might be a useful tool for law enforcement, but it should not be used as a first-contact ID system, just like IP addresses should not be used in this way.
Source: reddit · Topic: AI Bias · Timestamp: 1609348870.0 · ♥ 15
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           unclear
Emotion          indifference
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_nck67fy", "responsibility": "company",     "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "rdc_nck6g6y", "responsibility": "company",     "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "rdc_nck6trm", "responsibility": "company",     "reasoning": "consequentialist", "policy": "unclear",  "emotion": "outrage"},
  {"id": "rdc_ncka3ai", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear",  "emotion": "fear"},
  {"id": "rdc_ghiku29", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "unclear",  "emotion": "indifference"}
]
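A minimal sketch of how a coding result like the one above could be pulled out of the raw response, assuming the raw LLM output is a JSON array of records keyed by comment `id` (the `coding_for` helper and the truncated two-record `raw` string here are illustrative, not part of the actual pipeline):

```python
import json

# Abbreviated stand-in for the raw LLM response shown above
# (a JSON array of per-comment coding records).
raw = '''[
  {"id": "rdc_nck67fy", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "rdc_ghiku29", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"}
]'''

records = json.loads(raw)

def coding_for(records, comment_id):
    """Return the coding record matching comment_id, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

result = coding_for(records, "rdc_ghiku29")
print(result["responsibility"], result["emotion"])  # developer indifference
```

Looking up `rdc_ghiku29` this way yields the same dimension values shown in the Coding Result table (responsibility: developer, emotion: indifference).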