Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Hasn’t it already been pretty well established how pathetically wrong these algorithmic systems get it, not just facial profiling, but even software that judges use to recommend sentencing guidelines, and that there is inherent, often unintended, racial bias baked into much of the programming?
reddit · AI Surveillance · 1609256446.0 · ♥ 2
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        deontological
Policy           regulate
Emotion          outrage
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id":"rdc_gheambo","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_ghee05e","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"rdc_gheem8e","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"rdc_ghf2wq1","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"rdc_ghf7eoc","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
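The raw response is a JSON array of per-comment codes, keyed by comment id. A minimal sketch of how such output could be parsed and indexed is below; note that the `parse_codes` helper and the `ALLOWED` vocabularies are assumptions for illustration (the value sets are inferred only from the values visible in this response, not from the pipeline's actual codebook).

```python
import json

# Two records excerpted from the raw LLM response shown above.
raw = (
    '[{"id":"rdc_gheambo","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"rdc_gheem8e","responsibility":"developer","reasoning":"deontological",'
    '"policy":"regulate","emotion":"outrage"}]'
)

# Assumed vocabularies, inferred from values seen in this response only.
ALLOWED = {
    "responsibility": {"none", "developer", "government", "company"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "fear", "mixed"},
}

def parse_codes(raw_response: str) -> dict:
    """Parse the model output and index records by comment id,
    dropping any record with an out-of-vocabulary value."""
    records = json.loads(raw_response)
    valid = {}
    for rec in records:
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid[rec["id"]] = rec
    return valid

codes = parse_codes(raw)
print(codes["rdc_gheem8e"]["emotion"])  # → outrage
```

Validating against a fixed vocabulary before indexing is one way to guard against the model emitting free-text labels that would silently corrupt downstream tallies.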