Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
>The biggest one I see is law enforcement. Cops are hugely biased by most studies favoring white people over minorities in the US. Something society has largely just accepted as status quo. Replacing police in many of these roles with automated systems is ultimately superior since it levels the playing field, reduces costs and frees up resources for other things. Machines can also be biased, there are many instances where this has happened, for example https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G The data the AI learned from was from the past 10 years of job applications since tech is mainly dominated by males the AI figured males must be preferable so it would penalized female resumes and promote male resumes. You also have this classic example from Better off Ted. https://www.youtube.com/watch?v=jqG1fX3ZaLQ
Source: reddit · Topic: AI Surveillance · Unix timestamp: 1580441660.0
Coding Result
| Dimension | Value |
| --- | --- |
| Responsibility | none |
| Reasoning | utilitarian |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[
  {"id": "rdc_fg2azx3", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_fg1optw", "responsibility": "none", "reasoning": "none", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_fg1k7k6", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "rdc_fg2umgp", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "rdc_fg27ih6", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
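The raw response is a JSON array with one object per coded comment. A minimal sketch of pulling out the coding for a single comment id; the `coding_for` helper and variable names are illustrative, not part of any tool shown here, and `raw` is abridged to two records from the dump above:

```python
import json

# Raw model output: a JSON array of per-comment coding objects
# (abridged to two records from the response above).
raw = '''[
 {"id":"rdc_fg2azx3","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"rdc_fg1optw","responsibility":"none","reasoning":"none","policy":"none","emotion":"outrage"}
]'''

def coding_for(raw_response: str, comment_id: str) -> dict:
    """Return the coding dict for one comment id (KeyError if absent)."""
    records = json.loads(raw_response)
    by_id = {r["id"]: r for r in records}
    return by_id[comment_id]

print(coding_for(raw, "rdc_fg2azx3")["reasoning"])  # -> consequentialist
```

Indexing by `id` makes the lookup robust to the model returning the records in any order.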