Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
> So, not really the AI's fault. Just the inputs weren't ideal for diversity. It might be a good metaphor for systemic injustice as a whole. Even a strictly objective computer algorithm is biased by the long history of gender imbalance in the field. We can't really expect human managers to be much better.
reddit · Cross-Cultural · 1539195962.0 · ♥ 10
Coding Result
| Dimension | Value |
| --- | --- |
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id": "rdc_e7j08kv", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "rdc_e7imrxm", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_e7is54n", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "rdc_e7ix5sk", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "rdc_e7j76uf", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
```
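To sanity-check the coding-result table against the raw model output, the batch response can be parsed and indexed by comment id. A minimal sketch in Python, assuming the first entry (`rdc_e7j08kv`) is the one this page's coded values were taken from — an inference from the matching field values, not something the page states:

```python
import json

# Raw LLM response reproduced from above (a JSON array, one object per coded comment).
raw = '''[
 {"id":"rdc_e7j08kv","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"rdc_e7imrxm","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
 {"id":"rdc_e7is54n","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
 {"id":"rdc_e7ix5sk","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
 {"id":"rdc_e7j76uf","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]'''

# Index the batch by comment id so any single coding can be looked up directly.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Pull out the entry presumed to match this page's coding-result table.
entry = codes["rdc_e7j08kv"]
print(entry["responsibility"], entry["emotion"])  # developer resignation
```

Any mismatch between the looked-up entry and the rendered table would indicate a parsing or display bug in the coding pipeline.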