Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There's actually a lot of other biases the AI had too when you read the article. Seems like once it determined the top 5 were never women due to the gender imbalance in the data it fed on, it then would class the women's only schools as a negative factor. This also means it likely focused on certain schools for top candidates, as well as similar factors. In effect, this AI literally just repeated the hiring preferences of the humans who did the job previously, possibly retaining prejudices too.
Source: reddit · Topic: Cross-Cultural · Timestamp: 1539207256.0 · ♥ 17
Coding Result
Responsibility: ai_itself
Reasoning: consequentialist
Policy: unclear
Emotion: outrage
Coded at: 2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_e7j94ef", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "rdc_e7l28qc", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "rdc_e7jb51r", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "rdc_e7je0wy", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "rdc_e7ihc49", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
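The raw response above is a JSON array coding five comments in one batch, each entry carrying the four dimensions shown on this page (responsibility, reasoning, policy, emotion) plus a comment `id`. A minimal Python sketch of how such a response could be parsed and a single comment's coding looked up (the batching of five comments per call is an assumption inferred from the five entries in this one response):

```python
import json

# Raw LLM response copied verbatim from above; field names come
# directly from the response, not from any documented schema.
raw = (
    '[{"id":"rdc_e7j94ef","responsibility":"none","reasoning":"unclear",'
    '"policy":"unclear","emotion":"unclear"},'
    '{"id":"rdc_e7l28qc","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"unclear","emotion":"approval"},'
    '{"id":"rdc_e7jb51r","responsibility":"none","reasoning":"unclear",'
    '"policy":"unclear","emotion":"unclear"},'
    '{"id":"rdc_e7je0wy","responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"unclear","emotion":"outrage"},'
    '{"id":"rdc_e7ihc49","responsibility":"company","reasoning":"unclear",'
    '"policy":"unclear","emotion":"indifference"}]'
)

# Index the batch by comment id for direct lookup.
codings = {item["id"]: item for item in json.loads(raw)}

# The comment displayed on this page maps to id "rdc_e7je0wy".
coded = codings["rdc_e7je0wy"]
print(coded["responsibility"], coded["reasoning"], coded["policy"], coded["emotion"])
# → ai_itself consequentialist unclear outrage
```

Note that only one of the five batch entries corresponds to the comment shown; the other four are codings of neighbouring comments returned in the same call.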