Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
From the sound of the article they threw in a ton of resumes and trained the AI with the resumes of people they hired. Really the AI was picking up on the bias of the hiring personnel if the only reason it could tell why the resume was not chosen was because it had the word woman or an all woman's college on it. This really sounds like a garbage in garbage out issue... Since I don't really think there is an empirical way to determine a good person to hire that wouldn't include the bias of the person choosing the candidate.
reddit · Cross-Cultural · 1539204356.0 · ♥ 117
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           none
Emotion          mixed
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_e7il5t7", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_e7inq3u", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_e7j81fp", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_e7jals0", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "rdc_e7juc4u", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
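The raw response codes a batch of comments at once, so the coding result shown above has to be pulled out of the array by comment id. A minimal sketch of that lookup, assuming the raw response is valid JSON and that `rdc_e7jals0` is the id of the comment displayed above (the function name `coding_for` is hypothetical, not part of any tool shown here):

```python
import json

# Excerpt of the raw batch response from the page above.
raw = (
    '[{"id":"rdc_e7il5t7","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"rdc_e7jals0","responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"none","emotion":"mixed"}]'
)

def coding_for(raw_json: str, comment_id: str) -> dict:
    """Return the coding dict for one comment id from a raw batch response."""
    for entry in json.loads(raw_json):
        if entry["id"] == comment_id:
            return entry
    raise KeyError(comment_id)

print(coding_for(raw, "rdc_e7jals0"))
```

Looking up `rdc_e7jals0` returns the developer/consequentialist/none/mixed coding, matching the result table for this comment.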