Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The reason i kind of like the whole AI Art thing is that you can get an inspirat…
ytc_UgwP85zpe…
Embedded land also has a lot of things that look 99% the same, but that 1% is co…
rdc_n3lxuu1
Ai can not replace devlopers avoid these ty[e of myths people are exaggerating …
ytc_UgwDLe_lu…
Look back to the “Roaring 20s”, when large-scale mechanical automation replaced …
ytc_UgwRYaqOa…
Dude, are you dumber than ChatGPT? Why are you treating it like an intelligent a…
ytc_UgywaBVI4…
if we get AI that is so smart we are hopefully smart enough to program their rew…
ytc_UgjagJyEa…
I'll be the one to get emotionally attached to a pet robot and cry if it stops w…
ytc_UgyQLeUXl…
Is this how you are seeding peoples minds with the idea of malicious deep fakes,…
ytc_Ugyn50D5b…
Comment
<hate capture portals>
PRIME INTELLIGENCE
Amazon built an AI tool to hire people but had to shut it down because it was discriminating against women
Isobel Asher Hamilton
Amazon CEO Jeff Bezos. David Ryder/Getty Images
Amazon tried building an artificial-intelligence tool to help with recruiting, but it showed a bias against women, Reuters reports.
Engineers reportedly found the AI was unfavorable toward female candidates because it had accrued its data by combing through résumés from a predominantly male applicant pool.
Amazon reportedly abandoned the project at the beginning of 2017.
Amazon worked on building an artificial-intelligence tool to help with hiring, but the plans backfired when the company discovered the system discriminated against women, Reuters reports.
Citing five sources, Reuters said Amazon set up an engineering team in Edinburgh, Scotland, in 2014 to find a way to automate its recruitment.
The company created 500 computer models to trawl through past candidates' résumés and pick up on about 50,000 key terms. The system would crawl the web to recommend candidates.
"They literally wanted it to be an engine where I'm going to give you 100 résumés, it will spit out the top five, and we'll hire those," one source told Reuters.
A year later, however, the engineers reportedly noticed something troubling about their engine — it didn't like women. This was apparently because the AI combed through predominantly male résumés submitted to Amazon over a 10-year period to accrue data about whom to hire.
Consequently, the AI concluded that men were preferable. It reportedly downgraded résumés containing the word "women's" and filtered out candidates who had attended two women-only colleges.
Amazon's engineers apparently tweaked the system to remedy these particular forms of bias but couldn't be sure the AI wouldn't find new ways to unfairly discriminate against candidates.
Gender bias was not the only problem, Reuters' sources said. The computer programs al…
reddit
Cross-Cultural
1539208935.0
♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[
{"id":"rdc_e7jfxbi","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"rdc_e7j8m7u","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"rdc_e7j4eex","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"rdc_e7jrf20","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"rdc_e7iswsg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
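The raw response above is a JSON array with one object per comment ID, one field per coding dimension. A minimal sketch of how such output could be parsed and sanity-checked before ingestion — the allowed value sets below are inferred from this sample alone, not from a documented schema:

```python
import json

# Two rows from the raw LLM response shown above.
raw = """
[
  {"id":"rdc_e7jfxbi","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"rdc_e7j8m7u","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
"""

# Assumption: allowed values inferred from the values visible in this sample.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"liability", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "unclear"},
}

def validate(rows):
    """Index rows by comment ID, rejecting any unknown dimension value."""
    coded = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        coded[row["id"]] = row
    return coded

coded = validate(json.loads(raw))
print(coded["rdc_e7jfxbi"]["responsibility"])  # company
```

Validating against an explicit whitelist catches the common failure mode where the model invents a label outside the codebook; unknown values raise instead of silently entering the dataset.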