Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgyWCUAiv…` — "Due to the nature of AI and how it uses pattern recognition, this won't be effec…"
- `ytr_UgwgmLGGE…` — "Yes, i use it almost the same way, but I would suggest requesting (rather requir…"
- `ytc_UgwXayCy2…` — "All this talk about sentient AI. Dude, the problem has started the moment AI bec…"
- `ytc_Ugy-1NN3n…` — "And it didn't affect only young people that are entering the job market. I am IT…"
- `ytc_Ugzb1C-4n…` — "There is also the entire argument of “if auto pilot is so safe how come you have…"
- `ytc_Ugz9rHukj…` — "I don't understand how chatgpt knows so much about so many things yet couldn't t…"
- `ytc_UgzQ4tgfI…` — "As much as I'd like to dump on Elon/Tesla and watch them sink - there's no excus…"
- `ytc_Ugz3b4uKu…` — "Oh hell, no I’m legally blind. I already have to deal with electric cars that I …"
Comment
And a few of these 'secret sauce' AI learning programs were learning to cheat. There was one in South Africa attempting to detect pneumonia in HIV patients versus clinicians, and the AI apparently learned to differentiate which X-ray machine model was used in clinics vs. the hospital, and used this data in its prediction model, which the real doctors did not have access to — because checkup x-rays in outlying clinics tend to be negative, while x-rays in the hospital (where more acute cases go) tend to be positive.
https://www.npr.org/sections/health-shots/2019/04/01/708085617/how-can-doctors-be-sure-a-self-taught-computer-is-making-the-right-diagnosis
> Zech and his medical school colleagues discovered that the Stanford algorithm to diagnose disease from X-rays sometimes "cheated." Instead of just scoring the image for medically important details, it considered other elements of the scan, including information from around the edge of the image that showed the type of machine that took the X-ray.
> When the algorithm noticed that a portable X-ray machine had been used, it boosted its score toward a finding of TB.
> Zech realized that portable X-ray machines used in hospital rooms were much more likely to find pneumonia compared with those used in doctors' offices. That's hardly surprising, considering that pneumonia is more common among hospitalized people than among people who are able to visit their doctor's office.
Source: reddit · Topic: AI Bias · Posted: 2019-09-25 (Unix timestamp 1569418111) · ♥ 114
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id": "rdc_f1eh26o", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_f1ef65d", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_f1ednqf", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "rdc_f1e9frc", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "rdc_f1ebf3f", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"}
]
```
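The raw response is a JSON array with one record per coded comment, keyed by comment ID and carrying the four coding dimensions from the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and looked up by ID — the `index_codings` helper and the required-field check are illustrative assumptions, not part of the tool itself:

```python
import json

# Two records copied from the raw response above, as a small sample batch.
raw = """[
  {"id": "rdc_f1eh26o", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_f1ebf3f", "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "unclear", "emotion": "fear"}
]"""

# The four coding dimensions plus the comment ID, per the result table.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse a batch of codings and index them by comment ID,
    raising if a record is missing any expected dimension."""
    by_id = {}
    for rec in json.loads(raw_json):
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} missing: {missing}")
        by_id[rec["id"]] = rec
    return by_id

codings = index_codings(raw)
print(codings["rdc_f1ebf3f"]["emotion"])  # -> fear
```

Indexing by ID mirrors the page's own "look up by comment ID" workflow: the table entry for a comment is just one record pulled out of the raw batch.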