Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples
| Comment ID | Excerpt |
|---|---|
| ytc_Ugwl20k1x… | 😂they clearly aren’t using their brain or they’re missing it. There’s certainly … |
| ytr_UgxeJQ2aR… | @Goroca the output is still different from any image in its database and if it’s… |
| ytc_UgxnmHPUT… | Went to an art school with a strong focus on contemporary art and guerilla art. … |
| ytr_UgyukOLrd… | AI will eat up most of people, sure. But eventually, riches will also eaten by A… |
| ytc_Ugw8phYbt… | I agree with this guy. I think a fear of technology stems from a lack of underst… |
| ytc_UgzZpT5bd… | Look, everyone. If you want some random pic to go along with your random fanfic… |
| ytc_UgzjStRMD… | Don't start worrying about AI now because AI has been running our " new" society… |
| ytc_UgxQ7jcz5… | OpenAI didn't want their illegal practices "opened" up to the public. So they Ep… |
Comment
This kind of armchair computer science really irks me.
- First example: they magically "programmed a robot" (how?) that "preferred" (how?) men over women, whites over blacks, "and so forth" (how much?). The closest real studies I know in this context test language models to see which jobs they correlate with which identity trait. The model correctly guesses that firefighters, policemen, engineers, CEOs, surgeons etc. are more likely to be men, and teachers, nurses, students etc. are more likely female occupations. That's not sexist, that's literally what the statistics are. It makes no sense to train a model and then wish it doesn't reflect reality.
- Second example: it's not the fault of the model that most Chicago crime is committed by blacks. It's not the model's fault either that the Chicago PD feeds the person's race and/or picture into the machine for prediction, and it's certainly not the model's fault that the PD puts people on high surveillance based on just that one result. If you don't want the machine to do profiling, then don't give it the data it needs to do so.
Also: the victim has been shot twice by whom? The video doesn't say. In Chicago, the likelihood of it being a cop is lower than a gang member. If it was a cop: what were the circumstances? Did he threaten police with a gun? He has no criminal record, but every criminal has to start somewhere, so was he committing a crime?
Stop slandering "AI". You don't know what you're talking about.
| Platform | Video | Posted |
|---|---|---|
| youtube | AI Bias | 2022-12-19T09:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgyW5oYgaVl2e9eScrh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyOlLsQtWifo8qBeQx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwEbeTLua_--O7oGCJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyfQC1znGqPDUmf6Jx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJTtik7PBvLyP0K0h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgymNjau3ZOnG7o0nVt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzgO4s1K_621UNd4cF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyzFUnNA61qn1OQpdZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx47dwVyaMkFNvjDAN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxsRaTcho6OC6myK0l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"}]
```
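As a minimal sketch of how a raw response like this could be parsed and checked downstream: the allowed vocabularies below are inferred only from the code values visible in this dump, not from a documented schema, and the function name is hypothetical.

```python
import json

# Dimension vocabularies inferred from values seen in this dump (an assumption,
# not a documented codebook).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "liability", "unclear"},
    "emotion": {"indifference", "approval", "mixed", "outrage",
                "resignation", "fear", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record.

    Raises ValueError on malformed JSON (e.g. a stray trailing
    delimiter), on a missing 'id', or on any code outside the
    allowed vocabularies.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise ValueError(f"response is not valid JSON: {exc}") from exc
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of records")
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records
```

A record that fails validation (bad JSON, unknown code) raises immediately, so a batch can be rejected and re-requested rather than silently coded with out-of-vocabulary values.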