Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Hahaha a biased robot. I just watched the whole video. I don't think you're wrong about the AI being biased because at the end of the day, when looking for possible candidates for a job, one needs to be biased. What you didn't explain however, was how that was inherently bad. If the code of the system was trained to compare men and women, sure that would be bad. But the AI doesn't have that programmed, so you can't say it has gender bias. You even explained in the infographic that it's only bias was "good" and "bad" after examining characteristics. Maybe the video should have been called "AI reflects candidate preferences in male dominated industries"
youtube · AI Bias · 2021-06-10T23:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwDW3MEKUQRg5cYVQt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugws_d2Y-7hbfgM5h_h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyBj0WUHpEKWZTnKB14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyujJa0atwPZIGzyap4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyz7L92AER4HUs36ol4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxJm5cmf8UQXUUVjWh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgwAuySIDzFI1lyx2NN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwkPjlTScz2Yn_XgQd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyMBK_LdH17XO8Qaah4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzs4fUmt_E3je34Pf14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
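The Coding Result table above is read straight out of this JSON array: each record carries the comment ID plus the four coded dimensions. A minimal sketch of how such a response might be parsed into per-comment rows, assuming only the record shape shown above (the `parse_coding_response` helper is hypothetical, not part of the project's actual code):

```python
import json

# The four coding dimensions present in each record of the raw response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response (a JSON array of coded comments)
    into a mapping from comment ID to its coded dimensions."""
    coded = {}
    for rec in json.loads(raw):
        coded[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

# Example using the first record from the response above.
raw = '''[
  {"id": "ytc_UgwDW3MEKUQRg5cYVQt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''
coded = parse_coding_response(raw)
print(coded["ytc_UgwDW3MEKUQRg5cYVQt4AaABAg"]["responsibility"])  # → ai_itself
```

Keying the result by comment ID makes the lookup-by-ID view above a single dictionary access, and a malformed or truncated model response fails loudly in `json.loads` rather than producing a partial table.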