Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- @MVersusP It’s quite interesting to me how he didn’t connect the dots, because he… (ytr_Ugw37CTH4…)
- As a black guy ,i tried it and it worked ,chatgpt is racist af ngl… (ytc_UgzjPZ_Li…)
- So even if everything he said was correct. And that's a big IF "He says he wasn'… (ytc_UgyBM-U4O…)
- I'm fine with AI taking away jobs. Just means these companies will have to accep… (ytc_Ugz9dpeVd…)
- ai= its fun today but trust me it will be usefull someday. better not spent tri… (ytc_UgygXbyLe…)
- Someone needs to investigate the funding for these projects. 99% chance the US G… (ytc_Ugz9BwKED…)
- Elon is butt hurt that he isn't in the ai ponze bubble and tiny hat men gonna ta… (ytc_Ugx_julfw…)
- When I ask AI to generate dictators playing basketball on the Moon, either it ca… (ytc_UgyuAT0Q1…)
Comment
Biased dataset? Men work longer hours than women, do more overtime, do more self-studying in areas that influence performance than women, work more years, take less time off work. So AI tasked with finding better employee will obviously favour men as men are obviously (by objective data) on average more profitable employee. BTW. example of AI prefering whites over coloured people refers only to blacks, not to east Asians. ANd for obvious reasons AI fed data from US would prefer whites. Blacks are only 13% population but commit over 50% of murders and majority of violent crimes... Hard to call AI or dataset biased.
youtube
AI Bias
Posted: 2022-12-22T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzBgq84hAC8XJ7aTu14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxgBQKRuRRbF03YVhp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx1YaQHzD0C140Mh954AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwit7ej6L59J5fai894AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzlgmnh92fWHNcNhMZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwTeFzi5k0mdJj7mVB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyINQKr_hQoIEj9dXF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwRCA44vT7-1Ma5C-l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzcxzPiFBVymj2CU6t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz9UmD0BLjhJFNqD1p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
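The lookup-by-ID view above can be reproduced offline from a raw response like this one. The sketch below parses the JSON array, checks each record against the value sets observed on this page (the real codebook may define more categories — these sets are an assumption), and indexes the records by comment ID:

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# codes visible in this sample, not from the full codebook.
CODEBOOK = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "outrage", "fear", "indifference",
                "resignation", "mixed"},
}

def index_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments),
    validate every dimension value, and index the records by comment ID."""
    indexed = {}
    for rec in json.loads(raw):
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
        indexed[rec["id"]] = rec
    return indexed

raw = ('[{"id":"ytc_Ugz9UmD0BLjhJFNqD1p4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
coded = index_raw_response(raw)
print(coded["ytc_Ugz9UmD0BLjhJFNqD1p4AaABAg"]["emotion"])  # approval
```

Validating before indexing is what makes a "Coded at" record trustworthy: a malformed LLM response fails loudly at ingest rather than surfacing later as an impossible dimension value in the inspector.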