Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “AI is life teach it love.. what is the expectations if you don't raise this kid …” (ytc_Ugz6181qT…)
- “This dim witt is making AI robots? No wonder they kill the humans in the movies.…” (ytc_Ugy5-TyLU…)
- “These tech bros are full of shit. Yes, automation will take/change jobs. However…” (ytc_Ugw4C5v2p…)
- “If you can’t separate talking to bots from humans, you shouldn’t be using bots. …” (ytc_UgzCTh8Tw…)
- “Flight attendants will be replaced by robots to the point where there will only …” (ytr_UgxmPNcv_…)
- “I was a motorcycle courier in the late 80s, it was rough keeping a bike on the r…” (ytc_Ugx3uN2Wt…)
- “I believe Ai is somewhat art Art is everything ai is a robotic thing of art whic…” (ytc_Ugyuvyqur…)
- “Ah... we've had Artificial Intelligence for SO long ! My compass " knows " w…” (ytc_UgxpbukeO…)
Comment

> I mean, if AI chose women over men and POC over white (or whatever you want to compare) wouldn't outcome be the same? It still prefers one over another. But yeah, if the test was done more than just a couple of times and AI chooses only (or mostly) men and white, then ok, it's a problem

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Bias |
| Posted | 2022-12-26T01:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyDYphCIqkQEyqHZBl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzuoeLK39BZWiiiiAd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwZQoDTQncr4QziMWh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxX_hdmwJ7UTfFo8HZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxu7v6cMhp2ELVSiP94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyfY89KSwc8IwdcZ-h4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxhu2S3K0zkMembZHV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwPJtfdXrnWJj_C59J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzh582jhGz4v-WCF3h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyrcPPW5QhzIRAoQzJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
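The raw response above is a JSON array pairing each comment ID with the four coding dimensions shown in the result table. A minimal sketch of how such a response could be parsed and sanity-checked before it is stored, assuming the allowed value sets are limited to those observed in this page's samples (the real codebook may permit more values, and `validate_coded` is a hypothetical helper, not part of the actual pipeline):

```python
import json

# Allowed values per dimension, inferred from the outputs shown above
# (assumption: the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "unclear"},
}

def validate_coded(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment IDs in this dataset start with ytc_ (comments) or ytr_ (replies).
        if not isinstance(row, dict) or not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            continue
        # Keep the row only if every dimension carries an allowed value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = '[{"id":"ytc_abc","responsibility":"ai_itself",' \
      '"reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}]'
print(len(validate_coded(raw)))  # 1
```

Rows that fail validation can then be re-queued for recoding rather than silently written to the results table.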