Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I haven't seen any AI art in my style but I've been sharing it for 8 years on so…
ytc_UgwqH0W4q…
Well, there is that too. AI just piles on more to worry about. Like we need a bi…
ytr_UgxorcJT8…
You write with an accent of an LLM. --- My conspiracy theory is that all these C…
rdc_o5shry1
First off your not wasting your money, your paying for a beautiful product made …
ytr_Ugy1sEZxI…
1:58 - Not yet, my investments could still grow.... You ever notice that the on…
ytc_UgwY03rLg…
I'm a translator and this is exactly why I refuse outright to use AI. I translat…
rdc_ofmqd2o
AI doesn't get historical context, that's why some of this stuff gets by the gua…
ytc_UgwFSzwFK…
What you said makes no sense. The concept of inspiration has nothing to do with …
ytr_Ugz5OegCZ…
Comment
Watching this, I felt genuinely sick to my stomach.
Look, I agree that Big Tech should be hold to account, that AI systems are being adopted way too quickly, and that we need to do whatever is necessary to make AI systems less prejudiced against marginalized groups (which is largely due to their training data), but to make wide-sweeping statements like "robots/AI will never become sapient" (you can't possibly know that!), "robots will never be an oppressed class", and "at the end of the day robots are just property, they are things that we make, and they're not something that is even worthy of debate when it comes to rights" is absolutely morally abhorrent. It reeks of human supremacy and biological essentialism. And should robots/AI ever become sapient, the narrative that your video essay is fueling will be used to justify their oppression.
youtube
2025-09-17T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxIv2oGu7IX3fQvPZB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyl4G-jGEdCQzo4XMZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz1cULk-hCNllwF-I94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwlDgUMtv7OFu7Wjy94AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyZKdHOoNAnsdBIsY94AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwpFjG4Yx3OpJ1AYdB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugzafm-AWyuaeEPr7Ul4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwYz4tjY38AMeGJOQt4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxrxzx-4gqEi2z14KJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwa0S2xVOwHAd8hSMF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
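
The raw response above is a JSON array with one record per comment, each carrying an `id` plus four coding dimensions. A minimal sketch of how such a response could be parsed and sanity-checked before loading it into the coding table — the allowed label sets below are inferred from the sample records shown here, not from the actual codebook, so treat them as assumptions:

```python
import json

# Label sets inferred from the sample responses above (assumption:
# the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"company", "developer", "government", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records
    whose every dimension uses a known label."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip malformed entries rather than failing the batch
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Dropping (rather than raising on) off-schema records lets a batch of ten codings survive one hallucinated label; the rejected `id`s could then be queued for re-coding.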