Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "No one will need a job because everyone will get everything they need for free o…" (ytc_Ugw92U4oc…)
- "I am going to be completely honest here, I am scared for the future. My primary …" (ytc_Ugzr1PQSy…)
- "Thank YOU Sam, for voicing this out! It has annoyed me so much this whole time~…" (ytc_UgxYNheNL…)
- "YOOOO. AI stans are sooo right... Why should I bother training for speedruns in …" (ytc_UgzQacyjQ…)
- "So relatable like I’d be lucky to find a beginner artist who doesn’t use ai art…" (ytc_UgzVupWAE…)
- "*This made me think of a classic Buddhist question: where is the self? If I l…*" (ytr_UgxvOe6qF…)
- "I used to like shadiversity and his historical content but those dogshit takes I…" (ytc_UgwZYsg01…)
- "Nope, we've been in recession for a while now, AI is just glossing over a falter…" (ytc_UgxYZWu_A…)
Comment
10:10 this is a very temporary argument. Humans are not really different from AI other than the material we're made of, the amount of it, and how we evolved to arrange it. There is theoretically nothing stopping us from creating an exact equivalent of a human (in fact, we already have the mechanisms for that, called reproductive organs, except we don't know how it works and how to control it yet). There is no such argument as "AI is bad because it's not human", only "AI remains overall simpler and less useful than a human (for now)"
youtube · Viral AI Reaction · 2025-08-30T08:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwjX1HNvCP3EL_K0ql4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyotSIauK36RRszEIp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw4DBS5nxGQMW7oNt14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"confidence"},
  {"id":"ytc_UgwyTgmW2vYEysaqFVB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwNBvezXH09vJzBxAF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugww6EFFHwb-5QEuz5t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwVl__muuLODiBMpsR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzJFAUGzdEiQQFclTd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgycwV9yWjcGDJ5ck2F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzU82MumC9cNGlTsxh4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
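A response like the one above can be validated and indexed by comment ID before it is merged into the coding table. The sketch below is a minimal illustration, not the tool's actual implementation; the allowed label sets are inferred only from the values visible on this page (the real codebook may define more categories), and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Allowed labels per coding dimension — inferred from the values seen
# in this dashboard; the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"outrage", "approval", "confidence", "indifference",
                "resignation", "fear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records) and
    return a dict keyed by comment ID, rejecting any record with a
    missing or out-of-vocabulary label."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: bad {dim} value {rec.get(dim)!r}"
                )
        coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded
```

Keying by ID makes the "Look up by comment ID" view a plain dictionary access, and failing loudly on an unknown label catches malformed model output before it silently pollutes the coded dataset.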