Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Don't believe this BULLSHIT! AI is NOT in control of anything. AI makes mistakes…" (ytc_Ugzjugr0M…)
- "They are getting closer! Her mind can be an a.i, thats actually preferred since…" (ytc_Ugzj_-MJS…)
- "He wasn’t arrested by facial recognition he was arrested by a human. Who wrote t…" (rdc_h55mr2f)
- "None of the AI software engineers that I know personally. Think that there's a m…" (ytc_UgzUkkp1C…)
- "Honestly I'm getting tired of hearing about how scary AI is. The world could be …" (ytc_UgwDizRUk…)
- "Here is a crazy idea. Just use nuclear energy. It's better for the economy and m…" (rdc_ibdnco6)
- "You’re still drawing the picture. You have to put time and effort into it. With …" (ytc_UgxtyD7Fe…)
- "things are happening so fast that there seems to be no time to think about right…" (ytc_UgynNzGh1…)
Comment
In some ways, I think the more concerning situation isn't if most of the jobs are replaced by AI, but if only 10 to 20% are. At that point, it'd be really easy for there to be this societal adjustment that is just like "wow these folks are too dumb and lazy to get a job!" If 80% of the jobs are taken by AI, then I think it's clear that something HAS to be done.
I don't think we have enough resources to make enough robots to do all the jobs. I think we will eventually reach a point of diminishing returns where it doesn't make sense to do certain things with AI. But also, this entire thing is going to be incredibly ugly. Especially so for folks like us who are already struggling to find a job.
Source: youtube
Video: Viral AI Reaction
Posted: 2025-07-22T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwV9m27cLurI8Iul1V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzLYGDgmJU5-Xqr3I14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxcN0hlLnLTVjSOMMF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwL0x_aXYfC_SGdx914AaABAg","responsibility":"society","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyYg1w4IhI-J3w17hV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxycQrpRvL85yKwfFt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyzGarQtY9Hny8Pu9B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxOAdGukuOJkiJl0mR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz2HbnWelwYDkG3BcJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyLw62iJDkISM9trxJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
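A raw response like the one above can be checked mechanically before its codes are written back to the dataset. Below is a minimal validation sketch in Python; the field names come from the response itself, but the allowed category sets are inferred from the values observed in this sample (the project's full codebook may include values not seen here).

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the real codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"none", "government", "ai_itself", "society", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval", "mixed", "unclear"},
}

def validate_coding(raw: str) -> list[str]:
    """Return a list of problems found in a raw LLM coding response."""
    problems = []
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"response is not valid JSON: {exc}"]
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append(f"record {i}: missing 'id'")
        for field, allowed in ALLOWED.items():
            value = rec.get(field)
            if value not in allowed:
                problems.append(f"record {i}: {field}={value!r} not in codebook")
    return problems

# A well-formed record passes with no problems reported.
sample = ('[{"id":"ytc_UgwV9m27cLurI8Iul1V4AaABAg","responsibility":"none",'
          '"reasoning":"mixed","policy":"unclear","emotion":"indifference"}]')
print(validate_coding(sample))  # → []
```

Responses that fail any check (malformed JSON, a missing `id`, or an out-of-codebook value) can then be flagged for re-prompting rather than silently stored.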