Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Let me tell you AI is everything a human brain is but 1000 if not much more capa…" (ytr_Ugz8xU6P4…)
- "Actual Legend. I've always said A.I should stand for Artificial Idiot. and if a …" (ytc_UgyYC-gqD…)
- "I asked Grok 3 to make me an image I wanted...Grok 3 told me no...then gave an e…" (ytc_UgxtDbdG9…)
- "The Robot is thinking, "who the hell is this dork and why is he on stage with us…" (ytc_Ugz6zPLrs…)
- "Like: Sorry to burst your bubble, but we ain’t consider that ai generated image …" (ytc_UgxrwuABh…)
- "Sam's argument doesn't really hold up to more presentations he gives. Artists (c…" (ytc_Ugwtcqv-b…)
- "the quick explanation to as why ai "art" is stealing and not taking inspiration …" (ytc_UgwzClY4x…)
- "After complaining that the big AI huys don't stop, he gets the opportunity to pu…" (ytc_UgzkRZK1X…)
Comment
I would go even one step further and claim that we will become the robots. Like you said, they won't get humanoid robots to work cheaply, so we will do the manual work - that an AI tells us to do. Maybe even with nice AR glasses, so the AI can tell us exactly what to do and we don't even need to think.
youtube · AI Jobs · 2025-08-29T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzpPWHuBwtrK5Q885h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzsWNYJzeyBZNqsJyN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJ6QXBlMUDa3T58sJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxeWH4OSYdZ9fzzK5F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyn9ykPlnRpeFNc3Zp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwWABZGAh3dGPZqq7t4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzMa8UTw_WGDJ1AnHp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxahWsCECMivLQcMad4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxJwvMbyJEIXDJmwSF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz5rkr0aRHXc-e7H1d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
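The "look up by comment ID" view above amounts to parsing the raw model output and indexing each coding record by its `id` field. A minimal sketch of that step is below; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown above, while the `index_by_id` helper and the two-record sample payload are illustrative, not the tool's actual implementation.

```python
import json

# Abbreviated stand-in for the raw LLM response: a JSON array of coding
# records, one object per comment, keyed by YouTube comment ID.
raw_response = """
[
 {"id":"ytc_UgzpPWHuBwtrK5Q885h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugyn9ykPlnRpeFNc3Zp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and key each coding record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
# Inspect the exact coding for one comment ID.
print(codings["ytc_Ugyn9ykPlnRpeFNc3Zp4AaABAg"]["emotion"])  # fear
```

Building the dict once makes repeated ID lookups O(1), which is what a "click to inspect" view needs; a malformed model response would surface immediately as a `json.JSONDecodeError` at parse time rather than during lookup.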