Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Like you said, AI progresses suddenly in leaps, such as ChatGPT and competitors …" (ytr_Ugws_ERjL…)
- "No one wants to die. There's no guarantee that whoever takes over will spare us …" (ytr_UgwUSz7uv…)
- "I tried to use AI to help me develop an app, the code was outdated as hell, so I…" (ytc_UgzKCPHFy…)
- "And ai partnered companies have trasparently stated that they steal and use user…" (ytr_UgwIjM-fq…)
- "No technology in the history of the human species that has gone through the Gart…" (ytc_UgyjqX1Or…)
- "It's incredible to me, that this gentleman is so ignorant of history that he thi…" (ytc_UgwqBYS5C…)
- "If you think a human taking inspiration is the same thing as an ai spotting a pa…" (ytr_UgyVu4Nsy…)
- "I'm confused!! Imagine a time when AI takes over all service-based jobs and huma…" (ytc_UgxTO_1xK…)
Comment
> this video overlooks a very important fact(and also a litle less important one), the main fact is that to even make that 7 month improvements you need humans, ai cant create new things they can only created thing that were mentioned in training material.so if even clsoe to evryone is replaced then ai wont have any improvemtnsa and that wiil be the downfall, second and this is a limitation of curent AI(not models but architechtures ) transformers get increasingly resource incentive the larger the duration of the task, so at some point an ai of that size wont be more cost effective than a human, but a lot more expensive although SSm exist the quality needed for AI to even remotely coordinate on that scale is not achieveable without transformers, so hypothetically if this ever actually happens it will also end aa lot faster scale, so this prediction isnt viable
Source: youtube · Viral AI Reaction · 2025-11-24T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw4Fd0lYdCUw6ftBOZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzQdY-o33h3tNe0MQl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz9jDEha3jeKcXsGDF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw5vVowOItm4_LRwep4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyujAXqS7R4wIybDZp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzE4W3MuvDFhLLEyWp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxR3s_TaG57l9c4JZx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzG9Dq020n6SwRKpIh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwqLTkaaDCSJEZ_WQN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx0kWQRf06UsEZXTD94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"sad"}
]
```
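A raw response like the one above can be parsed and indexed by comment ID so that any coded comment's dimensions are retrievable. The sketch below is a minimal illustration, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the output shown here, but the allowed value sets are inferred only from the codes visible on this page and the full codebook may contain more values.

```python
import json

# Value sets inferred from the codes visible in this response;
# the real codebook likely defines more values (assumption).
RESPONSIBILITY = {"none", "company", "ai_itself"}
REASONING = {"consequentialist", "unclear"}
POLICY = {"none"}
EMOTION = {"indifference", "fear", "mixed", "resignation", "outrage", "sad"}

def parse_llm_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM coding response (a JSON array of records),
    validate each field, and index the records by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        # Reject records whose codes fall outside the known sets.
        if rec["responsibility"] not in RESPONSIBILITY:
            raise ValueError(f"bad responsibility: {rec}")
        if rec["reasoning"] not in REASONING:
            raise ValueError(f"bad reasoning: {rec}")
        if rec["policy"] not in POLICY:
            raise ValueError(f"bad policy: {rec}")
        if rec["emotion"] not in EMOTION:
            raise ValueError(f"bad emotion: {rec}")
        coded[rec["id"]] = rec
    return coded
```

With this index, looking up `ytc_UgzG9Dq020n6SwRKpIh4AaABAg` from the response above yields `responsibility: ai_itself`, `reasoning: consequentialist`, `policy: none`, `emotion: mixed`, matching the Coding Result table.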