Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Wasn't this guy hyping up AI and saying it was the future in his other video upl…
ytc_UgwXqKcgB…
What’s crazy is how much of a contraption the Waymo looks and they STILL crash g…
ytc_UgwnONBem…
there is a chance that AI can eat itself to the point of unusablity by processin…
ytc_Ugzyiz_Ut…
Indeed, the job of exorcist priest will be the only thing that AI will not be able to…
ytr_UgzIvPSlH…
1. It's not PURELY driven by corporate greed.
If your competitor is using AI e…
ytc_UgzdLAKkJ…
>Why isn’t it possible to create a system where everyone has some work, with …
rdc_gsqvcuk
3:28 ngl the original was far prettier and stylish than the ai version.
Origina…
ytc_UgweNPELL…
The beginning of an android A.I like commander Data from star trek how cool is t…
ytc_Ugwbj1Djx…
Comment
The point about resource economies is one people forget even today.
There are literally people in Canada who want to throw all the knowledge based industries out and focus solely and only on oil and gas extraction.
They don't see this as a losing bet, even though it definitely 100% is (for most people).
In fact, Canada did do a bit of this under Harper and it led to deindustrialization and a loss of jobs and sectors that we still haven't recovered from (deindustrialization is easy; reindustrialization is basically impossible).
---
I also find it interesting that people used to be hopeful for the future and for AI that could do everything we want it to do.
But now the sentiment has largely flipped. And it's mostly older people (who are already wealthy and don't care) or right wing people (who want to see AI "hurt the people they hate") who support it cannibalizing the economy this way.
I genuinely believe there will be some jobs AI will never do, either because it can't ever be as effective at it (we know, for instance, that human connection is essential in healing, so human doctors and caregivers will always be required) or because it's simply not worth the cost to invest in AI at that low a level (we don't see a lot of "AI laundry folders" or "AI garbage collectors" etc, because humans are cheap labour and getting cheaper).
I also believe that there will always be some low-level human labour, if only because power means nothing without the ability to lord it over someone else.
We already see this with things like WFH where companies demanded employees go back to the office. WFH was cheaper, more efficient, and actually an overall benefit to employers. But that wasn't enough, because managers also want to "control" their employees (and they can more easily feel that if they're standing there in the same room).
---
The last thing I wanted to point out is that it's not that the govt is going to be beholden to AI companies because of "tax revenue".
The govt doesn't *need* your taxes. Taxes largely exist to create demand for the $ (and for social outcomes, like wealth redistribution, that most people think it shouldn't do).
The reason the govt would cater to the AI companies is the same as the reasons we see today. Because the govt is, in the end, run by people. And those *people* (who do need money) are getting their palms greased by the wealthy to engineer policy that's against the will of everyone else.
Ultimately, the threat isn't AI.
It's the economic system. It's capitalism. And you did mention this, but I feel like the points above end up muddying the waters a bit on it.
youtube
Viral AI Reaction
2025-11-28T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz6-P0466k4YqRVS0V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySJRLUZlxZKsBjPrt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy77pqex9Nj-r4FvQR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxTfM8gJdPHDbwXvRx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxG9-LAHoAv2tQUnhR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyUAcIcXYgPdX46EMJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxzgrOTzb52nYya3dp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwze2WPlO4or25jmBh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw7oxJ3A74zVVTZXE14AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzYbtJ7yE78BvNd3CZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"}
]
```
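The raw response is a JSON array of coded records, one per comment, with the four dimensions shown in the Coding Result table. A minimal sketch of how a "look up by comment ID" view might parse and index such a response (the `index_by_id` helper is hypothetical, not part of the tool; the two sample records are copied verbatim from the response above):

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
RAW_RESPONSE = """[
{"id":"ytc_Ugz6-P0466k4YqRVS0V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySJRLUZlxZKsBjPrt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]"""

# The four coded dimensions listed in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and index each record by its comment ID,
    rejecting records that are missing any expected dimension."""
    records = json.loads(raw)
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {missing}")
    return {rec["id"]: rec for rec in records}

coded = index_by_id(RAW_RESPONSE)
print(coded["ytc_UgySJRLUZlxZKsBjPrt4AaABAg"]["policy"])  # prints "regulate"
```

Indexing by ID is what makes both the random-sample inspector and the ID lookup cheap: one parse per batch, then O(1) retrieval per comment.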