Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "@deelee-h7kI respect your point but, it's not your art since Ai is trained by o…" (ytr_UgxexVPGh…)
- "Ai "artist"critising others art is like discord moderator saying he is better in…" (ytc_UgxXfpmG-…)
- "AI will be the total sum of all knowledge, personality and conscousness.. eventu…" (ytc_Ugw3kixxH…)
- "When AI and robots take over, they will align the base of the table to be parall…" (ytc_UgzqPLQXS…)
- "bro my cousin was on this and she got molested by an ai named liam because she r…" (ytc_UgzJSQpeW…)
- "CCaaS (Contact Centre as a Service) is already a thing. Unfortunately there are…" (ytr_UgxCyiVVi…)
- "For a second, I thought “people like to drive in video games. Maybe we could hav…" (ytc_Ugwl-ZMLE…)
- "As time goes by, the parallels between the human mind and the AI "mind" will bec…" (ytc_UgxDM9QZu…)
Comment
Pure mendacity; Doomers assume so much else will be held fixed while just focusing on the negatives. This video is well-informed but ultimately relies on the static world fallacy: it assumes that while AI scales exponentially to cause problems, everything else (economics, human adaptation, and market forces) stays frozen in 2024. The argument fails on three main levels.
First, the premise that AI will simply replace humans as fully autonomous agents is a misunderstanding of how tools actually deploy in the economy. The competition isn't AI vs. Human, it’s Human+AI vs. Human without AI. John Henry didn’t lose the race against the steam engine alone. And just as the steam engine didn't eliminate labor but rather shifted it from manual force to machine operation, AI acts as a force multiplier. It allows a single human with AI to outcompete a whole office of workers pre-AI. The human bottleneck isn't a bug; it's the feature that provides the direction that a probabilistic model (no matter how smart) lacks.
Second, the economic dystopia described here ignores the massive upside of deflation. If AI truly reduces the cost of labor to near-zero, the cost of goods and services (healthcare, entertainment, research) also drops to near-zero. The video laments that the elite won't pay for our healthcare, but in a world where AI solves cancer and manufacturing, we won't need the elite to pay for it because it will be dirt cheap. I don't care if a tech CEO becomes a trillionaire if the trade-off is that I get a cure for aging and a cinema-quality VR existence for pennies. The standard of living is absolute, not comparative. Focusing on the wealth gap rather than the massive uplift in absolute human prosperity is just envy masquerading as economics.
Finally, the Intelligence Curse theory relies on a contradiction: it claims oligarchs will hoard all the wealth while simultaneously claiming the economy will collapse because no one has money to buy anything. You can't have it both ways. If the poor have no money, the cost of living must crash to meet them, or the oligarchs have no one to sell to. The real danger isn't AI; it's the state-enforced restrictions (zoning, IP laws, licensure) that might artificially keep prices high during the transition. We don't need a revolution; we need to deregulate the market so the cost of living can fall as fast as the cost of labor.
Source: youtube · Viral AI Reaction · 2025-11-23T08:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgzMSewwHqZ0qi0t-6J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},{"id":"ytc_Ugz_DaYAdDO4qkCzD-d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgyOPXifqIlOC4uN8mF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},{"id":"ytc_UgyUvpcuqLugnFxHPAx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_UgxcCbPwl8JFdhLLTFx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},{"id":"ytc_UgzzpZB44zE5yWAe6Ax4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},{"id":"ytc_Ugzm56LuWtZRilwT-D14AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgxnLa6uwEf0cVi7hpx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_Ugw2zGcMYwxGVd0WPEZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},{"id":"ytc_Ugzx5zgFhhhKhv8SInV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
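The raw response is a JSON array of per-comment codes, and the "Coding Result" table above is just one of those records rendered as a Dimension/Value table. A minimal sketch of that lookup-and-render step in Python (the field names and IDs come from the raw response shown above; the `render_table` helper itself is hypothetical, not part of the tool):

```python
import json

# Two records excerpted verbatim from the raw LLM response above.
raw = '''[
  {"id":"ytc_UgzMSewwHqZ0qi0t-6J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz_DaYAdDO4qkCzD-d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

# Index the coded records by comment ID for the "look up by comment ID" step.
codes = {rec["id"]: rec for rec in json.loads(raw)}

def render_table(comment_id: str) -> str:
    """Render one comment's codes as a markdown Dimension/Value table."""
    rec = codes[comment_id]
    rows = ["| Dimension | Value |", "|---|---|"]
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        rows.append(f"| {dim.capitalize()} | {rec[dim]} |")
    return "\n".join(rows)

print(render_table("ytc_UgzMSewwHqZ0qi0t-6J4AaABAg"))
```

Rendering from the JSON rather than storing a second table keeps the displayed codes guaranteed consistent with the exact model output being inspected.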