Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Great food for thought! But?

Major Flaws in the "AGI Replaces Everything" Scenario

The narrative describes a total AI economic takeover, but several assumptions break down when examined against real-world limits. Here are the core weaknesses.

1. AI cannot do most physical jobs in the real world

The scenario assumes AI can replace all workers, but robots are not the same as AI models. LLMs and agentic software do not magically turn into:
- plumbers
- electricians
- gardeners
- builders
- warehouse loaders
- waste collectors
- care workers
- cooks
- drivers in chaotic environments
- mechanics
- mining crews

Robotics is limited by:
- dexterity challenges
- unpredictable environments
- maintenance requirements
- cost of hardware
- power consumption
- slow deployment cycles

AI can replace information work far faster than physical labor. The vast majority of society still runs on physical infrastructure that must be maintained by humans for decades. The collapse of that labor force would halt the very supply chains needed to build and operate AI servers.

2. Someone must buy the goods — markets collapse if consumers are broke

The scenario assumes:
- AI companies keep getting richer while humans lose income
- AND the economy keeps growing

This contradicts basic macroeconomics. If 20–30% of consumers lose purchasing power, demand collapses. Tech corporations (and their shareholders) rely on:
- food markets
- housing markets
- entertainment spending
- clean water systems
- consumer digital services
- device purchases
- transportation markets

If humans have no income, there is no profitable market, and the entire revenue stream of AI firms collapses. AI companies can't sell products to AIs — that produces no real profit or tax revenue.

3. Energy and materials: AI scaling is not infinite

AGI takeover narratives often assume unlimited compute growth. In reality, AI depends on:
- massive datacenters
- chips made from rare materials
- global mining operations
- fossil fuels or renewables
- cooling water
- supply chains with thousands of physical steps

These cannot scale indefinitely due to:
- energy constraints
- grid bottlenecks
- chip fabs reaching physical limits
- rare-earth scarcity
- geological depletion rates
- maintenance costs

A world with collapsing human labor cannot support the resource-intensive global infrastructure needed to run superintelligence.

4. Governments do not remain passive — mass unemployment is politically unsustainable

The scenario assumes:
- policymakers serve only AI corporations
- citizens remain passive
- democracy cannot respond

History strongly contradicts this. When inequality becomes extreme, societies destabilize:
- French Revolution
- Arab Spring
- fall of the Soviet Union
- end of apartheid
- dozens of labor uprisings
- countless populist revolts

When people cannot afford food, rent, or healthcare, governments are forced to intervene through:
- regulation
- taxation of automation
- social safety nets
- employment guarantees
- redistribution
- shutting down dangerous technologies

No democratic state can keep 25% of its population unemployed without risking collapse.

5. Capitalism punishes companies that eliminate their own customer base

Even the wealthiest corporations eventually fail if:
- consumers stop spending
- economic circulation breaks
- infrastructure degrades
- social unrest intensifies

Historically:
- monopolies get regulated
- extractive industries face backlash
- predatory systems collapse or get replaced

Tech corporations do not exist outside society — they require:
- stable markets
- functioning states
- living consumers
- workers (even if highly augmented by AI)

A total hollowing out of the human economy would destroy the environment that keeps tech wealth meaningful.

6. The "intelligence curse" analogy is misleading

The transcript compares AI to oil wealth (Venezuela, Congo, Nigeria). But oil rents depend on one asset controlled by an elite. AI requires:
- distributed global infrastructure
- human operators
- market demand
- political consent

It cannot be controlled by a small group in the same way oil rigs can be. The analogy breaks down at every level.

7. Social, cultural, and legal resistance always slows or redirects disruptive tech

The scenario assumes frictionless adoption. In reality:
- Unions resist
- Consumers resist
- Voters resist
- Courts intervene
- International treaties emerge
- Cultural norms shift
- Religious and ethical institutions push back

Technology adoption is never purely economic — it always passes through social filters.

8. The scenario assumes AGI improves exponentially without encountering the plateau that ALL technologies reach

Every historical technology experiences:
- diminishing returns
- saturation points
- physical bottlenecks
- regulatory bottlenecks

Even if AI grows rapidly now, exponential curves do not continue forever.

In Summary: Why the dystopia doesn't hold up

The scenario collapses under real-world constraints:
- AI cannot replace physical labor at scale.
- Capitalism cannot function if consumers lose income.
- AI systems depend on fragile supply chains requiring human workers.
- Energy and materials impose hard limits on scaling compute.
- Mass unemployment leads to revolt, regulation, or regime change.
- Wealth cannot accumulate meaningfully when markets collapse.
- Historical patterns show societies correct extreme inequality.

The fear comes from assuming AI replaces everything, instantly, without friction, and everyone else just accepts it. But economics, physics, politics, and human nature all push strongly against that outcome.
youtube Viral AI Reaction 2025-11-23T16:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          unclear
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugzn3eloyte0YFdZ9JB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwTRgH2wBN3qD7VU-V4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzjjhUjnbFZ3vRwz294AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyamiA35jBTWhhB_mV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxRI049jfxipEZlRrh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxFO4E0VggMlw41FyV4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwuScchFF0k292QThp4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyKVoixztdVs62L4at4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxGXs9QmxUMfdMe5fF4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzL93BVaQwYNWrqhLZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
]
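Note that the values in the "Coding Result" table (responsibility: none, reasoning: consequentialist, policy: unclear, emotion: mixed) match the last entry of this raw response, keyed by comment id. A minimal sketch of that per-comment lookup, assuming the raw response parses as JSON; the variable names here are hypothetical, only the ids and dimension keys come from the export above:

```python
import json

# Two entries copied from the raw LLM response above (abbreviated for the sketch).
raw_response = '''
[
  {"id": "ytc_Ugzn3eloyte0YFdZ9JB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzL93BVaQwYNWrqhLZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
]
'''

# Index the coded rows by YouTube comment id for direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for the comment shown in the table above.
coded = codings["ytc_UgzL93BVaQwYNWrqhLZ4AaABAg"]
print(coded["responsibility"], coded["emotion"])  # none mixed
```

The dict comprehension assumes every entry carries a unique `id`; duplicate ids would silently keep only the last row.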