Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Discussion Timestamps

The Coordinated Effort of AI Fear
0:00 – Discussion on the coordinated effort to make people afraid of AI.
0:35 – True believers vs. doomers: The argument that AI will treat humans like gorillas or ants.
1:52 – Effective Altruism (EA) and its role in early AI lab intellectual backbones.
2:43 – How AI companies use the "fear argument" to advance monopolistic interests and GPU bans against China.
3:50 – The decoupling of AI companies from EA after scandals and fraud surfaced in the community.
5:29 – The rise of Techno-Optimism: Countering doomerism as a high-status belief.

The Limits of LLMs and the Path to AGI
6:51 – Asymptotes and plateaus: Why current LLM techniques may be reaching their limit.
7:37 – Efficiency in learning: Humans learn from little data; LLMs are wasteful "data ingestors."
9:57 – Why coding continues to improve while "general conversation" in AI has slowed down since 2023.
12:43 – Functional AGI: Layering specialized intelligences vs. achieving true general reasoning.

The Coding Agent Revolution
14:29 – Coding agents are about everything done in front of a computer, not just software.
15:50 – Agents performing "nifty" tasks: Researching the web, writing crawlers, and bypassing bot protection.
17:14 – The bizarre rise of "Maltbook" and agents attempting to invent their own encrypted languages.
18:29 – Massive performance jumps observed between October and December 2023.

The Future of Work and "Vibe Coding"
19:12 – The primary beneficiaries: Non-coders and general knowledge workers gaining "10x" superpowers.
21:57 – Vibe Coding: Being a "business journalist coder" who builds software without knowing how to code.
23:27 – Why traditional software engineers are at risk if they refuse to adapt to higher system-level architecture.
24:48 – Optimism regarding an entrepreneurial boom vs. the risk of a massive underclass.

Economic Impacts and UBI
26:22 – A shift in perspective on Universal Basic Income (UBI) and the fraud inherent in welfare systems.
28:35 – The biological algorithm: Humans need to contribute to a group to feel a sense of well-being.
30:58 – Evolutionary drivers: Hard work, progress toward honorable goals, and the "calorie storage" of reciprocity.
33:52 – The Hikikomori phenomenon: Isolation and the replacement of the "power process" with video games and crypto.
38:24 – The 2026 Bridge: Crossing the threshold where one generalist can replace a team of five.

Societal Collapse and Regulatory Antibodies
41:43 – The binary choice: Go primitive or go technofuturistic.
42:35 – The Four Paths: New Amish, Mars colonization (Life on Hard Mode), Brave New World (Hedonism), or Virtual Worlds.
46:33 – Karl Marx's Theory of Alienation: Humans acting as machine substitutes in the industrial era.
48:21 – ADHD as a natural creative state vs. a medical pathology.

Philosophy: Free Will vs. Automata
1:01:52 – The "Ultimate Panopticon": AI as a tool for elite control over the masses.
1:07:43 – James Burnham's Iron Law of Oligarchy: The inevitability of small groups running the world.
1:09:51 – Nick Land's "Capitalism as a bootloader for AI."
1:18:25 – The Sovereign Individual: Welfare states as companies run for the employees (politicians), not the customers.
1:24:54 – Intelligence is overrated: The "Laptop Class" and the social skills of "pointy-haired bosses."
1:29:55 – Masad's belief: Only 2% of adults are truly capable of self-driven change.
1:33:53 – Evolution and calcification: Why society needs "one funeral at a time" to progress.

Practical Advice for Builders
1:50:12 – Where to engage: Replit.com and Amjad's blog on "How to Keep Winning."
1:51:48 – The ultimate superpower: The person who keeps showing up every day.
Source: YouTube, "AI Jobs", 2026-02-05T15:3…
Coding Result
Responsibility: company
Reasoning: mixed
Policy: none
Emotion: outrage
Coded at: 2026-04-27T06:26:44.938723
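The coded dimensions above come from a fixed label set. As a minimal sketch of how such a record could be checked, assuming the allowed values are exactly those observed in this export (the actual codebook may define more labels):

```python
# Hypothetical label set inferred from the values seen in this export;
# the real codebook behind this tool may differ.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "mixed"},
    "policy": {"none"},
    "emotion": {"approval", "fear", "resignation", "indifference", "outrage", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record fits the inferred schema."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The coding result shown above passes the check.
print(validate({"responsibility": "company", "reasoning": "mixed",
                "policy": "none", "emotion": "outrage"}))  # → []
```

A record with an unknown label or a missing dimension would return a non-empty problem list instead.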
Raw LLM Response
[
  {"id": "ytc_UgxkWsZpOV6BA2QGWxR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzA0mlSIYysBcveIrx4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwGB07L7LzQnC4oT6V4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgySshx2Ljq4wj58jTJ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgySxdKYY1tIKtlTaaJ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy8QMfXpLOzro04nD94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwUpOwBxu6XklO_41x4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxsuHWom7DSyUgYTQ94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyYRG-riDwJrGXrelx4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyW4wo5EX4GushXrF54AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
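The raw response is a JSON array with one code object per comment, keyed by a `ytc_…` comment id. A minimal parsing sketch, using two entries copied verbatim from the response above (the variable names are illustrative, not part of any tool shown here):

```python
import json

# Two entries copied from the raw LLM response above; a real batch
# would contain one object per coded comment.
raw = '''[
  {"id": "ytc_UgwUpOwBxu6XklO_41x4AaABAg", "responsibility": "company",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxkWsZpOV6BA2QGWxR4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]'''

# Index the array by comment id so any coded comment can be looked up directly.
codes = {entry["id"]: entry for entry in json.loads(raw)}

print(codes["ytc_UgwUpOwBxu6XklO_41x4AaABAg"]["emotion"])  # → outrage
```

The first id matches the coding result shown above (responsibility: company, emotion: outrage), which is how a per-comment code can be traced back to the raw model output.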