Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I’d be cautious about treating this as deep analysis. The “Intelligence Curse” is a speculative essay by two early-career researchers published on their own website. It’s not peer-reviewed academic work. The essay contains no empirical data on actual AI displacement patterns, no labor economics research, no quantitative modeling, and no technical justification for their claim that AGI is “>90% likely in 1-20 years.” It was informally reviewed by people in the same EA/AI safety community, like Richard Ngo (who by his own admission couldn’t make progress on practical ML work at DeepMind and focused on philosophy instead). I work in the AI automation space and I’ve watched countless companies blame “AI transformation” for routine layoffs while their actual AI capabilities remain mostly vaporware. This essay takes those marketing claims at face value and builds a dystopian scenario on top of them. The resource curse is real in political economy, but applying it to hypothetical AGI without evidence is just speculation. Unless they find a path outside LLMs (which more and more AI researchers are claiming is a “dead end”) to AGI.
Source: YouTube · "Viral AI Reaction" · 2025-11-22T22:1… · 1 like
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_UgxcrIvXEk1yOdgo-U14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugwv1OXSDP6H8B7TPop4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"skepticism"}, {"id":"ytc_UgxwWC3jMRA5szL_l3t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgxjENX6Rh3x7ehj2u94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugz3CQg8NEmIV654p8N4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"}, {"id":"ytc_Ugyhq_NBSp5LvQS3DsJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_Ugw_ApDXYs7hCrwaSt14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugx1Cxwn2KKBtoMXOhh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_UgxCGy87yPcJvS24Jfd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgyWORiv6Kr25Gs_2fB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"} ]