Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Here's the thing: we've plateaued. There is no real growth anymore: just incremental "improvements" and an endless series of hype bubbles each stupider than the next. Our infrastructure is failing, our population is descending into poverty, our government being a shitshow would be improvement at this point, and our trade is collapsing. America is failed state that just doesn't know it yet because we can all still sorta afford food as long as our credit cards hold out. When the AI Bubble pops it's all going to come crashing down. Honestly? At this point I want to see it happen only because maybe this time it'll give us the impetus we actually need to eat the 1%. All of them. Even their children. Leave their bones bleaching in the sun as warning to the 10% to never step the fuck out of line again. Because the other option is that they win and put us all out of work and the BEST we can hope for is a barely adequate UBI backed by the cheapest possible food, clothing, and housing to keep us alive while the top 20% or so get an AI-fueled utopia, and any attempt for us to better our lot will get met by ruthless crackdowns supported by a security state of staggering scope. And that's if we're lucky. It's just as likely that they'll decide we're a bunch of useless eaters and, well . . . if we're lucky they'll kills us relatively painlessly. Maybe sterilization as a requirement to received the UBI so we just die off in a couple generations. Look, the current AI tech isn't going to give them a utopia but they're going to keep pushing for it because the potential gains (for them) are too great. They're looking at their Star Trek future and they'll keep going and going, plowing infinite money into it until it succeeds. Eventually they're going to win. Unless we eat them. Of course, there's a reason they're plowing so much money into the Security State and training the cops in urban warfare.
youtube AI Jobs 2025-12-24T14:5…
Coding Result
Dimension       Value
Responsibility  government
Reasoning       unclear
Policy          unclear
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgzMcjt-AFfQbTmjGLl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyOJvLbt_HhA0GEL3J4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzOueId-bnZp4Ln4a54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwk_LfT2KgXnqDz0x54AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxQo-a2ZTrXKbSzy6d4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugw3_7z7bROJzq3eenx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxMzdM1qkyphdEJq6V4AaABAg", "responsibility": "government", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzBPzpX4C65tTnZzNx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxYfH8n_49pSnBeli94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwPOkTUkbf_2O-07254AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]
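The coding table above corresponds to one entry in this raw batch response, matched by comment id. A minimal sketch of that lookup, assuming the JSON shape shown (the variable names here are illustrative, and only two entries are reproduced for brevity):

```python
import json

# Two entries from the raw batch response above, as a JSON string
raw = (
    '[{"id":"ytc_UgxMzdM1qkyphdEJq6V4AaABAg","responsibility":"government",'
    '"reasoning":"unclear","policy":"unclear","emotion":"outrage"},'
    '{"id":"ytc_UgzBPzpX4C65tTnZzNx4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)

# Index the batch by comment id so any coded comment can be inspected
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Pull the coding shown in the table above
coding = codings["ytc_UgxMzdM1qkyphdEJq6V4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # → government outrage
```

The id `ytc_UgxMzdM1qkyphdEJq6V4AaABAg` is the entry whose dimensions (government / unclear / unclear / outrage) match the Coding Result table.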