Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record by its comment ID.
Random samples — click an ID to inspect:

- “Perfect way to explain this, very nice, i knew i could count on this artist to g…” (ytc_UgxWfyYy_…)
- “There will be a basic income solution as there is no other way because you are n…” (ytc_Ugj8s-p9l…)
- “If you genuinely care about your own art, and other artists art, stop using ai. …” (ytr_Ugy9Oc85L…)
- “People watch a few science-fiction movies and think that AI is always going to b…” (rdc_kvh5lo8)
- “This is why the world is seeing so many people hurting because of greed. So now …” (ytc_Ugy2SCu33…)
- “Actually the world is so disgusting that they had to put in protected groups due…” (ytc_UgzCQUZLG…)
- “@Redtornado6 if that’s the case, then answer the question. In your worldview,…” (ytr_UgzBw8oLX…)
- “😅 Grok said it was wild compared to other A.I until I pushed the limit and I was…” (ytc_Ugyphh5q8…)
Comment
Bernie, that’s not how it works. There’s always a transitional phase, and major advances usually create a wave of new, higher-skill, higher-paying jobs. Yes, “lights-out” robotic factories exist—but they’re rare and mostly limited to stable, narrowly defined processes. Real plants still need people for changeovers, QA, maintenance, and exception handling.
Take trucking: we can automate long-haul routes and yard moves, but last-mile delivery remains stubbornly human with current tech. Even if line-haul becomes largely automated, we’ll still need master drivers to supervise systems, technicians to recover and repair vehicles, and engineers to build and maintain the infrastructure. Jobs don’t vanish; they shift up the skill ladder, and legacy skills persist for contingencies like manually recovering a disabled truck. That’s a feature, not a bug—phase out repetitive, low-autonomy work and move people into roles that compound skills and earnings over time.
Where I think your view misses the mark is on what policy can do. A negative income tax (a universal, refundable floor that tapers as earnings rise) would let everyone share in AI-driven productivity while preserving work incentives. Pair that with portable training credits, rapid re-employment support, and pro-competition rules so concentrated platforms don’t bottle up the gains—and you get a future where robots cover basic needs while people learn, build, and choose the work they actually want.
The goal shouldn’t be keeping humans laboring forever; it should be decoupling basic security from a single job so people can move into better ones as the economy evolves—STEM, skilled trades for automation, field support, safety oversight, and the entirely new categories that always show up after a tech shift. That’s practical, optimistic, and achievable with straightforward, good-faith collaboration across the aisle.
| Source | Topic | Posted |
|---|---|---|
| youtube | AI Jobs | 2025-10-09T00:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz8oDCp3o_ZeeUNpol4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxzr4FTrgmbXLGKdKJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxEPiF5mzbrPho8EFF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwQRtYqnc3iOl_q-ah4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwt3mTFVUySCpx8sh94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxvcDXgC7-xUx_NCAt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzM392l4VwKRTDApAp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxVR3lafyXR1cpSMJN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwVQvCcpKBtZE0cx8J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxtvpZ4Y-EC17H61HR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
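A raw response like the one above can be parsed into an ID-keyed lookup and checked against the four coding dimensions before use. The sketch below is a minimal illustration, not the project's actual pipeline; the allowed value sets are inferred from the labels visible on this page and are assumptions, not the authoritative codebook.

```python
import json

# Allowed labels per dimension, inferred from this page's data (assumption,
# not the official codebook).
ALLOWED = {
    "responsibility": {"none", "government", "company", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"approval", "fear", "outrage", "resignation", "unclear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records) into an
    id -> coding map, rejecting records with missing keys or out-of-schema
    values."""
    out = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {rec.get(dim)!r}")
        out[cid] = {dim: rec[dim] for dim in ALLOWED}
    return out

# Usage with a hypothetical single-record response:
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist",'
       '"policy":"none","emotion":"approval"}]')
codings = parse_codings(raw)
print(codings["ytc_x"]["emotion"])  # approval
```

Keying by comment ID is what makes the "look up by comment ID" view above cheap: each coded record is fetched in one dictionary access rather than a scan over the batch.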