Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Companies are always trying to make their products as cheap as possible to produce — and sell them for as much as the market will bear. So this optimistic theory that “people in the future will only work 20 hours a week and still have enough money to live their best lives with their families” — it’s just not realistic. The truth is, companies have no real incentive to reduce prices for consumers. If people work fewer hours, they’ll earn less money. And if AI replaces a significant portion of the workforce (which I strongly believe will happen), the real winners won’t be the average worker — it’ll be the same rich elite who always benefit from these shifts. Even if, as many AI CEOs claim, anyone will be able to start their own business using AI agents — who the hell is going to buy anything if 80% of humanity is out of a job? As for “new jobs” — let’s be honest. Humanity has never experienced this kind of technological revolution. Don’t compare it to the invention of the steam engine or the internet — this is something else entirely. There won’t be enough new jobs to absorb everyone. Let’s say AI wipes out 80% of current roles, and a few new job categories are created — then what? Everyone rushes into those same few professions. And we know how markets work: if your friend offers a service for $10, I’ll have to offer it for $7, and someone else will go even lower — maybe $4 — just to compete. But again: don’t forget — 80% of people might not have any income at all. This isn’t some utopia. It’s a nightmare
youtube AI Jobs 2025-06-28T01:2…
Coding Result
Dimension       Value
--------------  ----------------
Responsibility  company
Reasoning       consequentialist
Policy          unclear
Emotion         resignation

Coded at: 2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugzmj66HUXJxKeDn3J14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugz5yzsoerhUG5EF2vN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgygvYzOPLp0TZ1Ve094AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwQd7m4estI9erqnlJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxzHLOV9o8oLq97dEl4AaABAg", "responsibility": "unclear", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugx4qHPpdr0Xg4SZwwF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugx1uKy8hL3_Npkszll4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxjW9GDB4osA6A7dh94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgzxyMlxkeTJUr6lnsd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxNxCJt4FYJ1Rx8V4h4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"}
]
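To recover one comment's coding from a batch response like the one above, the JSON array can be parsed and indexed by comment id. The sketch below assumes the response is valid JSON exactly as shown; for brevity it uses only a two-entry subset of the ten entries (the ids and field names are taken verbatim from the array above, and which id corresponds to the quoted comment is not stated in the data).

```python
import json
from collections import Counter

# Subset of the raw LLM response above (the full batch has ten entries).
raw = '''[
  {"id": "ytc_UgygvYzOPLp0TZ1Ve094AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwQd7m4estI9erqnlJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"}
]'''

# Index the batch by comment id so a single coding can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw)}

coding = codings["ytc_UgygvYzOPLp0TZ1Ve094AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company resignation

# Tally one dimension across the (subset of the) batch.
emotions = Counter(row["emotion"] for row in codings.values())
print(emotions["resignation"])  # 1
```

Indexing by id rather than list position matters here because the batch order is not guaranteed to match the order comments were submitted in.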