Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The scenario where AI actually succeeds in replacing human labor is, in fact, the worst one. The outcome would be nothing like what tech CEOs are selling us, quite the opposite. In capitalism (especially the deregulated kind), technological advances are used to perfect the exploitation of workers, not to liberate them. Corporations are not going to distribute the wealth generated by machines among the population, at least not in any significant way. The very reason they crave automation in the first place is to stop paying wages and increase profits. And they will use those enormous profits to lobby and prevent the state from carrying out redistribution through taxes. You can't expect private property to act as if it were public property. How can you think that a private corporation's profits will be socialized, rather than privatized? The post-scarcity, post-work society that Sam Altman and company promise is impossible under capitalism. They are literally describing a socialist utopia while trying to convince people it's a natural outcome of capitalism. The reality is that we would see a repeat of what happened after the Industrial Revolution, when technological progress that was supposed to ease workers' lives was used only to perfect their oppression. We are going to see a concentration of wealth and misery at Victorian levels. Fertile ground for radicalism.
youtube AI Jobs 2025-12-29T00:5…
Coding Result
Dimension        Value
--------------   --------------------------
Responsibility   company
Reasoning        consequentialist
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgwPysZf7PLdVIRBWiR4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "regulate",      "emotion": "outrage"},
  {"id": "ytc_UgzlLUtg-AT9NS_9bix4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "liability",     "emotion": "outrage"},
  {"id": "ytc_UgxigJJJsiTITTIXXQV4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",          "emotion": "resignation"},
  {"id": "ytc_Ugw3uVnyk7koP68YMjB4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_UgwsPGHlqrSHLHN1rRV4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgwYvHRnm8hZV2wRUad4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",          "emotion": "approval"},
  {"id": "ytc_UgwDBQ1mmYYlQRHMmPx4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "liability",     "emotion": "outrage"},
  {"id": "ytc_Ugy70aCVpzGaXQ6AVe94AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "regulate",      "emotion": "outrage"},
  {"id": "ytc_Ugx1rSfhwvZ-IUmBssN4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "none",          "emotion": "fear"},
  {"id": "ytc_UgytV06qhpmH1Gzuzx94AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate",      "emotion": "approval"}
]
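A raw response in this shape can be parsed and checked before the codes are stored. The sketch below is a hypothetical validator, not part of the actual pipeline; the allowed values per dimension are taken only from the codes that appear in this export, so the real codebook may permit more.

```python
import json
from collections import Counter

# Allowed codes per dimension, inferred from this export only
# (the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "resignation", "fear", "indifference", "approval"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown codes."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Minimal example with a shortened, made-up id:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
records = validate(raw)
print(Counter(r["emotion"] for r in records))  # Counter({'outrage': 1})
```

Validating before storage catches the usual failure modes of LLM coders, such as invented labels or dropped fields, at the point of ingestion rather than at analysis time.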