Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Automation could be good for everyone in a better world. The irony is that when corporations seek out maximizing profit above everything else, efficiency requires a human cost. Humans need to suffer and die in order to make society more efficient. And when we get a society that is the most efficiently run, thats when we've maximized the amount of suffering. We've created a world for the lucky few to have stolen or earned or inherited enough wealth to survive in it. And human greed makes no sense. In a more efficienttly run society, why would you need to screw more people over? When people can no longer "contribute" to survive, wouldn't it be more beneficial to just make survival a right? When soceity effectively runs itself and survives without us, why shouldn't that be the moment when we're all emancipated from the necessity of work? It makes no sense. They claim they are thinking long term with automation, but if they truly were, they'd use that saved money to better the lives of those that sacrificed so much to keep society afloat. But they dont care because they're part of the class that has enough to survive when others can't. Automation is good. Its people that suck.
youtube AI Jobs 2026-01-11T04:3…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           regulate
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyNxHqkDxxfTQXTx_14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwsnKnFj7bboCcpfSx4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzVcbhfSeBDTVN2esV4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyFlxW2OGcBNhSyz_F4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgznkFY2d5VF0J7kkzt4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxZI1izIvn39d8f_yx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxcxaXtzgIaz0mde2x4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UgyJWH8pQv3X8qDUfON4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxgO9__Fj3M5owFvT94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "none"},
  {"id": "ytc_UgyBTVZQUleSQYN6D_t4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]
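A raw response like the one above can be matched back to its comment by id. The sketch below (a hypothetical helper, not part of the pipeline shown here) parses the model output as a JSON array and looks up the coding record for a single comment id; only the first two records are reproduced for brevity, with field names taken from the response itself:

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment id.
# (Truncated to two records here; structure mirrors the full output above.)
raw = '''[
  {"id": "ytc_UgyNxHqkDxxfTQXTx_14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwsnKnFj7bboCcpfSx4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

def lookup(raw_response: str, comment_id: str):
    """Parse the raw model output and return the record for one comment id."""
    records = json.loads(raw_response)
    # Return the first record whose id matches, or None if absent.
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup(raw, "ytc_UgyNxHqkDxxfTQXTx_14AaABAg")
print(record["responsibility"], record["emotion"])  # company outrage
```

Parsing with `json.loads` rather than string matching also surfaces malformed model output early, since any response that is not valid JSON raises an exception instead of silently producing a bad coding.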