Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Interesting theories, but a prediction that 300 million jobs could be lost to AI by 2030, mostly in menial, entry-level positions, manufacturing, retail, tech, middle management, etc. In the end, there will never be enough new positions available to employ that many people. The UBI concept is great, except that the government is financed by taxes. So, with 300 million jobs lost, AI isn't paying taxes and god knows the corporations aren't going to suddenly fund the government to provide a universal basic income. So, the entire Capitalist system caves in if people don't have money to buy goods and services. The companies run by AI have no need to manufacture anything when there's no demand because all of society is unemployed. AI is good for the bottom line for companies. Cutting out labor costs boosts the bottom line profit. But, that's sort of the snake eating its tail, isn't it? If the government doesn't put into law limits and regulate AI now, this could become an absolute implosion of civilization as we know it unless we implement an entirely new economic system that rids us of money and just allows us to get food, supplies and shelter for everyone without concern for costs. Get rid of money and let AI go to town. That's my two cents. Even though we don't have pennies anymore.
youtube AI Jobs 2025-11-20T21:1…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           liability
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugz0iTyLmScnt0Vn3QV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzWhCaElPbaKyHa2iB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz708H-CSdoo1jwm1x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwA0XisQbSIeIsgD5p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx8WC4aYDodik88KK14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwu2r1cJMFEu2ksTpp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwJO2C9c8LUIDBhnzp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxrFxQui2iuoOVr6qp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyHXaQ6lKX1pEjQuHJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyNqoXB6jThSjJImpx4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
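A minimal sketch of how the coded dimensions above can be recovered from a raw batch response like this one: parse the JSON array and index the records by comment id. This is an illustrative Python snippet, not the pipeline's actual code; the ids and field values are taken verbatim from the response above (truncated to two records for brevity), and the assumption that ids are unique within a batch is hypothetical.

```python
import json

# Raw model output, as shown above (truncated to two of the ten records).
raw = '''[
  {"id":"ytc_Ugz0iTyLmScnt0Vn3QV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx8WC4aYDodik88KK14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]'''

records = json.loads(raw)

# Index by comment id for direct lookup; assumes ids are unique per batch.
by_id = {r["id"]: r for r in records}

# The record whose dimensions match the coding result shown above.
coded = by_id["ytc_Ugx8WC4aYDodik88KK14AaABAg"]
print(coded["responsibility"], coded["policy"], coded["emotion"])
```

The same lookup pattern works for any comment in the batch, since every record carries its `id` alongside the four coded dimensions.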