Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
it's silly. If AI makes everyone unemployed this means EVERYBODY WILL BE EMPLOYED. Because it will require UBI. But if you're already paying everyone a living wage through UBI not using them for labor is stupid, as it's free, so it will always be cheaper than AI. So what's likely to happen is that governments will mandate that each company hire X people calculated by their revenues or profits and then have those companies figure out how to use those people. And those who will not want to participate will just have to pay extra tax that's the same amount as those salaries. Of course, we might never live to see this, as this will require AGI and what all those tech CEOs, including Tan, work really hard to hide, is that nobody has absolutely any idea how to build one and any predictions at when this might happen are as reliable as predictions where we might achieve FTL. It might never happen during our lifetimes, so we might just end up with AIs acting as booster tools in some areas, but never reaching a level high enough to actually cause high unemployment.
youtube AI Jobs 2025-10-18T12:3…
Coding Result
Dimension       Value
Responsibility  government
Reasoning       consequentialist
Policy          regulate
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugy0WDvGqVvWGJC9rB14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxpydx-PwK_zWF4Qmp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxBmf7xpMnXFJYZywN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzjod8UnlbKhY7nLhl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxTQXw3fmlNoxIDkNN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxue3GTMJF1gM7LK4l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwtn0jTTuYB9Iyc8354AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyu09IOKEdDBdPcVGB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzUZIuZv06KEXjLvqJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz1evMeDj4l9dejTBp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
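The per-comment "Coding Result" table is presumably derived from this raw JSON array: parse it, validate each record against the coding vocabularies, and look up the record by comment id. A minimal Python sketch of that step, with the allowed values inferred from what is visible in this output (not a documented schema):

```python
import json

# Allowed values per dimension, inferred from the raw response above.
# These vocabularies are an assumption, not the pipeline's actual codebook.
SCHEMA = {
    "responsibility": {"ai_itself", "none", "government", "company"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate", "industry_self"},
    "emotion": {"fear", "approval", "mixed"},
}

# Two records copied verbatim from the raw LLM response above.
raw = '''[
  {"id":"ytc_Ugzjod8UnlbKhY7nLhl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugyu09IOKEdDBdPcVGB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"}
]'''

def parse_codes(text):
    """Parse the raw response, keep only records whose values are
    in-vocabulary for every dimension, and index them by comment id."""
    records = json.loads(text)
    valid = [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]
    return {rec["id"]: rec for rec in valid}

codes = parse_codes(raw)
print(codes["ytc_Ugzjod8UnlbKhY7nLhl4AaABAg"]["policy"])  # regulate
```

Dropping out-of-vocabulary records (rather than raising) mirrors the usual tolerance for occasional malformed LLM output; a stricter pipeline might log or retry those instead.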