Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
For years now, ever since I saw the video "Humans Need Not Apply" by CGP Grey here on YouTube, I thought some form of UBI would be necessary as workers are replaced with all forms of AI, robotics, etc. Not a human helper bot, but if something replaces the human job...The details is where it gets cloudy. I think the AI work should be taxed at the same rates as workers and paid at least the federal minimum wage - which should be updated to whats realistic and tied to inflation yearly as well. Now, how to distribute it, I'm not sure. And yes, it should be tied to something as well, not just "free money" and nobody does anything all day. It could be education, job training, caretaker roles, whatever. Alternatively It perhaps could pay for universal health for all. Smart folks can figure something out (or AI! LOL). I think Scott bristles at the thought of seeing everyone sitting on their porch drinking/smoking all day doing nothing, and that's right, humans need to feel like they are doing something of value...One example (not AI related, but it popped in my head) Thailand has an education VISA offer whereby you can stay in the country for quite a while to learn something such as the Thai language, Thai cooking, etc.
Source: YouTube · Video: "Viral AI Reaction" · Posted: 2026-04-24T12:1… · Likes: 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  distributed
Reasoning       consequentialist
Policy          regulate
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_UgwUa_zcPWGeokhQn0F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_UgxOzKci4yqJI2tWzlx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"}, {"id":"ytc_UgwqR7O4rGeqRPlEtuV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_UgxfNqKn59bCXXaihBl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}, {"id":"ytc_Ugzsrn0PtATK87V3l594AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxTnsUbWHZMB1uPnkZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgzetxzSd7mzfRJxzjt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwDCvI5prm5bX6lZWl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgytmkhMVGRQytkyjLx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}, {"id":"ytc_UgynPdkO_t28TPPZ2954AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"} ]