Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The loss of income from automation will be so complicated to track that the only solution is something simple like a UBI or a huge, fully refundable tax credit. That Ezra will get a check despite not needing it can be solved by uncapping and massively increasing the payroll tax so that the net flow of money from successful, gainfully employed people is negative. The real problem is getting these companies to pay taxes not only here but everywhere else. For the sake of argument, let's suppose that we are able to tax Anthropic and OpenAI enough to pay every US citizen enough to live comfortably: these companies will put people everywhere out of a job. Dealing with this will require something like the universal minimum corporate tax that Trump got us out of. If this fails (say everyone incorporates in Ireland), it's conceivable that you might be able to tax their electricity use, but that would require neighboring countries to cooperate. Mexico or northern Africa would likely be very happy to house all the data centers that feed the US and EU (putting everything in a single location would subject distant places to a lag of ~100ms due to speed of light constraints that might not allow for real time, interactive applications). So we would need to work together to get rid of tax havens, pass a constitutional amendment to tax wealth, and make sure that every human on earth gets something out of this if you don't want massive unrest and migration. Good luck Mr Bores!
youtube AI Responsibility 2026-04-22T20:2…
Coding Result
Dimension      Value
Responsibility government
Reasoning      consequentialist
Policy         regulate
Emotion        mixed
Coded at       2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxnG4Yxn4vm17JPCdF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx2E83NWXD_vCUdXT54AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwbxpY4ofB1RmXBc594AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxdEUg7zHBsf_c2fK14AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz5UWCa2yH1583gYFh4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzuJ9_xZU2lcmKVWqR4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyzZxPpHlwKwApPUvx4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyGxXAP8Dp7gd6DAkF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgygCDY7KPSYlkbwyHZ4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxIkDYGggBKz5GwW9l4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
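The raw response above is a JSON array, one object per coded comment, keyed by comment id. A minimal sketch of how the coding for a single comment could be looked up from such a response (the `coding_for` helper is hypothetical, not part of the tool; the payload is truncated to two entries for illustration):

```python
import json

# Raw LLM response: a JSON array of per-comment codings. Only two of the
# ten entries from the real response are reproduced here.
raw_response = '''[
  {"id": "ytc_UgxnG4Yxn4vm17JPCdF4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx2E83NWXD_vCUdXT54AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]'''

def coding_for(comment_id, response_text):
    """Return the coded dimensions for one comment id, or None if absent."""
    for row in json.loads(response_text):
        if row["id"] == comment_id:
            # Drop the id key so only the dimension -> value pairs remain.
            return {k: v for k, v in row.items() if k != "id"}
    return None

print(coding_for("ytc_Ugx2E83NWXD_vCUdXT54AaABAg", raw_response))
# → {'responsibility': 'government', 'reasoning': 'consequentialist',
#    'policy': 'regulate', 'emotion': 'mixed'}
```

The second entry corresponds to the comment inspected on this page: its values match the Dimension/Value table above.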