Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Interesting take by Dr. Yampolskiy, but there’s a huge missing piece: government intervention. His prediction of mass “unemployability” assumes governments will just sit back and watch society collapse. History shows the opposite. The main job of any government is to maintain social order, and mass unemployment is the biggest threat to that. Every major technological disruption has been met with massive government action:

- Industrial Revolution: Harsh conditions forced governments to step in with labor laws, child labor bans, and public education.
- Great Depression: With 25% unemployment, FDR didn’t wait for the market; the New Deal created millions of jobs directly.
- Post–WWII: Fearing an economic crash from returning soldiers, the G.I. Bill preemptively educated a generation and built the middle class.

AGI won’t trigger a collapse; it’ll trigger a political transition. Expect governments to roll out things like:

- Universal Basic Income
- Taxes on AI-driven productivity (“robot taxes”)
- Regulations requiring human oversight in critical sectors

This isn’t meant as political commentary. This is just my take on how history suggests governments will respond, for better or worse. With that said, it’s worth remembering: sometimes the “cure” can end up worse than the problem. So the real question isn’t if people will have jobs, it’s how we’ll redefine human value when labor is no longer central. This is a political challenge as much as a technological one.
youtube · AI Governance · 2025-09-05T11:1…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-26T23:09:12.988011
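For reference, the four coding dimensions and the values that actually appear in the raw response below can be captured as a small schema. This is a minimal Python sketch based only on the values visible in this sample; the real codebook may define additional categories, and the class name is hypothetical (the field names match the JSON keys in the raw response).

from dataclasses import dataclass

# Values observed in the sample raw response below; the full codebook may allow more.
RESPONSIBILITY = {"none", "ai_itself", "company", "user", "government", "unclear"}
REASONING = {"consequentialist", "deontological", "virtue", "contractualist", "unclear"}
POLICY = {"regulate", "liability", "none", "unclear"}
EMOTION = {"approval", "fear", "outrage", "mixed", "indifference", "unclear"}

@dataclass
class CommentCoding:
    """One coded comment, as returned by the model (class name is an assumption)."""
    id: str              # e.g. a "ytc_..." YouTube comment id
    responsibility: str  # who the commenter holds responsible
    reasoning: str       # moral-reasoning style
    policy: str          # policy preference expressed
    emotion: str         # dominant emotion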
Raw LLM Response
[{"id":"ytc_Ugy8XC7Dz38GnKzBeQh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}, {"id":"ytc_UgzLZYVIc5osA1TIWfF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgyLb4aT663e9xXgBV14AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"}, {"id":"ytc_Ugwihcht-W3mTun2eVF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"}, {"id":"ytc_Ugwi3zrMSdCrXf7JXaR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgwdJgcSqY2bK_vq0Z54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgxI6lMaeJnczzxBePV4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"}, {"id":"ytc_Ugy1EzwK0Bbkg6PFRyN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_Ugx7hAZyH6NAWhVjgw14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgyxAVAWBc8ca4I5rrN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"})