Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
with all due respect Mr. Sanders, it's kind of an inevitability... its not like we are developing AI just because, but there are real-world solutions for customers and consumers. For example, it would streamline a lot of the tedious workflows that one would have to go manually do today. And if these AI solutions work well, then guess what, company stonks, and when company stonks, more $$$ for the executives. So like, yeah, obviously it sounds ridiculous that the top 1% will keep getting more, but this is the corporate structure of America. My question to you is, how would you address this? Because I see this as the real problem. Tangentially, when AI can automate jobs that would normally require a human, yea I mean it's inevitable that some of these jobs will be replaced. Like if the risks are lower with AI than with humans, the choice is clear. It's going to be really hard to reconcile that displacement. Down the line, we might end up needing some sort of UBI. So how exactly do you plan to tackle these problems? Like it's one thing to just say, "hey, our billionaires and corporate executives can't be greedy." But what is your proposal then? What will actually make them listen and fall in line?
Source: youtube · AI Jobs · 2025-10-08T04:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           industry_self
Emotion          approval
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_UgxsueiwZv7DEWei2814AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgxHjNwapvK8OsvoZpZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgzeAjjmhm_x4TQ0C3d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgymYwFLrXG4kC7R0jB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugw6txFj6UhZ4VudABB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyGxQI7HwEEi8ROVFx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgxdTGc4g8C7gd1u8h94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugy2Rabay8S98kPgxJ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}, {"id":"ytc_UgwxUUiVtN3MRNsQNI94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgwRycchwfrMiItInnh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"} ]