Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think a better metaphor than the one about the plane is that a gun is being pointed at us, and they're in the process of assembling the firing mechanism. Whether or not they can pull it off doesn't diminish the threat. There is no reality in which a tool like - in theory - AGI benefits the public more than it benefits those in power. More likely than not, it kills everyone, yes. But even if it doesn't, even if it could be controlled somehow, it is a tool for scaling towards infinity. It will widen the gap between whoever has it and whoever doesn't in ways we cannot even begin to imagine. Billionaires fear, more than anything else, losing power. Any smart billionaire must be keenly aware of the increase in class consciousness that's been happening lately. If you give any billionaire a genie, their first wish would undoubtedly be "Make it so the peasants can never overthrow me." Any benefits from AGI that could possibly trickle down to any of us would come at the cost of being so utterly and thoroughly controlled that we likely couldn't even comprehend the idea of freedom anymore. More likely, things would get substantially worse than they are now. People in power do 'the right thing' right now primarily for the purpose of appeasement. What do you think a world would look like where they no longer need to appease anyone? For the record, I don't think LLMs are capable of making it there. But I'd rather not let them finish building the firing mechanism and pull the trigger, if we can help it. You know, even if you're pretty sure a gun isn't loaded, and even if the safety is on, you still shouldn't be pointing it at people.
youtube AI Moral Status 2025-11-03T07:4…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          ban
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwKBnOek438mAagMAd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzxYQRVAegFgHXg7Xx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy1Uh_2A6Hmqz2zX3N4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgyWlZdsdRzUsOyBErZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyhMjYKq1Cxw9NDepx4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxRPbCIL-qFBAmgtih4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw6tbpjSp5ybqmD2ON4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgzOZfFGG5Nz-yNf8cx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxNbi0qF58Lo_Arj2B4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzuuXOlamvh4ku8XWV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"}
]
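A raw response like the one above can be sanity-checked before its codes are accepted. The sketch below parses the JSON array and rejects any record whose value falls outside the expected vocabulary for a dimension. The vocabularies here are only inferred from the values visible on this page; the project's actual codebook is not shown and may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# page's raw response (assumption: the real codebook may list more).
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "company", "user", "government"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown codes."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
    return records

# Minimal usage with a hypothetical record id:
raw = '[{"id":"ytc_example","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"}]'
records = validate_response(raw)
print(len(records))  # → 1
```

Validating at ingest time catches the common failure mode where the model invents a category the codebook never defined, so malformed codes surface immediately rather than skewing downstream counts.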