Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The 1956 movie Forbidden Planet is instructive... An advanced society, the Krell, creates a massive underground cube-like city, 20 miles on a side, powered by multiple nuclear reactors, that contains machines that translate their thoughts/wishes into material object, transmitted anywhere on the planet like by Star Trek transporter beams. In essence, we have that capability. It is known to us by names such as Amazon, Ebay, and Ali-Baba. And the data equivalent is the World Wide Web. Whatever we want, we can have, just by thinking about it (and hitting a few keys or speaking to an AI). Sort of Heaven On Earth 1.0, which we are busy upgrading towards 2.0, by creating a god to run it (ie AI), so we don't have to, and can just sit back lazily, de-skilling ourselves to the point of being dumb sheep that can't even remember what they created or how. But here's the rub: The Krell had monstrous thoughts that translated into physical monsters that wiped them out. Ah but not us, you say? Watch Hegseth coaching senior military on being Holy Warriors. Watch Trump threatening to end an ancient civilisation for all time. Watch Putin threaten that a future without a Holy Imperial Russia is unthinkable to him, and he would wage a nuclear war rather than suffer its disintegration. And these are the guys with red buttons. Like those buttons you can get from Amazon for ordering toilet paper or washing powder. Except that Trump's and Putin's red buttons they do a bit more than prompting an Amazon delivery. (Except for the Diet Coke button on Trump's desk, which I hope is not close to any other buttons.) If we can't get our own acts together, we have no chance. AI or no AI. But if there's AI, it will take one look at us, and conclude we are not worth saving.
Source: youtube · AI Governance · 2026-04-18T08:0…
Coding Result
| Dimension      | Value                      |
| -------------- | -------------------------- |
| Responsibility | unclear                    |
| Reasoning      | unclear                    |
| Policy         | unclear                    |
| Emotion        | indifference               |
| Coded at       | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
  {"id":"ytc_Ugx1g0F6Df0jfmOM4154AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyYT8L5OTd0I4Pz5f54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwQqKc3En10AnGAC7V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzEaPIAzo8g8G_DCA54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzRfrSImj4pMzPVV3t4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwyyHfrdU6qjEPrONF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzziM6LRpLhPArcBEJ4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz4eX5J7qCalu6Rl8J4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzDLsppSVNR-87U1VB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwEwqmuNdXviB6O98x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
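The Coding Result table above is simply the record from this raw response whose `id` matches the displayed comment. A minimal sketch of that lookup, assuming the response parses as a JSON array of per-comment records (only two of the ten records are reproduced here for brevity):

```python
import json

# Excerpt of the raw LLM response: a JSON array of coding records,
# one per comment, keyed by comment id.
raw = '''[
  {"id": "ytc_Ugx1g0F6Df0jfmOM4154AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyYT8L5OTd0I4Pz5f54AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]'''

# Index the records by comment id for O(1) lookup.
records = {r["id"]: r for r in json.loads(raw)}

# The comment shown above was coded under this id; its row in the
# Coding Result table comes straight from the matching record.
coding = records["ytc_UgyYT8L5OTd0I4Pz5f54AaABAg"]
print(coding["responsibility"], coding["emotion"])  # unclear indifference
```

The `raw` string and the lookup id are taken verbatim from the response above; the indexing helper itself is an illustrative assumption, not part of the coding pipeline.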