Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The danger coming sooner that could kill millions of people isn't a rogue AI pressing the trigger or the red-button. It's by replacing huge amount of our labour with almost free one. Yes in theory that would free people to do other things, even if only recreational things. But the big trouble is that our current economics won't be able to adapt, at least not fast enough in order to provide food (and other most necessary resources) to all the people as fast as needed. Yes that would happen in stages, first some professions would be hit, before others. But my point is these won't be minor disturbances, it would happen so fast and on such scale that it would leave millions of people hungry and poor on the street, and we just don't have a system to deal with such situation. We never have. Every time in history that food was scarce people died not only from starvation, but also because the more powerful were hoarding more resources, and some because were fighting violently for these resources. I don't think such scenario would exterminate us. Even in the worst case scenario I think a small % of people would survive and adapt. But after we adapt AI might be the dominating civilization and we could be more like its pets.
youtube AI Governance 2023-12-31T00:2… ♥ 32
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzDyfwk9HibtxrCzIx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzAyproHIrSdizcfER4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzHSN-hxaeG5t9RlZV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyVXzl_QNFa5V0fmNJ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw-vK8R4T_XfigaeTp4AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw1fSxXCyw9zwMBLZ94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwfwbP9YkYtoEX0FV14AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwhGshfD0JlhVBNaRp4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgweuiImBq4PQVwaXXN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzVcc6qv62fXgPjK814AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
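A raw response like the one above has to be parsed and validated before its codes can be trusted. The following is a minimal sketch of that step. The `parse_coded_records` helper and the `ALLOWED` value sets are assumptions for illustration, not the project's actual codebook: the allowed values are inferred only from what appears in this one response, and the real scheme may define more categories.

```python
import json

# Allowed values per coding dimension. NOTE: inferred from the values
# seen in this single raw response -- an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"distributed", "ai_itself", "none", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "indifference", "resignation", "approval"},
}

def parse_coded_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coded records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in this dataset carry the "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Every dimension must be present and hold an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgzDyfwk9HibtxrCzIx4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
print(len(parse_coded_records(raw)))  # → 1
```

Filtering rather than raising keeps one malformed record (a misspelled label, a hallucinated ID) from discarding the whole batch; the dropped records can be logged and re-coded separately.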