Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think what he was saying is that the only way to get the situation under control would be to have a world consensus and world wide rules about what AI can and cannot do. Otherwise ethical rules on how AI can be developed will not have a chance of working because there will always be countries that will be willing to create immoral destructive AI's and the countries that are trying to protect humans from AI's will have no choice but to retaliate. Result, AI arms race to the bottom. So cooperation on a worldwide scale is the only solution. But of course this will never happen because we are advanced animals without the for-sight and self control to put humanity above the individual. I don't think for a minute that he thinks that a world government could be created in the way that would protect humanity. The type of world government that you are alluding to is a more autocratic dictatorship type of system that paradoxically is what we are most likely heading towards with the help of AI, that is before AI becomes the ultimate world "leader" and we become extinct.
youtube AI Governance 2025-06-20T12:4… ♥ 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  distributed
Reasoning       contractualist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_Ugw0NE-Qae1iHWroXg94AaABAg.AJc5zaBlc2bAJc9OnoGd8p", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgzvGikxwteDwECMBV54AaABAg.AJb4mapOkgeAJbE70j0fiC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgwO1ARIrbWDUT3Cu2R4AaABAg.AJb1uhxadw5AJb8kVMngvA", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgxnI_GYza4F5bdq7Nd4AaABAg.AJasEH2pHNPAJcdJievvDZ", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgxL1HGcuhK1mRwfYnV4AaABAg.AJafzQvQ7jDAJajS0cu70-", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzlCPnXTFgoHyl1gzx4AaABAg.AJafyP1wZIoAJalqzhwlEx", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgxBFvPMqMPX7m2ELvZ4AaABAg.AJaJuMrB9L1AJaMOT9r7Iq", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgxZUIuQQgDMMq_SHrJ4AaABAg.AJaFhH3sZ8OAJaO-9D2KK_", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugw3uziqsdNr7_425sx4AaABAg.AJ_vuqW3vLhAJaAyFIXNQ-", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugw3uziqsdNr7_425sx4AaABAg.AJ_vuqW3vLhAJaCmBJ-Jlk", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"}
]
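A batch response like the one above can be matched back to an individual comment by its id. The following is a minimal sketch, assuming the raw response is a valid JSON array of records with the four coding dimensions shown (responsibility, reasoning, policy, emotion); variable names and the lookup helper are illustrative, not part of the tool itself.

```python
import json

# Hypothetical raw batch response, abbreviated to one record for illustration.
# The id and field values are taken verbatim from the response above.
raw_response = """[
  {"id": "ytr_UgxZUIuQQgDMMq_SHrJ4AaABAg.AJaFhH3sZ8OAJaO-9D2KK_",
   "responsibility": "distributed", "reasoning": "contractualist",
   "policy": "regulate", "emotion": "fear"}
]"""

# Parse the array and index the records by comment id for O(1) lookup.
records = json.loads(raw_response)
by_id = {record["id"]: record for record in records}

# Look up the coding for the comment shown on this page.
coding = by_id["ytr_UgxZUIuQQgDMMq_SHrJ4AaABAg.AJaFhH3sZ8OAJaO-9D2KK_"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # fear
```

Indexing by id rather than scanning the list each time matters when a batch contains many records and several dimensions are inspected per comment.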