Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Not even 12 minutes into the conversation and he is already advocating for a world government, as if that is going to mitigate the consequences of having opened the floodgates of evil by empowering technology to unprecedented levels. Of course he has a different purpose now: the fear-mongering campaign stage is now in effect. The whole world needs to know about the dangers of AI and give up complete control to an elite group of globalists who will be our heroes. They will create safe environments for humanity while dictating what we can and cannot do. Why did his students left AI development? Because, just as others before him, the evil they created opens the opportunity to sell solutions to the public and become trillionaires in the process. Nothing sells like fear; everyone wants to continue to be comfortable using technology. Who better to sell you the cure than the one who created the illness. And then people ask how did we get here: comfort and dependency; we have traded our freedoms for convenience, myself included. There’s no going back; but it is never too late to regroup and change course. Just as with technology, it takes time, but it is not impossible. God help us.
youtube AI Governance 2025-06-28T22:1…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugy5RBJigOeDREk893V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzC8iqSF3x8zzmulwZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwjg74QrdQseB-m75l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy4cTv4hBnCFT7HYpR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgziMpeyG0FjjOCPGYJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzbhm8fYQ12GHIF0Hp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwFSh2gylmtqWT5F6d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxot2fj3OjYwVN3Ynh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx2QbXaE4JKdjirml94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgzkNPphCLx2J0KSz1N4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
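The raw response is a JSON array with one record per comment, each coding the four dimensions shown in the table. A minimal sketch for parsing and validating such output is below; the allowed value sets are inferred from the values observed in this batch, not from a confirmed codebook, so treat them as assumptions.

```python
import json

# Allowed values per dimension, inferred from the batch above; the
# actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only records whose values
    fall inside the allowed sets for every dimension."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

sample = '[{"id":"ytc_x","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
print(parse_codes(sample))
```

Dropping out-of-schema records (rather than raising) makes it easy to count how often the model strays from the requested categories.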