Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Wow push the button save the world. And you refuse. Nobody within the AI industry takes the threat as seriously as the possible cost to their own prestige. This is half the problem. Hypotheticals aside what laws can be implemented to mitigate AI disaster. The most effective practical law would be the ban humanoid or any appearance of organ presentation. If we are not able to identify human form machine at a glance we are only magnifying the problem. Once a humanoid AI masters the projection of human emotion we will no longer be able to make logical decisions about the value of such a machine. To step back from the technical aspects. Is it not the case that AI is simply magnifying and accelerating the worst aspects of human power structures. We are headed for the same result with or without AI. After all AI is mimicking human behaviour at the corporate level. We didn’t need AI to develop biological or nuclear weapons. So AI might be a blessing in the presents exactly where we are going. “You are appalled” no you are lying your care is for a defunct economic system. To value AI in dollar profits is absurd and ignorant but people who are financially or socially successful in a corrupt system have no motivation to criticise that system. I think it’s disgusting how you backed out of your AI threat position when offered the hypothetical choice. You even went on to describe your former position as crazy. We are f*ed because CEOs are exactly like this fellow. STOP THE MADNESS. STOP THE PRODUCTION OF HUMANOID ROBOTS.
YouTube · AI Governance · 2026-01-01T12:1…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzoZt0lFI_A02LDDWN4AaABAg", "responsibility": "none",        "reasoning": "mixed",           "policy": "none",     "emotion": "resignation"},
  {"id": "ytc_UgzlU98_uZCpzsWpARl4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyKIG71CxzumDCX2hJ4AaABAg", "responsibility": "distributed", "reasoning": "mixed",           "policy": "unclear",  "emotion": "mixed"},
  {"id": "ytc_UgyJbmNteK-wFM_o4k94AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",  "emotion": "fear"},
  {"id": "ytc_Ugyh-25U7MZ0huPJAqp4AaABAg", "responsibility": "none",        "reasoning": "unclear",         "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgzK-OIPM6Y308JdNe94AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",   "policy": "ban",      "emotion": "fear"},
  {"id": "ytc_UgwpN4e4KDV_ODiwAnh4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",  "emotion": "fear"},
  {"id": "ytc_UgyFDhWlV8rcSUxlB-94AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",  "emotion": "fear"},
  {"id": "ytc_UgwRSXTNteDY2N3C-PB4AaABAg", "responsibility": "none",        "reasoning": "mixed",           "policy": "none",     "emotion": "resignation"},
  {"id": "ytc_Ugwca2z0ar8FCtMY8ox4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
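The raw response is a JSON array with one object per coded comment, keyed by comment id. A minimal sketch of how such a response can be parsed and matched back to a single comment id (the `coding_for` helper is illustrative, not part of the tool; the array below is a one-entry excerpt of the response above):

```python
import json

# Excerpt of the raw LLM response: the entry for the comment shown above.
raw_response = '''[
  {"id": "ytc_Ugwca2z0ar8FCtMY8ox4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "outrage"}
]'''

def coding_for(raw: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment id, or raise KeyError."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            # Drop the id so only the coded dimensions remain.
            return {k: v for k, v in entry.items() if k != "id"}
    raise KeyError(comment_id)

result = coding_for(raw_response, "ytc_Ugwca2z0ar8FCtMY8ox4AaABAg")
print(result["policy"])   # regulate
print(result["emotion"])  # outrage
```

The values printed here match the Coding Result table above, which is how a coded row can be traced back to the raw model output.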