Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Money, or power, or ego, or religion. I think many in the AI field are more concerned by power, if you really think about what AGI means, then money starts to lose value as anything but something that can help you get to AGI faster. Hypothetically, if you can control(dubious, I believe that's an intractable problem) AGI, then you can become absolute ruler, money becomes just a number for you, that you can tune as you wish, you will not care about it whatsoever. I think a lot of more scientific minded folk aren't as concerned about power, for scientists your schlong is measured by the importance of your work, and AGI would be the most importance invention of humanity, and the last it would or even could make. There are also the cultlike transhumanists, singularists, etc. who have various religious beliefs about AGI. You have the delusional optimists like Ray Kurzweil who believes AGI will just solve every problem and we will obtain permanent paradise for humanity, David Shapiro is an example of that here on YouTube, and then you have the more realistic but completely insane cultists who are deliberately trying to replace humanity. Larry Page is an example(calling Elon a "speciist" for arguing to specifically protect humanity), Nick Land is another example. There are many drivers, but yeah, these people are risking our lives for egoistic reasons.
Source: youtube · AI Harm Incident · 2025-07-26T19:3…
Coding Result
Responsibility: developer
Reasoning: virtue
Policy: unclear
Emotion: mixed
Coded at: 2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytr_Ugx4DpVOOc43UwyLm7N4AaABAg.AL14G8EAgn9AL3CMVOxCEP", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugz5wtENy4-hzvQjT0N4AaABAg.AL-_BNb-CTvAL-eeIKjiq4", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgwhIfnj00bZZ-6XYZx4AaABAg.AL-HnLd7QpaAL2oPVwJvF1", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_Ugz7k9UVLsRTIKuCMtl4AaABAg.AL-7Sl2gGI1AMvtOEWK8JO", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugzr1mlkani93OXomrd4AaABAg.AKzW6fGIvvcAL3xD7NC0g7", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugzr1mlkani93OXomrd4AaABAg.AKzW6fGIvvcAL43cR_FSLq", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgyT5hz1vbeuq0KaNnl4AaABAg.AKyvjoo5zkNAKyxhh8FIl1", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgxlmBGpdluey6NQ1qB4AaABAg.AKyMaf8FZlNAL-jQl9I1jD", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxlmBGpdluey6NQ1qB4AaABAg.AKyMaf8FZlNAL02PHtDvHo", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgxlmBGpdluey6NQ1qB4AaABAg.AKyMaf8FZlNAL49bj2kyOo", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"}
]
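As a minimal sketch of how this raw response maps back to a single comment's coding, the snippet below parses the JSON array and looks up a row by its `id`. The function name `coding_for` is hypothetical, not part of any tool shown here; the two embedded entries are copied verbatim from the raw response above.

```python
import json

# Two entries copied from the raw LLM response above, for a self-contained example.
raw = """
[
  {"id": "ytr_UgwhIfnj00bZZ-6XYZx4AaABAg.AL-HnLd7QpaAL2oPVwJvF1",
   "responsibility": "developer", "reasoning": "virtue",
   "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_Ugx4DpVOOc43UwyLm7N4AaABAg.AL14G8EAgn9AL3CMVOxCEP",
   "responsibility": "ai_itself", "reasoning": "unclear",
   "policy": "none", "emotion": "resignation"}
]
"""

# Index the rows by comment id for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

def coding_for(comment_id: str) -> dict:
    """Return the coded dimensions (responsibility, reasoning, policy,
    emotion) for one comment id. Hypothetical helper, for illustration."""
    return codings[comment_id]

row = coding_for("ytr_UgwhIfnj00bZZ-6XYZx4AaABAg.AL-HnLd7QpaAL2oPVwJvF1")
print(row["responsibility"], row["emotion"])  # developer mixed
```

This lookup reproduces the Coding Result shown above: the comment displayed on this page was coded responsibility=developer, reasoning=virtue, policy=unclear, emotion=mixed.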