Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Super AI is the fear of Yudkowsky and his think tank MIRI because -- phew. Well, part of it is because they're Singularists. They believe that super AI is either very likely or certain, and that it will happen in the near future. Yudkowsky and his Rationalists (a community that sprang up around his blog and forum LessWrong, which is all about how you gotta think a certain way to be super logical, and yeah, if this sounds like some sort of super Reddit debate bro stuff, it's because it kinda is) think that a super AI would be more capable of crunching numbers and coming up with the most ethical decisions through highly complex analysis that humans can't do. Therefore, since Yudkowsky and a lot of his Rationalists are also Effective Altruists and believe that they, through their super logical and rational ways of thinking about the world, are more capable of identifying the most ethical choices than the illogical masses, the illogical masses must be taught Rationalism. As Rationalists, they will then come to the absolutely logical conclusion that the way to maximize good is to develop the benevolent super AI as quickly as possible so it can make even better decisions than any human. So, for them, yeah, they have to talk about super AI. The other stuff is irrelevant to them, because they've convinced themselves that any time not spent on developing super AI and "aligning" it to human interests is just time wasted. If that sounds a little culty, well, yeah, it's a little ... much. Yudkowsky might not himself run a cult, but it's -- there's a reason there's been at least one death cult associated with Rationalism.
youtube AI Moral Status 2025-11-02T21:5…
Coding Result
Dimension      | Value
Responsibility | none
Reasoning      | unclear
Policy         | none
Emotion        | indifference
Coded at       | 2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_UgyvQRoflPZn7t69o_x4AaABAg.AP2QGordsKqAP4U16ElfC3","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgyprtK2H6ZnQccEmax4AaABAg.AP2LElv1ficAPAH6oAP5Lg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyprtK2H6ZnQccEmax4AaABAg.AP2LElv1ficAS2wKizF0w8","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwCMEtyTtZwynwkXrV4AaABAg.AP1lM-KdbLQAP4Oidn1uw9","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgwCMEtyTtZwynwkXrV4AaABAg.AP1lM-KdbLQAPCY0MmjHmx","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugzrp8HbL5oyccS7tDh4AaABAg.AP1jrhQEhXQAP1xLlAG7D-","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwRM1UtUh06iVVjG654AaABAg.AP1_ZRcpUnuAP1ykCa0r7U","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwqbApXzc0IwtAuhjV4AaABAg.AP1FQf2hs4-AP1zdo8IbnA","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytr_Ugw2SqS_h8aKB6KVPI94AaABAg.AP1A9_s-JYIAP1Doud4CvW","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytr_Ugw2SqS_h8aKB6KVPI94AaABAg.AP1A9_s-JYIAPFPAAXcqJX","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"}
]
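The raw response is a JSON array of per-comment records, each keyed by a comment `id`. A minimal sketch of how such a response could be parsed and one record looked up by id (the two records and their ids are copied verbatim from the array above; the parsing code itself is an assumption, not the pipeline's actual implementation):

```python
import json

# Truncated copy of the raw LLM response above (first two records only).
raw = '''[
  {"id":"ytr_UgyvQRoflPZn7t69o_x4AaABAg.AP2QGordsKqAP4U16ElfC3","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgyprtK2H6ZnQccEmax4AaABAg.AP2LElv1ficAPAH6oAP5Lg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]'''

# Index the coded records by comment id for O(1) lookup.
records = {r["id"]: r for r in json.loads(raw)}

# Pull the coding dimensions for one comment.
code = records["ytr_UgyprtK2H6ZnQccEmax4AaABAg.AP2LElv1ficAPAH6oAP5Lg"]
print(code["responsibility"], code["reasoning"], code["policy"], code["emotion"])
# prints: none unclear none indifference
```

Indexing by `id` keeps the lookup robust even when the model returns the records in a different order than the comments were sent.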