Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Well, part of that is that Yudkowsky and MIRI, the organization Yudkowsky founded, are part of a ... kind of a new religious movement. They're "Rationalists", which is a weird, somewhat culty community that sprung up around Yudkowsky's blog and forum LessWrong, and I think most if not all Rationalists are also Singularists, so that's just one of their beliefs: that super smart artificial intelligence is either likely or inevitable in the near future (the event where AI becomes super AI is called the Singularity). They think the danger with the Singularity is not that it happens, but that it happens accidentally and that it's not the godlike benevolent super AI that they want. Oh, and by the way, Rationalism and Effective Altruism often have a lot of overlap, so these people also think that in order to min-max the amount of goodness in the future, one needs to develop an AI smarter than humans to make better ethical calculations than humans can. Thus, to do the most good, one needs to spend their life bringing this super AI into existence as quickly as possible. Yudkowsky and a lot of other people in this space are big into cryonics, too, so they also think you should cryogenically freeze yourself after a long life of trying to hasten the arrival of a benevolent super AI who will in the future develop methods to revive you. So, yeah, it's not the *interview* that makes this book seem unreasonable. I think interviewing Yudkowsky and Soares without the context of their wild beliefs makes the book sound far more reasonable than it should.
youtube · AI Moral Status · 2025-11-02T21:3…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[ {"id":"ytr_UgyvQRoflPZn7t69o_x4AaABAg.AP2QGordsKqAP4U16ElfC3","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytr_UgyprtK2H6ZnQccEmax4AaABAg.AP2LElv1ficAPAH6oAP5Lg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_UgyprtK2H6ZnQccEmax4AaABAg.AP2LElv1ficAS2wKizF0w8","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_UgwCMEtyTtZwynwkXrV4AaABAg.AP1lM-KdbLQAP4Oidn1uw9","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}, {"id":"ytr_UgwCMEtyTtZwynwkXrV4AaABAg.AP1lM-KdbLQAPCY0MmjHmx","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}, {"id":"ytr_Ugzrp8HbL5oyccS7tDh4AaABAg.AP1jrhQEhXQAP1xLlAG7D-","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_UgwRM1UtUh06iVVjG654AaABAg.AP1_ZRcpUnuAP1ykCa0r7U","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_UgwqbApXzc0IwtAuhjV4AaABAg.AP1FQf2hs4-AP1zdo8IbnA","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"}, {"id":"ytr_Ugw2SqS_h8aKB6KVPI94AaABAg.AP1A9_s-JYIAP1Doud4CvW","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"}, {"id":"ytr_Ugw2SqS_h8aKB6KVPI94AaABAg.AP1A9_s-JYIAPFPAAXcqJX","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"} ]