Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@austinlong6936 MIRI is Yudkowsky’s think tank, so while I don’t know of Soares as well, I know the two are not, like, collaborators from different perspectives so I feel safe focusing on the more famous Yudkowsky. Yudkowsky is a major influencer in the sort of culty Silicon Valley tech bro sci fi weird movement. He’s the guy who developed Rationalism, where he says he can teach you how to think in this super logical fashion. Rationalism and Yudkowsky’s blog/forum is also where we get a lot of the Effective Altruist stuff, too, since you can min max good. According to Yudkowsky, who believes that a super intelligent AI is inevitable, a super AI would be most capable of analyzing good/evil, thus to maximize good means doing whatever you can to bring about this godlike super AI. For example, rather than giving money to charity, it may be better to spend that money printing out Yudkowsky’s Harry Potter fanfic in order to introduce more people to his idea of Rationalism (which they can fortunately read half a million words about by reading through a series of writings of his called The Sequence). That’s because then there will be more people who can discover how to think Rationally and who will dedicate their time to maximizing good by developing that super AI.
youtube AI Moral Status 2025-11-02T14:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_UgyGDWc0iNMHE29Y-qp4AaABAg.AOwyq6rRRRAAQNPX-K-r0z","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwkGd_OWtC-51LYHUl4AaABAg.AOwxCjNfKfPAOxVrEsOyAl","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzKrVVcaRxCW5jxgoB4AaABAg.AOwvnaY-7qdAOy-6FotgBA","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_Ugyw6fhE_puxgOT9Otd4AaABAg.AOwuITYQwS_AOwwsh6wtCz","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugyw6fhE_puxgOT9Otd4AaABAg.AOwuITYQwS_AOwyaDtuxuM","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugyw6fhE_puxgOT9Otd4AaABAg.AOwuITYQwS_AOx-U9KU2BJ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyW6pJd9Hs3u6CotBV4AaABAg.AOws_FUM07hAOwv2bQl-dV","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyW6pJd9Hs3u6CotBV4AaABAg.AOws_FUM07hAOxPu9ceH9d","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugw97TiE8gqJdjSAQql4AaABAg.AOwruDBcQOBAP1AUK6AqgY","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgxRGLCxCRK_44Hs5PV4AaABAg.AOwqrIgoeI_AOx9PjIZzky","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
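The raw response is a JSON array of per-comment records, one per coded comment, with an `id` and the four coding dimensions. As a minimal sketch of how such a response could be parsed and validated (the allowed values below are only those observed in the records above; the full coding scheme is an assumption and may include more categories):

```python
import json

# Allowed values per dimension, as observed in the raw response above.
# Assumption: the real coding scheme may define additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "mixed", "outrage", "fear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed code records."""
    valid = []
    for rec in json.loads(raw):
        # Every record should reference a YouTube comment id ("ytr_" prefix).
        if not rec.get("id", "").startswith("ytr_"):
            continue
        # Drop records whose values fall outside the known coding scheme.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical sample record for illustration.
raw = '[{"id":"ytr_abc","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}]'
print(len(parse_codes(raw)))  # → 1
```

Validating against a closed value set like this catches the common failure mode where the model invents a label outside the coding scheme, so malformed records can be flagged for manual re-coding instead of silently polluting the results.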