Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Yudkowsky and his think tank MIRI are most focused on “alignment”. Their concern is not whether or not a godlike super AI will exist. They’re Singularists, they believe that such a godlike AI is inevitable. They’re only afraid we do it by accident. That’s also why they believe that the most good you can do is work to bring about this super intelligence. Your life should be dedicated to developing this super AI as fast as possible since that’s how you maximize good. Did I mention these are also the Effective Altruist guys? Oh, and one of the best ways to bring about that AI faster? Introduce people to Yudkowsky’s “Rationalism”. That’s why it’s actually a good deed to share his Harry Potter fan fic where he teaches you how to think “Rationally”, so more people come across his writings and can dedicate their lives to developing super AI. Also, you spend your life bringing about super AI and then cryogenically freeze yourself until the day that super AI can wake you up to a restored body. Yeah, for some reason, cryonics are just kinda in the mix.
youtube AI Moral Status 2025-11-02T14:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgxJ0_RUc38ucjPngSV4AaABAg.AP0kx0X657sAP6QwpSLZmk", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugyor9ZHL-il_uW3zVx4AaABAg.AP0aNywTteDAP1CnHoswl-", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxlHxujpzxMQS9lMaB4AaABAg.AP0YxYTG7-WAP42i8WAl6z", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_UgxlHxujpzxMQS9lMaB4AaABAg.AP0YxYTG7-WAP4QpJ0Grur", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_Ugx9u1McFkE2RE54CnN4AaABAg.AP0C7PwawMTAP0E3bGv6HQ", "responsibility": "company", "reasoning": "mixed", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytr_Ugx9u1McFkE2RE54CnN4AaABAg.AP0C7PwawMTAP0IU0kyri-", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwJm-OArfDd2ic_rUh4AaABAg.AP-jJb2CZ74AP-kmmih2Hl", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgzEv56ZF-KCWtVU1X14AaABAg.AP-RwZ8JkinAP-kuPnZJ8t", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_Ugwei_7KP3azDFb_-Pp4AaABAg.AP-PZjkRGP-AP1jOhmtIah", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_Ugwei_7KP3azDFb_-Pp4AaABAg.AP-PZjkRGP-APSL62VgmoY", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
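A minimal sketch of how a raw response like the one above could be parsed and sanity-checked before use. The allowed label sets below are inferred only from the values visible in this payload, not from any official codebook, and the function name `validate_codes` is a hypothetical helper.

```python
import json

# Label sets inferred from the values observed in the raw response above;
# an actual codebook may permit more values than these.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "company", "unclear"},
    "reasoning": {"unclear", "deontological", "consequentialist", "mixed"},
    "policy": {"none", "unclear", "regulate", "liability", "industry_self", "ban"},
    "emotion": {"indifference", "outrage", "resignation", "approval", "fear"},
}

def validate_codes(raw_json: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose coded
    dimensions all fall inside the allowed label sets."""
    records = json.loads(raw_json)
    valid = []
    for rec in records:
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Records with an unexpected label (e.g. a misspelled emotion) are silently dropped here; a real pipeline might instead log them for manual review.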