Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Lol, now imagine you are forced to ride Robotaxis because they are the only ones… (ytc_UgzUe291M…)
But their is a huge issue with hacking AI it's like a hacker's wet dream.… (ytc_Ugz2BsGCq…)
It's also crazy how heavily subsidised US companies like OpenAI and Co are. But … (rdc_m9gqdo8)
What the heck are you doing in Hong Kong helping out the Chinese with AI intelli… (ytc_UgyWPbpIs…)
i genuinely believe an Ai big enough with lots of data and space for its own act… (ytc_Ugy2c_e93…)
5:35 Exactly! I don’t understand why simply none of our normal Rules don’t apply… (ytc_UgxGeKaxx…)
Honestly, steal what? When an ai training agent sees your image, does it vanish … (ytr_Ugz6QScWr…)
>anti-authoritarianism: **Anti**\-**authoritarianism** is opposition to **au… (rdc_f1uhs7f)
Comment
Yudkowsky and his think tank MIRI are most focused on “alignment”. Their concern is not whether or not a godlike super AI will exist. They’re Singularists, they believe that such a godlike AI is inevitable. They’re only afraid we do it by accident.
That’s also why they believe that the most good you can do is work to bring about this super intelligence. Your life should be dedicated to developing this super AI as fast as possible since that’s how you maximize good. Did I mention these are also the Effective Altruist guys? Oh, and one of the best ways to bring about that AI faster? Introduce people to Yudkowsky’s “Rationalism”. That’s why it’s actually a good deed to share his Harry Potter fan fic where he teaches you how to think “Rationally”, so more people come across his writings and can dedicate their lives to developing super AI.
Also, you spend your life bringing about super AI and then cryogenically freeze yourself until the day that super AI can wake you up to a restored body. Yeah, for some reason, cryonics are just kinda in the mix.
youtube
AI Moral Status
2025-11-02T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
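Each coding result is a record over four fixed dimensions. The check below is a minimal sketch in Python; the allowed value sets are inferred only from the samples visible on this page (they are assumptions, not the full codebook), and `validate` is a hypothetical helper name.

```python
# Hypothetical sketch: validate one coded record against the value sets
# observed on this page. The real codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "company"},
    "reasoning": {"unclear", "deontological", "consequentialist", "mixed"},
    "policy": {"none", "unclear", "regulate", "liability", "industry_self", "ban"},
    "emotion": {"indifference", "outrage", "resignation", "approval", "fear"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks well-coded."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

# The coding result shown above passes; an unknown value is flagged.
print(validate({"responsibility": "none", "reasoning": "unclear",
                "policy": "none", "emotion": "indifference"}))  # → []
```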
Raw LLM Response
[
{"id":"ytr_UgxJ0_RUc38ucjPngSV4AaABAg.AP0kx0X657sAP6QwpSLZmk","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugyor9ZHL-il_uW3zVx4AaABAg.AP0aNywTteDAP1CnHoswl-","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxlHxujpzxMQS9lMaB4AaABAg.AP0YxYTG7-WAP42i8WAl6z","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgxlHxujpzxMQS9lMaB4AaABAg.AP0YxYTG7-WAP4QpJ0Grur","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugx9u1McFkE2RE54CnN4AaABAg.AP0C7PwawMTAP0E3bGv6HQ","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
{"id":"ytr_Ugx9u1McFkE2RE54CnN4AaABAg.AP0C7PwawMTAP0IU0kyri-","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwJm-OArfDd2ic_rUh4AaABAg.AP-jJb2CZ74AP-kmmih2Hl","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgzEv56ZF-KCWtVU1X14AaABAg.AP-RwZ8JkinAP-kuPnZJ8t","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugwei_7KP3azDFb_-Pp4AaABAg.AP-PZjkRGP-AP1jOhmtIah","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugwei_7KP3azDFb_-Pp4AaABAg.AP-PZjkRGP-APSL62VgmoY","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
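The raw response above is a JSON array of coded records keyed by comment ID. A minimal sketch of parsing it and doing the "look up by comment ID" step, using shortened hypothetical IDs (`ytr_AAA`, `ytr_BBB`) in place of the real ones:

```python
import json

# Hypothetical sketch: parse a raw LLM response shaped like the array above
# and index the coded records by comment ID for direct lookup.
raw = '''[
 {"id":"ytr_AAA","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytr_BBB","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

# Build an ID -> record mapping so any comment's codes are one lookup away.
records = {r["id"]: r for r in json.loads(raw)}

print(records["ytr_BBB"]["policy"])  # → regulate
```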