Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “The AI itself isn’t making art either. It’s stealing bits of real art, and then …” (ytr_UgwwacVoi…)
- “Don't you think that to building "nice" AI to protect from "bad" AI is... a bad …” (ytc_UgxIOZLOx…)
- “The whole system works of consumerism and greed AI will be a tool to continue to…” (ytc_UgyQjwrwO…)
- “you need to thank it in the beginning. studies show that the AI is way more like…” (ytc_UgyKLErCI…)
- “I do not believe AI's would compete, as they are not stubborn, refuse to listen …” (ytc_UgzAaZVoE…)
- “Commenting to boost the algorithm, more folks should see this and at least ponde…” (ytc_UgwKA6TNn…)
- “Its illegal to "pre-crime" anyone! Ypu cannot use tge law according to rights an…” (ytc_Ugyxt6Lnp…)
- “ppl can bearly take care of their child and now we are creating a new life form …” (ytc_UgzUIdO6_…)
Comment
@izzya8132 I mean, it's a half million words that are supposed to teach you how to think by someone who really thinks that he alone can teach you how to really think.
Or is HPMoR a million and Yudkowsky's "The Sequences: A to Z" a half million? I can't remember. One's definitely a million words, the other's like 660 thousand, I always just forget which one is longer.
Both are supposed to introduce you to Yudkowsky's ideas of Rationalism in order to bring more people into the community and convince them that the greatest good they can do with their time is to do whatever they can to bring about a benevolent super AI. That includes sharing HPMoR, so more people can be taught how to think right by Yudkowsky and thus devote their lives to the development of super AI.
Yudkowsky's Rationalism might not itself be a cult, but it's got some of the same problems as cults do and is really easy to just tweak it a little and get full blown cults, like Ziz did.
Source: youtube
Video: AI Moral Status
Published: 2025-11-02T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgyvQRoflPZn7t69o_x4AaABAg.AP2QGordsKqAP4U16ElfC3","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgyprtK2H6ZnQccEmax4AaABAg.AP2LElv1ficAPAH6oAP5Lg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyprtK2H6ZnQccEmax4AaABAg.AP2LElv1ficAS2wKizF0w8","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwCMEtyTtZwynwkXrV4AaABAg.AP1lM-KdbLQAP4Oidn1uw9","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgwCMEtyTtZwynwkXrV4AaABAg.AP1lM-KdbLQAPCY0MmjHmx","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugzrp8HbL5oyccS7tDh4AaABAg.AP1jrhQEhXQAP1xLlAG7D-","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwRM1UtUh06iVVjG654AaABAg.AP1_ZRcpUnuAP1ykCa0r7U","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwqbApXzc0IwtAuhjV4AaABAg.AP1FQf2hs4-AP1zdo8IbnA","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"},
{"id":"ytr_Ugw2SqS_h8aKB6KVPI94AaABAg.AP1A9_s-JYIAP1Doud4CvW","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"},
{"id":"ytr_Ugw2SqS_h8aKB6KVPI94AaABAg.AP1A9_s-JYIAPFPAAXcqJX","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"}
]
```
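Each record in the raw response carries the same four coded dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a payload could be parsed and tallied per dimension, assuming standard JSON and using a shortened two-record example rather than the real comment IDs:

```python
import json
from collections import Counter

# Shortened, hypothetical payload with the same shape as the raw LLM response.
raw = """[
  {"id": "a", "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "b", "responsibility": "developer", "reasoning": "virtue",
   "policy": "none", "emotion": "disapproval"}
]"""

records = json.loads(raw)

# Count how often each value appears, one Counter per coded dimension.
tallies = {
    dim: Counter(rec[dim] for rec in records)
    for dim in ("responsibility", "reasoning", "policy", "emotion")
}

print(tallies["responsibility"])  # Counter({'company': 1, 'developer': 1})
```

This is only an illustration of the record shape; the actual pipeline that produced the table above may validate IDs and handle malformed model output differently.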