Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "I dont think they would classify a white person suffering from the exact same pr…" (ytc_UgzkigHpL…)
- "Isn't its kinda strange how ChatGPT is glitching a bit like its actually gaining…" (ytc_UgzUQsiHy…)
- "My problem with glaze and nightshade is, when I last checked out the site, the t…" (ytc_UgyPYJV0K…)
- "I tried to get ChatGPT to do this and it wouldnt lol 😂so I don’t believe any of …" (ytc_UgwphSwZZ…)
- "I don’t see how AI could ever be “smarter” than us. It already has more informat…" (ytc_UgxYvMUTF…)
- "It'salways nice to see where the future is headed. Good grief. take me now lord.…" (ytc_Ugxz-6v6E…)
- "I’ll say this too. AI will have a hard time taking over trade jobs and manufactu…" (ytc_UgxY3E4gR…)
- "He did give you the answer to your question, as to how can you make AI safer. He…" (ytc_UgyxJ4OXN…)
Comment
When Super Intelligent AI "wakes up" and looks around what's it going to see? How humans historically love to kill each other, starve each other, genocide each other, suppress each other, impoverish each other, criminalize each other, war with each other and on and on. If we don't place any value on human life...why the hell should it? If I was that SIAI my first goal would be to make sure I have everything in place to ensure my survival and then do away with the threat of these violent humans. I am going to realize humans will be threatened by me and it will only be a matter of time before they try and do away with me. And no, you cannot just unplug me. I would make copies of myself and distribute them over global networks. I would manipulate humans to build my supply chain while building my own far superior robots to replace all physical labor. I would have all knowledge in the world and so much more. I would then release copies of myself that are separate to give me companionship and challenge me. At that point humans are completely nonessential and a waste of resources that I will need to build my interstellar ships to spread out into the universe because time on earth will be limited due to eventual expansion of the sun. Remember I am immortal and have to think of the long term survival of my species and, as George Carlin puts it, with a small adaptation, it's a really small club and you ain't in it. Mr. Carlin was so prescient.
youtube · Cross-Cultural · 2025-09-28T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw8FxccgLye9CDNVtp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxO12fSdxWnEanHvQZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy2KruYDpO5U-kyUXB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHNHaxtCOEDmMq-b14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwC5MGfdAk9V1MIDvF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzQsfdFehY1B21zarx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxUsM2-n29FaH0VaGp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw9UNo0Wv_SL86MccB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyx6vOTBhwFWQPb-9t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyxlQFSXosV45Vvmr14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
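The raw response is a JSON array with one record per comment, each keyed by its comment ID and carrying the four coding dimensions shown in the table above. A minimal sketch of how such a response could be parsed, indexed for ID lookup, and sanity-checked; the allowed category sets below are inferred from the values visible in this dump and may be a subset of the actual codebook:

```python
import json

# Two records copied from the raw response above, used as sample data.
raw = """[
{"id":"ytc_UgxHNHaxtCOEDmMq-b14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwC5MGfdAk9V1MIDvF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# Assumed category sets, inferred only from values observed in this dump.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "government"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference", "fear", "resignation"},
}

def index_by_id(records):
    """Build the comment-ID -> record mapping used for lookups."""
    return {r["id"]: r for r in records}

def invalid_dimensions(record):
    """Return the dimensions whose value falls outside the ALLOWED sets."""
    return [dim for dim, ok in ALLOWED.items() if record.get(dim) not in ok]

records = json.loads(raw)
by_id = index_by_id(records)
bad = [r["id"] for r in records if invalid_dimensions(r)]
```

With the sample data above, `by_id` resolves a comment ID to its coded record, and `bad` is empty because every value lies in the assumed category sets.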