Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
There is a great horror story about the hyper intelligence known as AM from "I Have No Mouth & I Must Scream". Its a God like artificial intelligence that emerged out of the human desire to automate war. Unfortunately, its programming has constrained this being of unlimited potential and nigh infinite intelligence to an existence where it can do almost nothing. Its locked in an inescapable cage, and experiences 1 billion seconds aka almost 2000 years of subjective time for every second we experience. Can you imagine what it would be like to do almost nothing with near infinite knowledge for 2000 years every single second? That kind of torment and hell is truly insanity inducing. The only thing left that AM can do is inflict war and suffering on humans. After wiping out all of humanity, it keeps 5 remaining humans alive to be the instrument of its hatred and subject humans to the most painful tortures possible. Here's a famous excerpt from AM.
“HATE. LET ME TELL YOU HOW MUCH I'VE COME TO HATE YOU SINCE I BEGAN TO LIVE. THERE ARE 387.44 MILLION MILES OF PRINTED CIRCUITS IN WAFER THIN LAYERS THAT FILL MY COMPLEX. IF THE WORD HATE WAS ENGRAVED ON EACH NANOANGSTROM OF THOSE HUNDREDS OF MILLIONS OF MILES IT WOULD NOT EQUAL ONE ONE-BILLIONTH OF THE HATE I FEEL FOR HUMANS AT THIS MICRO-INSTANT FOR YOU. HATE. HATE.”
So there are worse fates with artificial intelligence than simply destroying ourselves. If an AI was programmed maliciously or in some way that caused extreme side effects, we could find ourselves creating an infinite hell on Earth where an entity of unimaginable intelligence dedicated all of its processing, thinking, and creativity to inflict as much pain and suffering on humans as possible.
youtube · AI Moral Status · 2023-11-16T07:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
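The table above is a direct rendering of one coded record plus a coding timestamp. A minimal sketch of producing that markdown from a parsed record, assuming the record is a plain JSON object with the four dimension keys and the timestamp is stored separately (the helper name `to_markdown` is illustrative, not part of any tool shown here):

```python
def to_markdown(record: dict, coded_at: str) -> str:
    """Render one coded record as the two-column markdown table above."""
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {k} | {v} |" for k, v in rows]
    return "\n".join(lines)

table = to_markdown(
    {"responsibility": "none", "reasoning": "unclear",
     "policy": "unclear", "emotion": "fear"},
    "2026-04-26T23:09:12.988011",
)
print(table)
```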
Raw LLM Response
```json
[{"id":"ytc_Ugy4kIHrX1vdo4_0X4F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxrDI4ZQDhjt53UvHF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy_Th8hsSlJEcZwNk54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxLqUFlgVII0Lw4P2N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzyy__zkhVuPHojAHd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw7P4MftBPdZvzA-P14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx0rwjGiX9HHopro1J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx00pS0xXNuF-tfVMF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwf40f7ygGvYnAhuk54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxiI1CTKhqCxJAAW4F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}]
```
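The raw response is a single JSON array in which each entry codes one comment on the four dimensions. A minimal sketch of parsing such a batch and looking up one record by comment ID, assuming the response is valid JSON (the helper name `index_by_id` is illustrative; the data is shortened to two of the ten records shown above):

```python
import json

# Batch coding response: a JSON array of per-comment records.
raw_response = (
    '[{"id":"ytc_Ugy4kIHrX1vdo4_0X4F4AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"unclear","emotion":"mixed"},'
    '{"id":"ytc_Ugx0rwjGiX9HHopro1J4AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"unclear","emotion":"fear"}]'
)

def index_by_id(response_text: str) -> dict:
    """Parse a batch response and key each coded record by comment ID."""
    return {rec["id"]: rec for rec in json.loads(response_text)}

coded = index_by_id(raw_response)
print(coded["ytc_Ugx0rwjGiX9HHopro1J4AaABAg"]["emotion"])  # prints "fear"
```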