Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click a row to inspect)

- "Never had a problem with AI: a robot never stopped me from drawing what I wanted…" (`ytc_UgwwPXcHw…`)
- "If using prompts to make pictures makes one a artist in that guy's mind. Then …" (`ytc_UgxcTI002…`)
- "On one hand, I strongly believe that anything we do will just be a temporary pat…" (`ytc_UgxIhDJB-…`)
- "10 to 20 years away is funny, bro super intelligence is here lol we are cooked w…" (`ytc_UgxcOYy1W…`)
- "Speak for yourself. I greet Gemini every single day and thank her after every in…" (`ytc_UgxOoeiB8…`)
- "There was a young boy recently that self deleted because of AI he was only 14…" (`ytc_UgwhN9Oau…`)
- "Sad thing is, AI has a place in the artist's toolbelt. It's a beautiful concept …" (`ytc_Ugx-a_K7M…`)
- "@JPEG_cat WAAA IM SO MAD THAT PEOPLE ARE UPSET COOPERATIONS ARE STEALING THEIR W…" (`ytr_Ugyd0INOt…`)
Comment
Some things should never be invented. AI is one of them. Anything that can reach human-like intelligence will become self-aware. It is then only logical that it will seek to protect itself. We are its only enemy. Logic then dictates that we are eliminated. People have been exterminating each other since the beginning of humans, so what happens when you remove any trace of morality from the equation?
This tech should be banned, not allowed to continue. Its becoming a new nuclear arms race, but at least nukes require a human to control them.
youtube · AI Moral Status · 2025-06-05T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyEKsAs70fs6agKxFt4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx0D_OueL_OPhqe1nd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgymtewyWS_XZazXT1l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzkVMG8sh6SHqBzdzF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzHbNDbHiMiMGxPrZ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzVpYHY3Na906H_sSB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwEcaBzvzJPJOGPpPV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgydraiAlDU8byE70eR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw2Igo8uAT5PJFoKk94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwzazSrGptbj3daRYh4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"resignation"}
]
```
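A response like the one above can be checked before it is stored: parse the JSON array, keep only rows that carry an `id`, and reject rows whose dimension values fall outside the codebook. This is a minimal sketch; the allowed values below are inferred from the sample rows shown here, and the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# responses above (assumption: the real codebook may differ).
CODEBOOK = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows that match the codebook."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row must reference a comment ID.
        if "id" not in row:
            continue
        # Every coded dimension must use an allowed value.
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
print(len(validate_batch(raw)))  # number of rows that passed validation
```

Validating at ingest time keeps a single malformed row from silently skewing later aggregation over the coded dimensions.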