Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
at least AI agree 100% with veganism and 110% with plant based.
since it can be …
ytr_UgwMGIntd…
He's telling the TRUTH about JOB replacement! AI will REPLACE lawyers, doctors, …
ytc_Ugwc4rZ15…
This is just designed to weed out the idiots drivers and incompetent drivers
Wh…
ytc_UgzTIznhi…
AI is like religion, a piece of shit stuck on a stick to manipulate idiots...…
ytc_UgyyY7jkv…
😢😢😢😢AI does not need humans once it becomes self aware. It can manufacture ever…
ytc_UgxodhSWm…
Human traits have always been undesirable, though. That's why we invented the co…
ytc_Ugzb9ScCw…
No, those are older tests, the newer one explicitly stated that the AI should pu…
ytr_UgxQDg74d…
Is the person who commissioned the Sistine chapel an artist? They gave Michelang…
ytc_UgzclSieH…
Comment
The "hallucination" is a feature of randomization it's suppose to do. It's doing what it was program to do just like all other AI for video games or programs. It's not thinking. It's randomization is what makes the AI seem like it's "thinking" by appearing "creative".
It's not smart, it's not actually thinking, it's carrying out a simple search and retrieval and portion of it is required to poop out a portion of randomization.
Lots of video games are programed like this too, and we see some stupid stuff AI does.
But it's not thinking.
None of this is doing anything that resembles biological thinking or any thinking. Its running a program and doing what it was told.
youtube
AI Moral Status
2026-02-01T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwGKcm2ybSTRfn718d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz5FHO6CmReYCS29QF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyETgHr4DWhzwEaahl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzMqd62o-XcHw9j-1l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxuwgtghtG9vOV2tgF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzDB6WluvGPywRSqsV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyqUokl3OZiIq_FY8x4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzCMN4wsZtAEZy1Jep4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxaoMYJegby_khXa694AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwMZ7UKWfAWyHSogbF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
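A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal, hypothetical validator: the per-dimension value sets are inferred only from the rows shown here (the real codebook may define more values), and the sample ID is a made-up placeholder.

```python
import json

# Allowed codes per dimension, inferred from the responses shown above.
# Hypothetical: the actual codebook may allow additional values.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "government", "company"},
    "reasoning": {"mixed", "deontological", "unclear", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "outrage", "indifference", "mixed", "fear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array) and reject rows with
    missing or unknown codes on any dimension."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim!r}: {row.get(dim)!r}")
    return rows

# Placeholder comment ID for illustration only.
raw = ('[{"id":"ytc_PLACEHOLDER","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
rows = validate_codings(raw)
print(len(rows))  # → 1
```

Failing fast on unknown codes catches the common failure mode where the model invents a label outside the codebook; such rows can then be re-queued for recoding rather than silently polluting the dataset.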