Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- ytc_UgzhkmFl7… — "My work has been in the Museum of Fine Arts, Boston twice and I absolutely LOVE …"
- ytc_Ugzoc6ovl… — "Keep using those self check outs as you watch countries use A.I to make decision…"
- ytc_UgzM2FPyC… — "I think there's effectively zero chance of making AI in such a way that we can u…"
- ytc_Ugw9hfoAU… — "This was in a \" Twilight Zone\" that covered this in 1958 with robby the robot.…"
- ytc_Ugw9GevJu… — "What we have now is not general Al, general AI is frightening. What we have now …"
- ytc_UgzyHCoFd… — "Why lie about a computer when you have legitimate concerns about how it works an…"
- ytr_Ugz2sA1XG… — "80%, huh? Where did you get this from? Just recently, an AI model competed with …"
- ytc_Ugw2KCoJ_… — "im worried that people think less cause „lEtS aSK cHaTgPT" (at least in school t…"
Comment

> We haven't needed AI for 1 million years...probably dont now.
> Continuing to build AI when you know its Shahgoth is idiocracy in the highest level

youtube · AI Moral Status · 2026-01-07T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxkKsG5aCN0-oPhz_d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzMfHmcrmIwqNpptS94AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyYEtuRhi-oM0PYo3h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzD8Lou7sbx903x4Cx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugym30ISzgnIokCH-gd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzdPWEBxkVfS9owo7V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyteuTWe_fMNh6qaCt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyFcXBpXMOAi6-VMJN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx7o8H1C4EBDuxUqPB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyp3AJM3i9SXUNolnB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
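A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed-value sets are assumptions inferred from this single sample (plus the Coding Result table), not a documented schema, and the function name is illustrative.

```python
import json

# Assumed category vocabularies, inferred from the sample response above.
# A real pipeline would take these from the actual codebook.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse the raw LLM output as a JSON array and reject any record
    that is missing a dimension or uses an out-of-vocabulary value."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={value!r}")
    return records

# One record from the sample response, used as a smoke test.
raw = ('[{"id":"ytc_UgxkKsG5aCN0-oPhz_d4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]')
coded = validate_response(raw)
print(coded[0]["emotion"])  # outrage
```

Validating at ingest time catches the common failure mode where the model invents a label outside the codebook, rather than letting it silently enter the coded dataset.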