Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by choosing one of the random samples below.
Random samples
- ytc_UgxFGsnvU… · That’s racist hmmm, so now AI face recognition is saying all peoples of color lo…
- ytc_UgxgluA8G… · Ironically, I asked AI … this is what was produced: Say we get really capable …
- ytc_Ugwa2mr9l… · I would use the AI's argument against it: "If you say culture has evolved to see…
- ytr_UgwwGwB6d… · Haha, it’s definitely a wild thought! As AI continues to evolve, the interaction…
- ytc_UgymI7SI3… · Not sure about this. I know you mean well, but 6 guests was a bit too much. I w…
- ytr_Ugz0IUCM7… · how is it weird? youre literally watching an ai video, the machine knows that yo…
- ytc_UgyG0O_9f… · This dude doesn't know much about biology. For a virus, the concepts of infectio…
- ytc_UgyuS417G… · meta ai kept trying to gaslight me into thinking it wasnt generating pictures of…
Comment
I've flipped. I didn't used to think that artificial intelligence would ever happen. But they didn't figure out how intelligence works, and then build something that did that. No. They devised problems that would require intelligence to solve, and then developed a means of evolution, with the sole goal being to solve the problem. And then they artificially gave it a million years' worth of generations to develop that intelligence.
The really scary thing to me, is that every time I hear of some ridiculous result that an LLM came up with, I could see how humans could have reached that result. Especially humans without the benefit of feeling the results of bad decisions. Sure, they give AI goals, but they haven't found a way to punish them for especially bad results.
You know, children will draw hands with the wrong number of fingers.
Source: youtube · AI Moral Status · 2025-11-01T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
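
Each of the four coded dimensions in this table is one key of a per-comment record in the raw model response shown below; "Coded at" is pipeline metadata and does not appear in the record itself. As a minimal sketch, rendering a record into this Dimension/Value layout could look like the following (the field names match the JSON below; the `render_coding_table` helper is hypothetical, not part of any real tool):

```python
# Minimal sketch: format one per-comment coding record as the
# Dimension/Value table shown above. The record shape is taken from
# the raw response below; the helper itself is illustrative.
def render_coding_table(record: dict) -> str:
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {val} |" for dim, val in rows]
    return "\n".join(lines)

# Reproduces the coded dimensions of the table above.
print(render_coding_table({
    "responsibility": "developer",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "approval",
}))
```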
Raw LLM Response
[
{"id":"ytc_UgzV42tk9RzMUCIlPSx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy0-8IOORn442PHOTR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyZHWYCwaaxG5KJRBV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyN1MxzeDyN_bc8yid4AaABAg","responsibility":"government","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwmO9GUr2pYKn9PQmJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxVmacntCEhwlW7MMh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxMX8rJxl-gD74Tw7N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw6jyWTPCZbNoj29EV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzOeA4j9MJvJ_mDLv94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxu84KEN_5gy_ufcqV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
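
A batch response like the one above can be parsed and indexed so that any coded comment is retrievable by its ID, which is what the lookup at the top of this page provides. A minimal sketch, assuming the model returns a plain JSON array with the field names shown (the function name is illustrative):

```python
import json

# Minimal sketch: parse one batch of codings and look up a single
# comment by ID. Assumes the raw model output is a plain JSON array
# of records like the ones above.
def index_codings(raw: str) -> dict[str, dict]:
    """Map each comment ID to its coding record."""
    return {record["id"]: record for record in json.loads(raw)}

raw = '''[
  {"id": "ytc_UgzOeA4j9MJvJ_mDLv94AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"}
]'''

codings = index_codings(raw)
print(codings["ytc_UgzOeA4j9MJvJ_mDLv94AaABAg"]["emotion"])  # approval
```

In practice the raw model output may need light cleanup (for example, stripping markdown code fences the model sometimes wraps around JSON) before json.loads will accept it; the sketch assumes the clean case shown on this page.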