Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below:

- ytc_Ugx5v1KyC…: "Again, your example of an AI misinterpreting a request to protect your children …"
- ytc_UgxIkE3h4…: "personally, i think ai should only be used to make funny memes and stuff. like t…"
- ytc_UgwbekWnI…: "If ai iends up really bad for humans what about the other creatures in the plane…"
- ytc_UgyrNAW8h…: "The problem may not be the AI itself. It may be its voice recognition technology…"
- ytc_Ugxms7KuM…: "Honestly if CEOs were replaced with unbiased AI, there would be a huge improveme…"
- ytr_Ugyh7I7rU…: "It's crazy... There will definitely be a cleansing at some point, if not we'll j…"
- ytc_UgziRKRr1…: "You can copyright a specific image. You can not copyright a style. No person or …"
- ytc_UgyIZTP1S…: "AI will be capable of taking over the world right after you buy a condo on mars.…"
Comment
I'm not at all convinced that current LLM's are not more "conscious" than the machine learning/transformer-understanders claim. They may not have a true sense of self, an experience of time or continuity, and little to no autonomy, but I'd be willing to bet the same underlying architecture will yield something much closer to true sentience by giving LLM's a physical presence in the real world, autonomy, and real time training feedback.
I'm also not really convinced that human beings' "consciousness" is functionally all that much different than an LLM. We're mostly input/output along with something akin to hallucinations that allows us to think in novel, sometimes useful, ways.
Source: youtube | Video: "AI Moral Status" | Posted: 2025-07-09T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxDM9QZu-EseotLl6p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyXs4kSAGHVNRCrq5J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwFiO-5QApjgrOZBGt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzieBoWjuKGsiUNnkR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwD__Zv1UTSRW-asW54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgygMwlXW4QABMCs9Bt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyXWKaAiI5FO2tXrTJ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxrV3rRs1gUEjB0nrh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugybvtiac4TUK86EHgF4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUlhU09HJrrpk3B494AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
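The raw response above is a JSON array in which each object carries a comment `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch response could be parsed and validated follows; the `ALLOWED` value sets are an assumption inferred only from the values visible in this sample, not a documented codebook:

```python
import json

# Assumed codebook: only the values actually observed in the sample
# responses above are listed, so real data may contain others.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "developer", "user"},
    "reasoning": {"mixed", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "mixed", "fear", "approval", "outrage"},
}


def parse_coding(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: {dimension: value}}."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        codes = {k: v for k, v in row.items() if k != "id"}
        for dim, val in codes.items():
            # Reject values outside the (assumed) codebook so bad
            # model output is caught before it enters the dataset.
            if dim in ALLOWED and val not in ALLOWED[dim]:
                raise ValueError(f"unexpected {dim}={val!r} for {row['id']}")
        coded[row["id"]] = codes
    return coded
```

Keying the result by comment ID makes the "look up by comment ID" view above a simple dictionary access, and the validation step surfaces any row where the model drifted from the expected label set.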