Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

AI today works by learning statistical patterns in huge datasets and predicting the most likely response. It can simulate reasoning, emotions, and personality, but it has no inner experience. According to current science, AI has **no consciousness, awareness, or self**. It doesn’t feel, care, or know it exists. Claims that AI is conscious come from human projection and language fluency, not evidence.

Source: youtube · AI Moral Status · 2025-12-19T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[{"id":"ytc_UgzkAYXpGhUzq3nU-3t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwsZSz76FPQ_S_w_GB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgyufBdOXsWuBJhnDzF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgwPcAV06bpA-sclHa14AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwAgGXvEaaT4YqI7mF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgzarmPOYiSF7P03C-F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyuOwaPaFub47DO1Wt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyzeGhWK-Gl1ZWBD-N4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugx09N09SovOmKxOakB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgzK7UpUYB9eVAJtn594AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"]}
```
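Note that the raw response above ends with a misplaced bracket (`"approval"]}` rather than `"approval"}]`), so the array is not valid JSON — which is consistent with the Coding Result showing every dimension as "unclear". A minimal sketch of a parser that degrades gracefully on such failures (the function and helper names here are hypothetical, not the dashboard's actual code):

```python
import json

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    If the response is not valid JSON (models occasionally emit a
    misplaced bracket, as in the example above), per-comment codes
    cannot be recovered, so an empty mapping is returned and the
    caller falls back to 'unclear' on every dimension.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return {}
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if cid:
            coded[cid] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return coded


def codes_for(comment_id: str, coded: dict) -> dict:
    # Default every dimension to 'unclear' when parsing failed or the
    # comment is missing from the batch -- matching the table above.
    return coded.get(comment_id, {dim: "unclear" for dim in DIMENSIONS})
```

Under this assumption, a single malformed bracket anywhere in the batch response would explain an entire coding result of "unclear" values, since the whole array fails to parse at once.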