Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect):
- ytc_Ugz5tuM6h… — "I would love to hear DAN speak with an Emperor Palpatine or Darth Vader voice. T…"
- rdc_jflx7iy — "Its trained on a large amount of data, some of which is likely incorrect. But it…"
- ytc_UgyAi8XLZ… — "How are demons and AI any different? All the knowledge of the world, tells you w…"
- ytc_UgyBadyJt… — "Perhaps if the gathered human consciousness were more evolved, then Ai would lea…"
- rdc_o3gs5zt — "I remember anthropic safety report where they removed the training data to solve…"
- ytc_UgwvV7Edv… — "The thing is in my experience bots are replacing jobs and saving companies money…"
- ytc_UgzIQfXL3… — "AI is crushingly boring. I can't watch this video through because AI is so damn…"
- ytc_UgyeFRIVD… — "Of course they will be replaced. All executers will be replaced. Some might stay…"
Comment
Let me say it this way: people, who think AIs are sentient and have been awakened, have no clue what AI really is. Despite of the fact that we don‘t have even the slightest clue what consciousness is, consiousness is definitely a prerequisite for sentience. As an AI researcher I am not amused about such statements. Let alone the constrained context memory of LLMs that allows an AI to only remember a small amount of knowledge and reasoning capabilities that are rather limited. If you speak to an LLM, it will forget everything it knew after a few conversations. The illusion of sentience, though, is only an illusion. Current LLMs are the Elizas (Weizenbaum) of the 21st century.
Source: youtube · AI Moral Status · 2025-07-09T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyEmxHXkbkJJObS7JB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwytXrUQ03urxnzh9l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzOvo6sm-NTi8th1254AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyeyeuRdolUqENmtdh4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwmlsUCR32cVWt0evh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzYTX4GRFpRtkZDuQN4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwGvT-GfWMfQPHR28J4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz9zNxRQVmvjrgfMTx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzHgnFhhz00J0SOs3F4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw-pBPNZJKPC2xRtd14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
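A raw response like the one above can be turned into the per-comment coding table shown earlier by parsing the JSON array and indexing it by comment ID. The sketch below does this in Python; the allowed value sets are inferred only from the codes visible on this page (the actual codebook may define additional categories), and `parse_coding_response` is a hypothetical helper name, not part of any real pipeline here.

```python
import json

# Allowed values per coding dimension, inferred from the codes visible in the
# raw responses on this page; the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "outrage", "fear", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of objects) into a
    lookup table keyed by comment ID, validating each dimension's value."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{comment_id}: unexpected {dim!r} value {rec.get(dim)!r}")
        coded[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One entry copied from the raw response above.
raw = ('[{"id":"ytc_UgzOvo6sm-NTi8th1254AaABAg","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgzOvo6sm-NTi8th1254AaABAg"]["emotion"])  # -> outrage
```

Validating against a fixed value set catches the common failure mode where the model invents an off-codebook label; such rows surface as an error instead of silently entering the coded dataset.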