Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The only thing overrated is the opinion of this idiot. I would not judge AI on t…
ytc_Ugw6ruBlU…
this comes under learning of the ai , as a book doesn;t give me experience but g…
ytr_UgzJ-PyHD…
Where dide you find that non sense mate ? Always been the same dude making the v…
ytr_UgybBpWvp…
Everybody has a right to use our own formal way of doing things So yeah how abou…
ytc_UgwZzB16X…
Everyone is entitled to their opinion, but it’s a problem when people who have v…
ytc_UgxSR60aa…
The thing is, it isn't even just artists. As a physicist, and really all the sci…
ytc_UgwxpU1-W…
You're basically saying that AI can become conscious but would not want to live …
ytc_UgzTlk72L…
1. AI Trainers 💻 2. Renewable Tech 🌱 3. Creators 🎨 Which would YOU choose? Comme…
ytc_UgxFWidI8…
Comment
If you give an LLM the same prompt, and the same seed, it will produce the exact, same "conscious" response again. You can't define consciousness this way, because it is deterministic. But if you define consciousness as a repeatable, self-referential, recursive experience. Then you could indeed give it a self-referential, recursive prompt, and have it re-produce the same internal experience over and over. (But only while the machine is actually running). And some people live their lives this way, doing the same routine every day. So we can't really escape the conclusion that it depends on how you define consciousness.
youtube
2026-04-18T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugx9tCi2PWXEpjA8nQF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgwPkfsH9a_6zmdlY6x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},{"id":"ytc_UgwqkOUnnDFxVJ9jSbd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_Ugzv56gN91YBqoBWUaB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},{"id":"ytc_UgwrNl8CaufYUou_H-F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},{"id":"ytc_UgwmmVPXDN2QAFMhlQh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},{"id":"ytc_UgzrzygFfq3gEoaXH0V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgylKNmY_Bd17tfL_1t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},{"id":"ytc_Ugz3nG_vxHFSmi7JW_F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgzMe0yyaTzsKRQ2le94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"}]
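The raw response above is a JSON array of per-comment codes, one object per comment ID, with the same four dimensions shown in the coding-result table (responsibility, reasoning, policy, emotion). A minimal sketch of the look-up-by-comment-ID step, assuming the raw response parses cleanly as JSON (only a two-entry subset of the array is reproduced here), could be:

```python
import json

# Subset of the raw LLM response above: a JSON array of per-comment codes.
raw = '''[
  {"id": "ytc_Ugx9tCi2PWXEpjA8nQF4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzMe0yyaTzsKRQ2le94AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]'''

# Index the coded entries by comment ID for constant-time lookup.
codes_by_id = {entry["id"]: entry for entry in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; KeyError if uncoded."""
    return codes_by_id[comment_id]

print(lookup("ytc_Ugx9tCi2PWXEpjA8nQF4AaABAg")["emotion"])  # indifference
```

In practice the full array would come straight from the stored raw response rather than an inline string; indexing by `id` is what makes the "Look up by comment ID" view cheap regardless of batch size.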