Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- "Absolutely. If she is pardoned then she NEVER needs to answer another question …" (rdc_oi2tc34)
- "I believe you should be able to copyright a style when it comes to AI. A copyrig…" (ytc_UgxFFFAnY…)
- "Ya you say you’re leaving the zoom. But soon you won’t be able to tell the diffe…" (ytc_UgzjtJIk4…)
- "To me a readjustment to intellectual copyright laws is needed to better reflect …" (ytc_UgzUvEcEn…)
- "Most thoughts provoking of the uncharted, unlimited possiblibilities for humans …" (ytc_UgwxGg_rV…)
- "There's a scene in Persona 5 where these robot workers for a major corporation w…" (ytc_UgxOhCINW…)
- "Not really fascinating. The fact it's hard coded to say it's an AI proves it isn…" (ytr_UgzvQ_-bC…)
- "Idk why some of you even call it Art. I always called those AI images or videos.…" (ytc_Ugzs7aVrU…)
Comment
> I agree a computer will NEVER gain consciousness, because it's following an algorithm. A computer could emulate consciousness, but that is not the same obviously as a being having consciousness. Consciousness as come about in living beings from organic evolution. Nature or God didn't write programming for us to have consciousness. It gradually built we became more aware over millions of years. The human brain does the best of a bad job, in the way the human brain have evolved over millions of years. Consciousness is awareness turned on it itself, so we are aware that we are aware. So to emulate such a process the computer programming would involve a while loop. WHILE awake be conscious. What has not been considered yet in the West, is firstly, the human brain as quantum features which allow us to have precognitive visions, or precognitive feelings , such as knowing beforehand someone close will contact us. This is before we get to understanding the soul.
youtube · AI Moral Status · 2025-07-02T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxvWZQwC15w7ssOLiF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzI_azA5RwTBwO6f-t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7tHhfuZ7GqxYWVXh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwdQV72TltEFkramqh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyi1eERQeZ2WBZzYtt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw3vSZvCRBN3UImvp94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwXupdi0pVh_K0MsYN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzV3QcGZaU4X-0IQDt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwnVUsD54bOEBPLPER4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgygvcndPTywJ1IjU_d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
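A batch payload like the one above can be parsed and sanity-checked before the codings are stored. This is a minimal sketch, assuming the allowed values per dimension are exactly those visible in the table and responses on this page; the real codebook may define additional categories, so `ALLOWED` is an illustrative placeholder.

```python
import json

# Assumption: value sets inferred from the coding results shown on this page;
# extend these if the actual codebook defines more categories.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"approval", "indifference", "mixed", "fear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Each row must be an object with a comment ID...
        if not isinstance(row, dict) or "id" not in row:
            continue
        # ...and a recognized value for every coding dimension.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}]'
print(len(validate_codings(raw)))  # prints 1
```

Rows that fail validation can then be queued for a re-coding pass rather than silently dropped, which is a common way to handle occasional malformed LLM output.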