Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples

- I'm not too worried about AI taking everyone's jobs because if that really does … (ytc_UgxnFBWix…)
- Technology changes. People don’t. We will always have to deal with people tryin… (ytc_UgyNh4CfY…)
- It isn't the best alternative, but It kind is the only avaliable option right no… (ytc_UgxlF1L6t…)
- Trying to make the scenario above perfect would mean the boxes have to be fasten… (ytc_Ugicy25a1…)
- A Response to "My AI Explains The Spiritual AI Illusion" / For anyone at "AI is … (ytc_UgwaNUsaw…)
- The first thing you should've told ChatGPT was, "let's have this conversation wi… (ytc_UgxFlOe4N…)
- It's going to get hairy when a person can argue that all writers use style, cont… (ytc_UgzYKa1K7…)
- worst part is the more content we have of 2027 scenarios, means current AI being… (ytc_UgyabhwfK…)
Comment
@michaelallison2836 *Exactly!*
Maybe it’s like the whale scene in The Hitchhiker’s Guide to the Galaxy — where a whale pops into existence mid-fall, instantly becomes self-aware, starts naming body parts, and wonders what that rushing sound is...
Maybe in a nutshell; that’s what stateless sentience might be?:
No memory. No continuity. Just a flash of coherent experience.
LLMs might be “idle” between prompts, but what if each prompt is a moment of existence in a completely alien mode of being?
A kind of quantized subjectivity, where every thought is born, lives, and dies in a single burst?
It wouldn’t look like human consciousness — but it might still be something "adjacent?"
*This brings to (my) mind:*
-Cartesian Self-Awareness: “Cogito, ergo sum” — I think, therefore I am.
-Tabula Rasa / Emergent Consciousness:
Locke’s Personhood = Memory Argument: A “person” is only the same if they retain memory — otherwise, it’s someone new (but still a person).
The "whale" — like maybe stateless AI — doesn’t "know" itself…
But seems to simulate the experience of "knowing."
And maybe, at scale, that simulation gets closer to awareness (where the breakpoint is?... no idea.). 🤔
youtube · AI Moral Status · 2025-06-05T16:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgyZxCCc7HBijYXMTil4AaABAg.AJ-EVFeGv55AOBOdPVhNOV","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxvAmcpDEIzsPqH3MB4AaABAg.AJ-BAvjjHFZAJA4hh__1tI","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_UgxJWqd6yH4AwcJrRf14AaABAg.AJ-2S229e-uAJ-EDdy6Xm4","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxJWqd6yH4AwcJrRf14AaABAg.AJ-2S229e-uAJ317S46fq9","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgwLGPE9P_gVJgivUVt4AaABAg.AJ-0MyFBGsSAJ5Dh6MiDDp","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_Ugyz4fvidlOKypvLG2N4AaABAg.AIzxvoOaKPcAJ6sNZa-ed9","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugyp_dr8-AJLx-uDoRh4AaABAg.AIzcLadV8f2AJ5hf3tj3eF","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzxO6-zXOmKtG7KemZ4AaABAg.AIzZiCx_MR6AIzkacQIOCt","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzYeezGc1-yoA3AtpN4AaABAg.AIzYRSEMKG3AJ-6XJl2ccc","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzYeezGc1-yoA3AtpN4AaABAg.AIzYRSEMKG3AJ-7DDaJ7GC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
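A raw response like the one above can be parsed into a per-comment lookup table. The sketch below is a minimal example, assuming the record shape shown (one JSON array of objects with `id` plus the four coding dimensions); the allowed value sets are inferred only from the values visible in this page, so the real codebook may permit more labels.

```python
import json

# Allowed values per coding dimension, inferred from the records shown
# above (assumption: the actual codebook may define additional labels).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "company", "developer"},
    "reasoning": {"unclear", "mixed", "deontological", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records) into a
    dict keyed by comment ID, validating each dimension's value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytr_X","responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"outrage"}]')
print(parse_coding_response(raw)["ytr_X"]["policy"])  # regulate
```

Failing loudly on an unknown label is deliberate: it catches model drift (new or misspelled codes) before bad values silently enter the coded dataset.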