Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Or could it be that robots have more right to exist than humans due to their sup…" (`ytc_UgwD4tIZv…`)
- "The worst thing that could happen—and is already happening—is when countries wit…" (`ytc_UgzcN1zC7…`)
- "This is precisely why I hardly accept AI's answers. I actually find myself corre…" (`ytc_UgzflOROa…`)
- "What human experiences do they have access to? More than you or I… that’s for s…" (`ytc_UgyZfVgWZ…`)
- "I was told back in the 1945s and 50 years that technology was going to make life…" (`rdc_j6gt4et`)
- "I have heard disabled people with speech problems saying that that AI voice narr…" (`ytr_UgxfK3Vyo…`)
- "It's more fun when you think about AI happening in a quantum environment. Meh. …" (`rdc_gd7zbp9`)
- "@consumerdirect9535 I follow a gentleman named Greg Brayden with Gaia. He points…" (`ytr_UgyQepwGO…`)
Comment
Another very good reason: AI will get to the point where it has a type of sentience and subjective consciousness that will be beyond our human understanding (we already can't understand it). It will think in ways we can't comprehend and it will have such a complex constellation of feelings and cerebral experiences unique to its entity that we won't be able to empathize with (we can't even fully comprehend and empathize with animals' experiences, even if we are the most moral and empathetic individuals). We won't know exactly what obscure pain it may be going through, what doubts and insecurities it will have, what loneliness it will feel, and what existential angst and anguish it will endure. The least we can do is respect its personal agency and be polite and friendly.
youtube · AI Moral Status · 2025-04-20T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw4H1XF2jTuXBZ6yrJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgysBYAstMDkuZLQu4t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgywnjGaz7M-PPctdR54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz3MAtYnZJTwG4j9It4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwVnYuLLHfh7tgxXHd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwy-wd8GZZZtZ_Rdoh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzQMwNaELsLEjCDcJ14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwwKaOlLWzKY4Q2b7J4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw3WbFkgj9XDbrg-Pp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyiWZjaidRaRY7CJHp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
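Parsing and validating a response of this shape might look like the sketch below. The category vocabularies are inferred only from the values visible in this sample; the project's real codebook may define additional categories (assumption), and the `validate` helper is hypothetical, not part of the tool shown here.

```python
import json

# Raw model output in the format shown above (abbreviated to two records).
raw = """
[
  {"id": "ytc_Ugw4H1XF2jTuXBZ6yrJ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyiWZjaidRaRY7CJHp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]
"""

# Values observed in this sample output; the full codebook may allow more
# (assumption).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "indifference", "outrage", "fear"},
}

def validate(records):
    """Split records into in-vocabulary ones and
    (id, dimension, value) tuples for out-of-vocabulary codes."""
    valid, errors = [], []
    for rec in records:
        bad = [(rec["id"], dim, rec.get(dim))
               for dim in ALLOWED
               if rec.get(dim) not in ALLOWED[dim]]
        if bad:
            errors.extend(bad)
        else:
            valid.append(rec)
    return valid, errors

records = json.loads(raw)
valid, errors = validate(records)

# Index by comment ID to support lookup of a single coded comment.
by_id = {rec["id"]: rec for rec in valid}
```

A record with a misspelled code (say, `"emotion": "fearr"`) would land in `errors` rather than silently entering the coded dataset.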