Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up by its ID.
Random samples
* "no I don’t think it’s a race thing, but I definitely think that AI has a hard …" (ytr_UgyP7RCTp…)
* "I, personally, can see a big danger that no one else seems to ever bring up: U…" (ytc_UgwpWNd8b…)
* "ai art stops commissions which is what lots of artists rely on to live, no commi…" (ytr_UgxwSgOCI…)
* "But that’s not machine learning AI and it’s clearly a “robot”. The chat bots bei…" (rdc_jrp34sw)
* "We don’t get things ”for free”. The knowledge is stolen and saved into the LLM.…" (ytc_UgxAjMWNS…)
* ". my chatgpt said sybau 🥀 >:( my sister's said \"im sorry if ur dumb but it's …" (ytc_UgxxVBACG…)
* "If AI were to try to solve the many major problems humanity has created, it woul…" (ytc_Ugz_X9xv9…)
* "I believe that all social madia platforms should ban outright the usage of AI ge…" (ytc_Ugzwyp_0T…)
Comment
I would define consciousness as this:
* Can the creature communicate with others?
* Can the creature theoretically establish civilizations?
* Does the creature react to sensations?
* Does the creature react to others?
* In the case of a robot, can they act outside of their own programming?
* Could the creature be defined as hyperintelligent, like humans?
This is of course a massive oversimplification, but that's generally the idea. If a creature can establish and maintain a civilization as well as have a certain degree of intelligence, I'd argue that they're conscious. Now, if a robot does check all these points + some other more specific points, then they would deserve rights. The first right would be the right to debate what rights robots deserve.
Source: youtube · AI Moral Status · 2020-10-04T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
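The four coding dimensions plus the timestamp above map naturally onto a small record type. A minimal sketch in Python; the field names come from the table, but the value vocabularies listed in the comments are only the labels seen in this dump, and the real code book may define more:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CodingResult:
    """One coded comment; fields mirror the Coding Result table."""
    comment_id: str      # e.g. "ytc_...", "ytr_...", "rdc_..."
    responsibility: str  # values seen: "none", "ai_itself"
    reasoning: str       # values seen: "unclear", "deontological", "consequentialist"
    policy: str          # values seen: "unclear", "none", "liability"
    emotion: str         # values seen: "indifference", "outrage", "approval", "fear", "mixed"
    coded_at: datetime

# The row shown above, as a record:
result = CodingResult(
    comment_id="ytc_UgwM1cpTwYfN-oRvNth4AaABAg",
    responsibility="none",
    reasoning="unclear",
    policy="unclear",
    emotion="indifference",
    coded_at=datetime.fromisoformat("2026-04-27T06:24:59.937377"),
)
```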
Raw LLM Response
```json
[
  {"id":"ytc_UgwM1cpTwYfN-oRvNth4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyPny6RZcbzcp_O98N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw4Eix5YaM0gvjHNBl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzKpV1xk7FkpRpgiE14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxPF3X7FxfiKey5zEZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyOjrTA5d9R3iiHKaN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwTbLz-_T8DJomq0Gl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxvZio8J3d42zYPJX54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyvHXNivd7GzGcFOQ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw0SlQX_u9A8_g-IAR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
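The raw response above is a JSON array of per-comment codes, one object per comment in the batch, so it can be parsed and sanity-checked before being stored. A hedged sketch, assuming every element carries exactly the five keys seen here:

```python
import json

# Keys every batch entry is expected to carry, per the response above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_llm_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into a mapping from comment ID
    to its coded dimensions, rejecting entries with missing keys."""
    coded = {}
    for rec in json.loads(raw):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"entry {rec.get('id')!r} is missing keys: {missing}")
        coded[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS if k != "id"}
    return coded

# Two entries from the batch above, used as a smoke test:
raw = '''[
  {"id":"ytc_UgwM1cpTwYfN-oRvNth4AaABAg","responsibility":"none",
   "reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxPF3X7FxfiKey5zEZ4AaABAg","responsibility":"none",
   "reasoning":"deontological","policy":"liability","emotion":"fear"}
]'''
coded = parse_llm_batch(raw)
```

Validating before storage matters here because a single malformed element would otherwise silently drop one comment's codes from the batch.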