Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "To be fair though, how different is that from a human therapist or an author ove…" (ytc_UgwSEa_cS…)
- "Well I bought super grok4 and its all problems. On my pc and phone so many glitc…" (ytc_UgyadPWXG…)
- "It’s not AI yet. It’s too many degrees and even worse the degrees are in poli…" (ytc_UgyO4wqRa…)
- "We have dug a rabbit hole with no way out. Humans own arrogance will be its own …" (ytc_UgwKr0BBY…)
- "The discussion of anthropomorphic raging triggered a flashback to 1968: I was he…" (ytc_Ugyl4G-jG…)
- "The solution is survival and humans are designed to survive , so if it comes do…" (ytc_Ugziy3tOi…)
- "Thank you for your question. The robot's design includes anthropomorphic feature…" (ytr_Ugy8l_DQS…)
- "Senior dev here. This are bs arguments. AI is a massive help tool than can defin…" (ytc_UgyONU1np…)
Comment
Robots are programmed to take orders. AI are made to take prepare codes and orders to come up with an answer and solution. They cannot learn nor feel emotions or sensations. Therefore, the only rights they’d get are how they can be operated and maintained, not treated and given voting power.
Also, slave owners only used the pretense of teaching Christianity to the slaves in order to not look or feel bad about what they were doing. Christianity never had a part in slavery.
Source: youtube · Video: "AI Moral Status" · Posted: 2021-12-15T05:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
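Each coded comment carries one value per dimension. A minimal validation sketch, assuming the value sets are exactly those visible on this page (the real codebook may include more categories):

```python
# Allowed values per dimension, inferred from the codes visible on this
# page; this is an assumption, not the project's full codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "ban", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "mixed", "approval", "fear"},
}

def validate(row: dict) -> list:
    """Return the dimension names whose value falls outside the known set."""
    return [dim for dim, allowed in ALLOWED.items()
            if row.get(dim) not in allowed]

row = {"responsibility": "none", "reasoning": "deontological",
       "policy": "none", "emotion": "indifference"}
print(validate(row))  # -> []
```

A row with a missing or unknown value (e.g. an empty dict) is flagged on every affected dimension, which makes silent coding drift easy to catch before aggregation.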
Raw LLM Response

```json
[
{"id":"ytc_UgzzIQUhlqpmZWxR5r94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLbcAZCmnPK48jcl14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwxoX6CErICy53sEsV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx0Cwz5BFyeVerbH3d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzq4Ua5zWAIesZvxrl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwYnzEHozWs7Lliuvp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzGcqqMO8vgYVQNlQd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwJYiZ-MhIuaurLqnN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugzr5tFR0Vcm6EpLAA94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNUacGL0h1P_jvH114AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
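The raw response is a JSON array of per-comment codes, one object per comment ID. A sketch of how lookup by comment ID could work over such output, assuming this exact shape (the two rows below are copied from the response above; the full array works the same way):

```python
import json

# Two rows copied from the raw response above, for illustration.
raw_response = """
[
  {"id": "ytc_UgzzIQUhlqpmZWxR5r94AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzLbcAZCmnPK48jcl14AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
"""

# Index the coded rows by comment ID so lookups are O(1).
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; KeyError if uncoded."""
    return codes_by_id[comment_id]

print(lookup("ytc_UgzLbcAZCmnPK48jcl14AaABAg")["emotion"])  # -> outrage
```

Indexing once up front avoids rescanning the array for every inspection request, which matters once the coded corpus grows past a handful of batches.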