Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up by its ID.
Random samples:

- "don't worry, stupid people like politics will never allow AI to take full contro…" (ytr_UgykPqCUS…)
- "@laurentiuvladutmanea I swear, with ChatGPT being financed by _Microsoft_ so muc…" (ytr_UgwEPy4Os…)
- "Fun fact: ChatGPT can’t generate images 😂😂😂, I just exposed all these videos, oh…" (ytc_UgzsbkDBZ…)
- "Good. Let them be wrong. 💕 make art. Make it bad. Make it often. From your own h…" (ytc_UgxLbOKJu…)
- "Most people don't understand computing power doubles every 18 months. That means…" (ytc_UgxNhQfo1…)
- "Thanks for your comment, @alexanderjamesandoy3093! Did you pray to the Robot Ove…" (ytr_UgwKos-bw…)
- "Chatgpt is a language engine. It's not perfect so you can trip it up. Leave the…" (ytc_UgymA79iH…)
- "With a self driving car.................the cabbie won't be chatting you up !!!!…" (ytc_UgwdlO31g…)
Comment
"Consciousness is to Psychologists, like life is to Biologists -- we know WHAT it is, but we have a harder time defining it."
-Hank Green
I think robots deserve rights when people start to recognize AI as full-on I. When people understand that they are fully conscious and aware of that fact, is when we understand that they deserve rights.
The question is, when will that be, and how soon would humans implement robot rights?
A better question, and an argument for the 'Aware of pain' argument, is that, if you program a robot to feel pain, is that already denying them the right of not having them feel pain? If they need to be artificially altered to feel pain, is that immoral?
Source: youtube — "AI Moral Status" — 2017-02-23T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgguP2YK6Evav3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugh2mlfZdYufvHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugha9jkNx4mc5XgCoAEC","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgiWDs3vwBNUm3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgheyRWddzrNB3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UghnyBtTnaUr-HgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UghS3cv6xjhhLHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjyf06LYA3O0ngCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgiuONZhggJmMHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugj-mRAJ04tKY3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
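The raw response above is a JSON array of per-comment codings, one object per comment. A minimal sketch of how such a batch might be parsed and indexed by comment ID — the field names follow the Coding Result table, while the sample IDs and the missing-key handling here are hypothetical assumptions, not the tool's actual pipeline:

```python
import json

# Hypothetical batch response; IDs shortened for illustration.
RAW = '''[
  {"id": "ytc_a", "responsibility": "developer", "reasoning": "deontological",
   "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_b", "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"}
]'''

# Keys every coding row must carry, per the Coding Result table.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_codings(raw: str) -> dict:
    """Parse the model's JSON array and index codings by comment ID,
    silently dropping any row that lacks a required key."""
    rows = json.loads(raw)
    return {
        row["id"]: {k: row[k] for k in REQUIRED_KEYS if k != "id"}
        for row in rows
        if REQUIRED_KEYS <= row.keys()  # dict.keys() supports set comparison
    }


codings = parse_codings(RAW)
print(codings["ytc_b"]["emotion"])  # indifference
```

Indexing by ID makes the "look up by comment ID" view above a single dictionary access rather than a scan of the array.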