Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The more incels deepfaked into man on man adult films, the more interesting this…
ytc_UgzGkSNhZ…
Reminds me of a story where an android encounters a broken robot who's systems s…
ytc_Ugxh931ol…
These people who think AI will take over many jobs are the people who will lose …
ytc_Ugw_Us9BM…
Yes its all scripted in a interview with will smith you can see a man with compu…
ytr_UgzIZKm4Z…
It's similar to what happened with the E-Scooter/micro mobility companies: let's…
ytc_UgzUj44MX…
Oh my god this is so infuriating. You're not an artist if you don't make the art…
ytc_UgzlntvEQ…
It already is and everyone knows almost evryone is just in denail. (this doesn't…
ytc_UgxZmgJDQ…
The reference arguments are extra infuriating because it's now so much harder to…
ytc_UgzCLviUS…
Comment
I think the question is less "do robots deserve rights" (since humans are just super complicated carbon-based machines) and more "what rights would they need?". After all, out concept of rights is build on the needs of our own programming. An AI would have completely different needs to the point where it'll be pretty hard to even determine what they are (especially if people keep treating AI like humans-lite :/)
youtube
AI Moral Status
2020-10-21T15:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | contractualist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugz7gb_iY38f6JbmYrl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyUHl3in4V1rfNxB594AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwEYpXOLbe-ozb2HtR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgwOjJNxICTqiQYueCB4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzEVBfiHkf-JrL-5_N4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyFMTY9zcl_Jmc86YR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgytTmGjOLb6-7Mg1U54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwjoY0Qjx_Ra7lz1k14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzBJ9VPDmyA9aeXfIt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxOqho_RZvRdZm0e754AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
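The raw response above is a JSON array of one coding object per comment, each carrying the four dimensions from the table (responsibility, reasoning, policy, emotion). As a minimal sketch of how a pipeline might parse and sanity-check such a response, the snippet below validates each record against allowed category sets. Note the `ALLOWED` sets and the `validate_codings` helper are assumptions inferred only from the values visible on this page, not from the project's actual codebook.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Hypothetical schema: the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"mixed", "consequentialist", "contractualist",
                  "deontological", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "approval", "fear", "outrage", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must be an object with a comment ID ...
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # ... and a known value for every coded dimension.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Filtering out malformed records before storage keeps a single bad LLM response from corrupting the lookup-by-ID index.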