Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "We might well end up being impoverished small farmers in the future, ignored by …" (ytc_UgzcdGmJT…)
- "Claiming to be an \"ai artist\" is like driving a marathon and being mad no one is…" (ytc_Ugx3cD6ul…)
- "Chatgpt is not very good with questions, it will repeat the same response a lot,…" (ytc_Ugw_Ei7AP…)
- "They have everything prepared for the Ai takeover big businesses like Walmart wi…" (ytc_Ugzl1pS-a…)
- "There is no AI,to begin with.There is only human intelligence.Robots can learn t…" (ytc_UgyLMznhg…)
- "I don’t think people go to AI for “good” art. I think they just go for quick art…" (ytc_UgzI71u-l…)
- "There is ending number 4 where ai liberates humanity, but the government convinc…" (ytc_UgyphzTEj…)
- "Using AI to write a book doesn't mean you've written a book. It does mean that …" (ytc_UgyncrdrA…)
Comment
That's a very difficult question to answer. I guess no human inherently deserves anything, because we are just a complex collection of atoms.
We collectively built up our rights in order to keep us safe. We are evolutionarily programmed to feel the need to keep the rest of us safe, too. This used to include only the white people in our own country, but now includes, for many people, all people and even many animals in the world.
The deciding factor on if we will grant rights to robots will be just how much they feel like "one of us". Probably, for a long time they will feel like machines and not at all like humans. But with all the AI and Brain-Machine-Interface research that is going on, we simply might merge with AI. Cyborgs feel like people.
Or, an AI might get so smart that it realizes acting like a human will lead to us giving it beneficial rights.
Source: youtube · Video: AI Moral Status · Posted: 2017-05-14T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | contractualist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UghFzdPE96-vgXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugh85duhMW553XgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UghuQ76Mtq7bmXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UggbfSnvIR1GcXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgiMadlbSIBUj3gCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgippfQcZ5eF2XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ughq7T7pcmvSuHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UggYnPBXwk_QsHgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ughho5exH_I7x3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggduMjarUQUYHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
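The raw response above is a JSON array, one object per coded comment, with the four coding dimensions as keys. A minimal sketch of how such a response might be parsed and validated before display, assuming the allowed values are those seen in the samples on this page (the actual codebook may differ, and `parse_coding_response` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# This is an assumption; the real codebook may include other categories.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"unclear", "deontological", "consequentialist", "contractualist", "mixed"},
    "policy": {"none", "liability", "ban", "regulate", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "approval", "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows.

    A row is kept if it is a dict with an "id" and every dimension
    holds a value from the (assumed) codebook above.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Example with a single well-formed row (hypothetical id):
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"contractualist",'
       '"policy":"none","emotion":"resignation"}]')
print(parse_coding_response(raw))
```

Rows with unknown category values are silently dropped here; a real pipeline would more likely flag them for re-coding rather than discard them.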