Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Why is this legal fraud not being prosecuted? They're presenting work to the co…" (rdc_n5j0b4o)
- "Automation always leads to some jobs being lost. Who cares? It's not a point aga…" (ytr_UgzVZveCw…)
- "Is to this for real robot are going to take over human God must a sleep 🚀🚀…" (ytc_UgwvcPDiu…)
- "AI chat apps are TOOLS, not cure-alls! They’re not supposed to “fix” you, they…" (ytc_UgwEZCpqn…)
- "Geez how many millions and billions can one person or company have? You can’t ta…" (ytc_UgwMOWsy_…)
- "How often are people able to break ai chatbots and get them to say terrible thin…" (ytc_Ugyiw6da_…)
- "AI is a reflection of humanity, do we want to see what is in the mirror.…" (ytc_UgzdS-fh-…)
- "My body and mind are drained from owning and operating my blue collar business. …" (ytc_UgzlxKO0O…)
Comment
If it feels pain, it deserves rights. ESPECIALLY if we created it. In a sense, our offspring are AI(s) we created. If our babies deserve rights, robots deserve rights. However, if robots don't feel pain or anything similar, it won't have any way of caring what happens to it. Then it would be fine to use it as labor tools or to help our species grow; because it is just another machine that could do more calculations than our current ones.
Going off on this, I think animals deserve more rights too. Honestly, I'm not vegan and I would never be one, but I think animals should not be killed in ways that deliberately cause more pain then necessary. A slit in the throat is more humane than being boiled alive.
Source: youtube · Video: AI Moral Status · Posted: 2018-06-11T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxJjtlBKdJ2L2lqAA94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxmnmrVSPHaqkaxkGN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx7qorsYArkIGF1s4p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxlZ3Ms6kUw2WCKVBt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyGZFDgijaips-reex4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgygbmYy05g0fZggxEV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyVyEX6KWQGSSYJsGl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz93n58zMs7nMglsmd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw0s2_2SFYACMNZ-t94AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyp72jiEr2pCmxzp9p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
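A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hedged example: the allowed values per dimension are inferred only from the records shown here (the actual codebook may define more categories), and the record shapes are an assumption based on this sample.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the real codebook may permit additional categories.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist"},
    "policy": {"none", "ban"},
    "emotion": {"approval", "fear", "indifference", "outrage"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of records) and
    validate that every dimension holds a known value."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}"
                )
    return records

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"approval"}]')
records = parse_coding_response(raw)
print(len(records))  # → 1
```

Validating at parse time catches the common failure mode where the model invents an off-codebook label, so the bad record is flagged by ID rather than silently written into the results table.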