Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below.
Random samples
- "The US is leading the way"...maybe in commercializing AI, but I'm pretty sure a… (ytc_Ugx04xxwx…)
- Forget words you simple minded humans if AI wants it will be so smart that it wi… (ytc_UgxeodOxV…)
- 😂as soon as ai creates super ai and you dont have a underground bunker and ul we… (ytc_UgyHttiu4…)
- If ai was used for good, to help with healthcare and poverty, increasing housing… (ytc_Ugy8mTX3X…)
- there are fully autonomous mine sites, factories, warehouses, and, soon other in… (ytc_UgyWJJAqu…)
- I take comfort in this oddly, because if AI is unable to navigate in a human wor… (ytc_UgzKAZy5f…)
- so... IDF's AI augmented or lowered the 50k death toll in GAZA? It's scary in bo… (ytc_UgygsxcbE…)
- Mandate profits to be shared. The true reason AI was thought of - was to free up… (ytc_Ugyc2Qol9…)
Comment
i think that as long as it has a conscience, it should have rights. Just look at humans and animals ; look at how the brain functions : connected cells that use an electric impulsion to communicate. Now look at how a robot works : electric impulsions are used as well. Don't tell me you've never thought about it : a human body, especially the brain, is nothing more than a very sophisticated micro-processor. We are complex "robots". You are nothing more than an object, your conscience is nothing more than electricity and a complex disposition of your neurons in the object you call brain. Don't try to argue with me about your religious thoughts ; they dont matter here because you have the right to believe. But science is only made of facts, and has nothing to compare with religion's goal that is to let people believe what they want to.
So yes, a bot should have rights when it gets enough conscience to be compared with an animal/a human.
Source: youtube · Video: AI Moral Status · Posted: 2017-02-23T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[{"id":"ytc_Uggkpj0484okHngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgimK6yyUxqCPngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjdaVMSD2k3GXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiIzZZq-qyAXXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgggDM17Tp1NPngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjdtF2J32eE-3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UggWtTsvmDUhMHgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiCmZjZisr1angCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggkPA0VqLLUbngCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UggCs_iuvqXwUXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"})
```

Note that this is the exact model output, reproduced verbatim; it ends with a stray `)` rather than the `]` needed to close the JSON array, which is the kind of defect this view exists to surface.
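A minimal sketch of how a raw batch response like the one above could be parsed and queried by comment ID. This is an illustrative example, not the tool's actual implementation: `parse_raw_response` and `lookup` are hypothetical helpers, and the bracket repair assumes the only defect is a mismatched closing delimiter (as in the sample output, which ends with `)` instead of `]`).

```python
import json

# Hypothetical raw model output: a JSON array of coded records that ends
# with a stray ")" instead of "]" (shortened to two records for brevity).
raw = """[{"id":"ytc_Uggkpj0484okHngCoAEC","responsibility":"ai_itself",
"reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgimK6yyUxqCPngCoAEC","responsibility":"ai_itself",
"reasoning":"consequentialist","policy":"liability","emotion":"fear"})""".replace("\n", "")

def parse_raw_response(text):
    """Repair a mismatched closing bracket, then parse the JSON array."""
    text = text.strip()
    if text.endswith(")"):
        text = text[:-1] + "]"
    return json.loads(text)

def lookup(records, comment_id):
    """Return the coded record for a comment ID, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

records = parse_raw_response(raw)
hit = lookup(records, "ytc_UgimK6yyUxqCPngCoAEC")
```

A lookup on a missing ID simply returns `None`, so callers can distinguish "comment never coded" from a record whose dimensions came back as `unclear`.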