Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Your brain is only the motor, but God is the gas that it takes to run. if you co…" (ytc_UgytXRDeT…)
- "How many generations did you create to say \"this is what Suno gave me\"? First sh…" (ytc_UgzQsWYtQ…)
- "Hey @hardhittertvname, thanks for commenting! You're right, fighting a robot can…" (ytr_Ugz_dAw88…)
- "They don't even look like real Ghibli characters, they look like those soulless …" (ytc_UgwAlATW8…)
- "Saying that using AI (or any other kind of technology) makes you less human see…" (ytc_Ugx0_Ly6o…)
- "Is it legal for govt to step in and pass policies limiting how much ai can repla…" (ytc_UgxZZwp4n…)
- "i think 1 thing is sure education going to change so need to accept it, AI educa…" (ytc_UgwBAhotX…)
- "Germany has notoriously bad intenet. I went from supporting better internet in G…" (ytc_UgyWNb1Rk…)
Comment
> If we can't even give basic rights to animals that are being used in malicious ways, what point is there to champion rights for something that will never actually feel pain, or is as disposable and dispensible as an automation? Robots don't deserve the same rights as something naturally brought into this world.. any emotion a robot will feel will be artificial, just a byproduct of it's programming, they will never have a true spark of life like an organic being, and If we can't champion the causes of real life (Plant, animal) what actual right do we have championing rights of a non-feeling tool? It would be morally wrong..

Source: youtube · Video: AI Moral Status · Posted: 2017-02-23T23:2… · ♥ 12
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_Uggkpj0484okHngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgimK6yyUxqCPngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjdaVMSD2k3GXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiIzZZq-qyAXXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgggDM17Tp1NPngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjdtF2J32eE-3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UggWtTsvmDUhMHgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiCmZjZisr1angCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggkPA0VqLLUbngCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UggCs_iuvqXwUXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"}]
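A response in this shape can be parsed and indexed by comment ID with a few lines of code. The sketch below is a hypothetical illustration, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the records shown above, and any dimension the model omits is filled with `"unclear"`, mirroring the fallback seen in the coding-result table. The two sample records are abbreviated copies of the first two entries in the raw response.

```python
import json

# Raw model output: a JSON array of coding records (two sample entries,
# copied from the response above).
raw = '''[
{"id": "ytc_Uggkpj0484okHngCoAEC", "responsibility": "ai_itself",
 "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
{"id": "ytc_UgimK6yyUxqCPngCoAEC", "responsibility": "ai_itself",
 "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_response: str) -> dict:
    """Parse the model's JSON array and index records by comment ID.

    Any dimension missing from a record defaults to "unclear",
    matching the fallback shown in the coding-result table.
    """
    records = json.loads(raw_response)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codings = index_codings(raw)
print(codings["ytc_Uggkpj0484okHngCoAEC"]["policy"])  # regulate
```

Indexing by ID makes the "look up by comment ID" view a single dictionary access; a real pipeline would also want to catch `json.JSONDecodeError`, since malformed raw responses (such as a stray closing character) are exactly the failure mode this page is for inspecting.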