Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Then should we give rights to video game AI that respond to damage and actively try to stop it? What about advanced computer viruses than cling onto their existence and refuse to be deleted? Practically they are feeling "pain". The question is not whenever they feel pain or have emotion but if they do it they was we do. If computers receive the same neural system we and other animals have then we can talk about their rights.
youtube · AI Moral Status · 2017-02-24T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
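Each coding assigns one value per dimension. As a sanity check, a row can be validated against the value vocabularies that appear in the raw responses on this page. A minimal sketch, assuming the vocabularies below (inferred from the codings shown here; the actual codebook may define additional categories, and the function name is illustrative):

```python
# Allowed values per dimension, inferred from the raw LLM responses shown
# on this page; the real codebook may include further categories.
DIMENSIONS = {
    "responsibility": {"none", "developer", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"indifference", "outrage", "mixed", "approval"},
}

def invalid_fields(coding: dict) -> list:
    """Return the dimensions whose value falls outside the inferred vocabulary."""
    return [dim for dim, allowed in DIMENSIONS.items()
            if coding.get(dim) not in allowed]

# The coding result shown in the table above passes the check.
row = {"responsibility": "unclear", "reasoning": "deontological",
       "policy": "unclear", "emotion": "mixed"}
print(invalid_fields(row))  # []
```

A row with an unknown or missing value for any dimension is flagged, which makes it easy to catch model outputs that drift from the expected schema.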
Raw LLM Response
```json
[{"id":"ytc_Ugizh8nsOE91DngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggyl3hVgRsJR3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugg3NvlXnGLtkHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjIf01qQSO2LXgCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghC_fBL9IzRwngCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiW6UEDZs8n7XgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjZjn-YzcpzIngCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Uggbivfnf2X5BHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggKtzN8-y1cSHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UghTBvSlrH_EcXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
```
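The raw response is a JSON array with one object per coded comment, keyed by `id`. Looking up the coding for a single comment can be sketched as follows (the function name is illustrative, not part of the tool itself; the sample below uses two rows copied from the response above):

```python
import json

# Two rows copied verbatim from the raw LLM batch response above.
RAW_RESPONSE = (
    '[{"id":"ytc_UgjIf01qQSO2LXgCoAEC","responsibility":"unclear",'
    '"reasoning":"deontological","policy":"unclear","emotion":"mixed"},'
    '{"id":"ytc_UgiW6UEDZs8n7XgCoAEC","responsibility":"none",'
    '"reasoning":"deontological","policy":"regulate","emotion":"approval"}]'
)

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgjIf01qQSO2LXgCoAEC"]["reasoning"])  # deontological
```

Indexing by `id` makes the "look up by comment ID" workflow a single dictionary access, and a `KeyError` immediately signals a comment the model never returned a coding for.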