Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by its comment ID.
Random samples:

- "Why do you have to make all your statements so divisive? You're just as bad as t…" (ytc_UgwlyVc_F…)
- "If I had to guess, laziness. While we usually think of the stereotypical “AI ar…" (ytr_Ugzi1tj6C…)
- "If A.i becomes too big there will be the mega rich and then the poor…" (ytc_UgyNCR_zC…)
- "Ai art is fine if its just for something fun and personal, not if your gonna sel…" (ytc_UgxUSin1I…)
- "When AI can come dig a hole for me, form it up, tie the rio and handle the line …" (ytc_Ugy30GuA7…)
- "if AI ever does take the world it will be because some idiot broke it out of jai…" (ytc_UgysA6GV7…)
- "The world is changing. If one doesn’t change with it, they’ll be left behind. A.…" (ytc_UgwE1GV41…)
- "It's nuts that the prophets of sci fi have told us this for almost a century, ye…" (ytc_Ugw56mVii…)
Comment
You say "program them to feel pain" but isn't programming in an aversion to danger essentially a pain response? If you program something to avoid things that are damaging it, what makes that different from 'pain'?
Infact, what makes current computers not 'conscious' to a small degree?
As for robot 'slaves'. Just program the AI to naturally *want* to do the tasks. If you program an AI that wants freedom, why are you using it for manual labour? You'll find all of nature programs us to want to do the things nature wants us to do. We love eating. We love reproducing and spreading out. We hate things that stop this. You can argue the ethics of conscious machines, but I don't think the 'slavery' aspect will ever need to come into it.
Source: youtube · Video: AI Moral Status · Posted: 2017-03-04T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |

Coded at: 2026-04-27T06:24:53.388235
Raw LLM Response
[{"id":"ytc_UgiA1_INbJOFTXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugi1nrPKExbHOHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UggFGTUIov_oOHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghdOolC8joZ6ngCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UghAw59QZBitCngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjT-wD9PuFMo3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggRBlCDj7mB73gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiLuFIX4HCn7HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjcZKTKJEoieXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UggVd289Q9KLTngCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"indifference"})