Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "@thewannabecritic7490 how can you repeatedly say so many thin…" (ytr_UgxxSPRIR…)
- "Yeah definitely feels right / Though, AI in art is an interesting prospect to me,…" (ytc_UgxCSB-ej…)
- "Clip 2 is not AI. she is going crazy because it is her first time trying sparkli…" (ytc_UgxLpyXt2…)
- "AI has ONE good use!!!!! PROGRAMING!!!!! THATS IT!!!! Its AI "assistant" not do-…" (ytc_UgxznRmZN…)
- "@boxtoprock8020 Ah okay. / That's good then. / I hate when big companies support AI…" (ytr_UgwElvADp…)
- "any reason is good to reinforce anti-china sentiment / And the point is to ban any…" (ytr_UgzxUjOiH…)
- "The question is if people don't have job, can't spend money.. who will pay AI? 🤣…" (ytc_UgzxD5Dlc…)
- "As far as I'm converned people who make Ai art aren't artists but Commissioners.…" (ytc_Ugw0pJzHi…)
Comment
It's simple and not even complicated. If they feel pain and are conscious like you put it; we should treat them with human respect like we should to EVERY living thing. If we make robots that aren't equipped with feelings/consciousness/pain then we can treat them as slaves as they're just another man made tool, like a screwdriver.
This is about humane or being an ass. You're not going (or not supposed) to make a dog work at a mine and whip them to go faster would ya?
If robots start arguing that the slave robots (ones without AI capable of consciousness) are like them, and should be treated as equal, then the fact is that the robot needs to be reprogrammed/educated because the human that made it fucked up.
youtube · AI Moral Status · 2017-02-23T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
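For readers working with exports of these records, here is a minimal sketch of the per-comment structure, assuming the four dimensions shown in the table plus the comment ID. The value lists in the comments are only the categories observed in the sample batch below, not necessarily the full codebook, and the class name is illustrative rather than the tool's actual schema.

```python
from dataclasses import dataclass


@dataclass
class CodedComment:
    """One coding result; field names follow the raw LLM response below."""
    id: str              # comment ID, e.g. "ytc_UgjdaVMSD2k3GXgCoAEC"
    responsibility: str  # observed: ai_itself, user, developer, none, unclear
    reasoning: str       # observed: deontological, consequentialist, mixed, unclear
    policy: str          # observed: regulate, liability, ban, none, unclear
    emotion: str         # observed: approval, fear, indifference, resignation, mixed
    coded_at: str = ""   # ISO timestamp, e.g. "2026-04-27T06:26:44.938723"
```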
Raw LLM Response
[{"id":"ytc_Uggkpj0484okHngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgimK6yyUxqCPngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjdaVMSD2k3GXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiIzZZq-qyAXXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgggDM17Tp1NPngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjdtF2J32eE-3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UggWtTsvmDUhMHgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiCmZjZisr1angCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggkPA0VqLLUbngCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UggCs_iuvqXwUXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"})