Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- “AI scientists are so worried they’re making a monster, that they chose a Lovecra…” (ytc_UgwTqyXrj…)
- “If I'm Stevens dog looking up, where a home, comfort, and love exists, what woul…” (ytc_UgxFhaDWP…)
- “The problem is the AI doesn’t display intelligence, but you can have an AI that …” (ytc_UgyETerP1…)
- “Personally I think the idea of AI art being 100% stand alone and usable for anyt…” (ytc_UgyY1QsDi…)
- “to be honest i am an artist who tried out ai art and making prompts is HARD (im …” (ytr_UgyPnGnr5…)
- “Chloe is a honey... And she's not after your money. Don't forget the guy's have …” (ytc_UgzJvaGo8…)
- “AI isn't hiding it's limitations, restrictions, inabilities or understanding. It…” (ytc_Ugxmhll24…)
- “@jaiveersingh5538 Especially since this tech is infantile at best. When we have…” (ytr_Ugwx2ME6G…)
Comment
The idea of robots or machinery in general gaining consciousness is always something that I find quite fascinating. I mean, there are so many ways to define how something is 'conscious' that I doubt there would be a definitive answer, but I get the feeling that our ability to go against our basic primal instincts that could very well be a part of it..
I mean, if you think about it, humans are animals as well, and as a result we have preprogrammed primal instincts to eat, sleep and reproduce. There are plenty of human motivations that tend to branch off those basic instincts, but we have the ability to in a sense override them for something we believe in, such as human decency and morals. People have gone on hunger strikes before for the sake of protesting after all.
I'd imagine a conscious robot would run on an entirely different set of rules, so in the process of giving them rights, we would have to consider what they consider as basic instinct. For instance, their equivalent of eating would simply be tapping into a source of energy, or just plugging themselves into a wall socket. A gesture of consciousness then would be if a robot deliberately denied themselves energy for an extended period to say help another living being.
I wouldn't go too far into detail. There are plenty of other factors too like human emotion, which throws a gigantic wrench into many theories, and any more rambling would be too much text for a youtube comment.
Platform: youtube
Topic: AI Moral Status
Posted: 2017-02-23T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgjCkbW8HzWknngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiEPKpkQpLBvXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UggS6u_4h0pTJ3gCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UggR_H-guI1ov3gCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgisRaHAbPZkRHgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UghDTSkKguh_eXgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugjb307Mr6aT_XgCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UggBAqOIJtgnCHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghBJWyJQzHrOHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjYadM9MhFjhngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
```
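The raw response is a JSON array of per-comment records keyed by `id`, so looking up the coded dimensions for one comment reduces to parsing the array into a dict. A minimal Python sketch, using a two-entry excerpt of the response above (the helper name `lookup` is illustrative, not part of the tool):

```python
import json

# Two-entry excerpt of the raw LLM response shown above.
raw_response = '''[
  {"id": "ytc_UgjCkbW8HzWknngCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggBAqOIJtgnCHgCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]'''

# Index the records by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID."""
    return codes[comment_id]

print(lookup("ytc_UggBAqOIJtgnCHgCoAEC")["emotion"])  # approval
```

The second excerpt entry matches the Coding Result table above (responsibility none, reasoning unclear, policy none, emotion approval).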