Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The idea of robots or machinery in general gaining consciousness is always something that I find quite fascinating. I mean, there are so many ways to define how something is 'conscious' that I doubt there would be a definitive answer, but I get the feeling that our ability to go against our basic primal instincts could very well be a part of it. I mean, if you think about it, humans are animals as well, and as a result we have preprogrammed primal instincts to eat, sleep and reproduce. There are plenty of human motivations that tend to branch off those basic instincts, but we have the ability to, in a sense, override them for something we believe in, such as human decency and morals. People have gone on hunger strikes before for the sake of protesting, after all. I'd imagine a conscious robot would run on an entirely different set of rules, so in the process of giving them rights, we would have to consider what they consider as basic instinct. For instance, their equivalent of eating would simply be tapping into a source of energy, or just plugging themselves into a wall socket. A gesture of consciousness then would be if a robot deliberately denied itself energy for an extended period to, say, help another living being. I wouldn't go too far into detail. There are plenty of other factors too, like human emotion, which throws a gigantic wrench into many theories, and any more rambling would be too much text for a youtube comment.
youtube AI Moral Status 2017-02-23T14:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgjCkbW8HzWknngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiEPKpkQpLBvXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UggS6u_4h0pTJ3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UggR_H-guI1ov3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgisRaHAbPZkRHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UghDTSkKguh_eXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugjb307Mr6aT_XgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UggBAqOIJtgnCHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UghBJWyJQzHrOHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjYadM9MhFjhngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
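The raw LLM response is a JSON array with one object per coded comment, each carrying the four coding dimensions keyed by comment id. A minimal sketch of pulling one comment's codes out of such a response (the `codes_for` helper is illustrative, not part of any tool shown here; the field names match the response above):

```python
import json

# One record from the batch above, used here as a stand-in for the full raw response.
raw = (
    '[{"id":"ytc_UggBAqOIJtgnCHgCoAEC","responsibility":"none",'
    '"reasoning":"unclear","policy":"none","emotion":"approval"}]'
)

def codes_for(raw_response: str, comment_id: str) -> dict:
    """Return the coding record for one comment id, or raise KeyError."""
    for record in json.loads(raw_response):
        if record.get("id") == comment_id:
            return record
    raise KeyError(comment_id)

codes = codes_for(raw, "ytc_UggBAqOIJtgnCHgCoAEC")
print(codes["reasoning"], codes["emotion"])  # unclear approval
```

Looking the record up by id is what lets the coded table above (reasoning: unclear, emotion: approval) be cross-checked against the exact model output.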