Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
First, there should not be any robot with consciousness that has not been programmed with certain rules; those rules could actually be called instincts and should be the same as human ones, except maybe the sexual ones... (that's a discussion of its own). Second, no machine used for basic tasks or hard labor should be able to have consciousness; a mining worker could be "easier" to program than an entire conscious mind, and those robots could be more specialized (they should be treated as tools, because that's what they are, as animals and trees were for us, like it or not). There is no point in having a conscious toaster, but there is in having a conscious robot that can "synchronize" with every device in the house to give the owner a better, easier life. And of course, you should pay them too, but actually it would be cheaper if an actual human could synchronize in the same way. Also, if you think about it, these are the same questions we would ask if instead of robots we were talking about aliens, and here is where the definitive answer comes into play: any conscious mind has the same rights as the others, if that conscious mind is willing to respect the others as equals. Any visual, sexual, racial, or origin difference doesn't matter if it is a conscious form of life.
Source: YouTube, "AI Moral Status", 2017-03-08T22:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          regulate
Emotion         approval
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UghKkbKM7RfTSngCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgjI9lR0B-QpLngCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UghPZpawqsXIxngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugg3YIAoHWeF73gCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UghZs-vx_DY4WngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugj4TBYHcuy8QHgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Uggj6wVem7oUqXgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghZ2Kej7Awjx3gCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UghHQZM9DEXzg3gCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiGAv21OsCOaHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"}
]
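A raw response like the one above can be parsed into per-comment codings with a few lines of Python. This is a minimal sketch, not part of the original pipeline: the function name `parse_codings` and the validation rule (keep only records carrying all four dimensions) are assumptions; the field names are taken directly from the response shown.

```python
import json

# The four coding dimensions expected in each record (from the output above).
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(text):
    """Parse a raw LLM response (a JSON array of coded comments) and
    keep only records that carry an id plus all four coding dimensions."""
    records = json.loads(text)
    return [
        r for r in records
        if "id" in r and all(dim in r for dim in DIMENSIONS)
    ]

# Hypothetical one-record response in the same shape as the raw output.
raw = '''[
  {"id": "ytc_UghKkbKM7RfTSngCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]'''

codings = parse_codings(raw)
```

Filtering rather than raising on malformed records keeps one bad model output from discarding the whole batch; dropped records can be logged and re-queried separately.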