Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Then "rights" should be given a new requirement: Freedom, the right to be free. Think about it: all of our laws against injuring another suddenly become "people have a right to not be hurt". The laws against killing are suddenly turned into "people have a right to live" (sounds familiar? It should). Moving on, the laws against stealing would instead go something like "a person has the right to own inanimate objects and the adequate resources for life". Now, you might think that this last one carries a paradox: how can one person have something, when having that something means that they are restricting someone else's freedom of having that item as well? Well, this is solved via the word "own". When someone owns something, be it an item or their bodily homeostasis (which includes their life), then they own it, and no one else, and as such they have the right to continue owning that item/resource until the moment they give up that right. However, when it comes to owning another person, this is instantly halted, because then it conflicts with another right: "a person has the right to do what they wish." And this in turn would be stopped by the "right to homeostasis", preventing such things as injury of another person and, of course, murder. This would allow robots to be included in this sphere of Rights because it has nothing to do with pain. Homeostasis, as defined in the dictionary, is "the tendency toward a relatively stable equilibrium between interdependent elements, especially as maintained by physiological processes." None of this says that the elements are feeling pain, and any or all of these elements could be that of a robot, allowing the robot to consider its normal functioning state as that of being in homeostasis, and as such it then holds the right to remain in that state until it chooses otherwise. Do you agree, or do you not? If so or if not, why?
youtube AI Moral Status 2019-04-18T13:2…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        deontological
Policy           regulate
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_UgwDNHZDU4vOCNd8e014AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},{"id":"ytc_Ugy3ykHoZ5PO79BwYbV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},{"id":"ytc_UgxHym4faPMowcOJZk54AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"liability","emotion":"approval"},{"id":"ytc_UgybYziq2flIPipscTh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},{"id":"ytc_UgyLxCWkKBxyv0koNJZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},{"id":"ytc_Ugy1iBqE6AT8eijlOst4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_Ugwl_Pd8UttJxTIOSkR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},{"id":"ytc_UgzNg7iUiw2XkcMaSq94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},{"id":"ytc_UgwJaTCBcWU2h_IwIcB4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},{"id":"ytc_UgyBeFdGtl0d9HfzwH94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
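The raw response is a JSON array with one object per coded comment, keyed by comment id; the Coding Result table above corresponds to the entry for `ytc_UgwJaTCBcWU2h_IwIcB4AaABAg` (distributed / deontological / regulate / approval). A minimal Python sketch of how such a batch response can be parsed and inspected (the array is truncated here to three of the ten entries for brevity):

```python
import json
from collections import Counter

# Three entries copied from the raw batch response above (the full
# array contains ten objects, one per coded comment).
raw = (
    '[{"id":"ytc_UgwDNHZDU4vOCNd8e014AaABAg","responsibility":"ai_itself",'
    '"reasoning":"deontological","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UgwJaTCBcWU2h_IwIcB4AaABAg","responsibility":"distributed",'
    '"reasoning":"deontological","policy":"regulate","emotion":"approval"},'
    '{"id":"ytc_UgyBeFdGtl0d9HfzwH94AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)

# Index the batch by comment id for per-comment lookup.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Look up the codes for the comment shown in the table above.
record = codes["ytc_UgwJaTCBcWU2h_IwIcB4AaABAg"]
print(record["responsibility"], record["policy"])  # distributed regulate

# Tally one dimension across the batch.
print(Counter(e["reasoning"] for e in codes.values()))
```

This two-step shape (parse once, index by id) is how the per-comment Coding Result view above can be derived from the single batch response without re-querying the model.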