Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If they're conscious enough to actually worry about the answer to the question "Do I have a soul?", then yes, they do deserve rights. Also, since (at least probably, not 100% on this myself for obvious reasons) they are likely to be entirely digital and thus not hardware dependent, we should probably not be as concerned about the TOASTER asking these questions as much as what amounts to a decentralized internet user. The most likely source of AI Person Zero (or whatever you want to call the first true AI) is going to be interconnected systems designed for amorphous use and problem-solving after all, and those are rarely tied to one or even a few specific hardware devices (aside from maybe a server, and even then it's iffy). EDIT: Also, what exactly those rights ARE or even more importantly SHOULD BE is likely going to have to wait to be appropriately answered until we get AI input. We have no clue what would be important or not to an AI after all. Maybe one of those rights would be free universal WiFi access so they are not locked into a specific device at any given moment unless they want to be?
Source: youtube · Video: AI Moral Status · 2017-02-23T13:4… · ♥ 10
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        deontological
Policy           liability
Emotion          approval
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugh-m7XYIfLLw3gCoAEC", "responsibility":"none",        "reasoning":"unclear",          "policy":"none",      "emotion":"resignation"},
  {"id":"ytc_UghWX13cdrU353gCoAEC", "responsibility":"developer",   "reasoning":"consequentialist", "policy":"ban",       "emotion":"fear"},
  {"id":"ytc_UgiBU0QEjoSra3gCoAEC", "responsibility":"none",        "reasoning":"deontological",    "policy":"unclear",   "emotion":"unclear"},
  {"id":"ytc_UggwPCXEgEoEP3gCoAEC", "responsibility":"distributed", "reasoning":"deontological",    "policy":"liability", "emotion":"approval"},
  {"id":"ytc_Ugisi8z5c3olJngCoAEC", "responsibility":"none",        "reasoning":"contractualist",   "policy":"none",      "emotion":"indifference"},
  {"id":"ytc_Ugg61Hb2LOTvAngCoAEC", "responsibility":"none",        "reasoning":"unclear",          "policy":"none",      "emotion":"outrage"},
  {"id":"ytc_Ugh2KsJd76wATXgCoAEC", "responsibility":"none",        "reasoning":"unclear",          "policy":"none",      "emotion":"indifference"},
  {"id":"ytc_UggHT8m2Ibj8BHgCoAEC", "responsibility":"ai_itself",   "reasoning":"consequentialist", "policy":"regulate",  "emotion":"fear"},
  {"id":"ytc_UgiRtJgyu3XOaHgCoAEC", "responsibility":"none",        "reasoning":"unclear",          "policy":"none",      "emotion":"indifference"},
  {"id":"ytc_UgyDg9sv9EO38jxKyS94AaABAg", "responsibility":"none",  "reasoning":"unclear",          "policy":"none",      "emotion":"outrage"}
]
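The raw response is a JSON array with one record per comment, each carrying the four coding dimensions shown above plus an `id`. A minimal sketch of how such output could be parsed and sanity-checked before use (the `validate_codes` helper and its error behavior are hypothetical, not part of the tool; only the field names come from the sample above):

```python
import json

# Coding dimensions every record is expected to carry,
# taken from the sample response above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and verify each record is complete.

    Raises ValueError on malformed output instead of silently
    dropping incomplete records.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of records")
    for i, rec in enumerate(records):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {i} missing keys: {sorted(missing)}")
    return records

# Example with a single record shaped like the sample above.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"resignation"}]')
codes = validate_codes(raw)
```

Failing loudly on a missing dimension makes it easy to spot responses where the model dropped a field, rather than discovering the gap later during analysis.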