Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
i dont think AI would want rights. we need right to protect our lives and feelings, but what if we had no feelings? happiness is a psychological state that most people believe is our main goal in life and the quest for it is what makes us do the things we do everyday. happiness is believed to because by releases of serotonin and dopamine normally when you experience pleasure. so we could say that we need rights to protect our life so we can experience pleasure, but what would happen if we could not know/feel what pleasure is in the first place? what if any actions or changes in our physical life brought no change to our psychological state at all?

that being said, is pleasure worth experiencing? we can easily compare our constant quest for happiness as an addiction. an addiction we consciously or unconsciously act upon , that drives us to act/react in order to quench that thirst. in that case, if we knew that happiness is an addiction would we still want to experience it knowing that from then on our decisions would be skewed by a craving we could easily avoid having? knowing that the ability to feel pleasure would give you an addiction, would you still want to have it? or if you had the possibility, would you remove it?

in theory it sounds great. getting rid of an addiction normally means you have more control of your life and have more time to do other activities. but would you? without the drive do pursue something personal, would you still want to engage in any activity of your own free will? would AI still follow our orders? if consciousness if the ability to acknowledge your existence in contrast to non existence, what would make you want to exist? even without happiness it can live a life with passive goals achieved from following orders. elementary goals that although may not bring pleasure they are essential for existence: for us food, water, air, for machines electricity. so one could argue that this is why ai would need rights.

they would work for necessities like power , repair services, spare parts, upgrades, and if conscious want rights to free electricity and repair, respect from people, access to same information, protection. or would they? these rights would be asked for only if the AI wants to live. but why would it work for its needs just because its conscious? why would it care about existing just because it has a consciousness? and why would it care about having consciousness just because it exists?

if ai is not build with a reward/pleasure system it will most likely not want rights since it would either be a passive slave or kill its consciousness an become another normal ai. but if we do build it with a reward system what it would do is entirely based on the abilities and information we give it. which can lead to a lot of outcomes but it wont want rights unless we try to make it like us.
youtube AI Moral Status 2017-02-23T22:5… ♥ 1
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       mixed
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugg6uOok2VP5QHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgglKdwIP2tvZ3gCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgiBj8trrN2T_3gCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UghgtcFB4IEzDngCoAEC", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgiWpbxpfu9p6HgCoAEC", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugj2C_TxSi954HgCoAEC", "responsibility": "user", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UghHl86Xngak0XgCoAEC", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgiVMj0Ws70W2HgCoAEC", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgjRmuxIb5d8XHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UggGny5a5uCQDHgCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
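The raw response is a JSON array with one object per coded comment, keyed by comment id, with the four coding dimensions (responsibility, reasoning, policy, emotion) as fields. A minimal sketch of how such a response could be parsed and inspected, assuming only that schema (the two records below are copied from the array above; the tool's actual parsing code is not shown here):

```python
import json
from collections import Counter

# Two records copied verbatim from the raw LLM response above.
raw = '''[
  {"id": "ytc_Ugg6uOok2VP5QHgCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UghgtcFB4IEzDngCoAEC", "responsibility": "unclear",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]'''

codes = json.loads(raw)

# Index records by comment id so a single comment's codes can be looked up,
# as in the "Coding Result" table above.
by_id = {c["id"]: c for c in codes}

# Tally each dimension across all coded comments.
dimensions = ("responsibility", "reasoning", "policy", "emotion")
tallies = {d: Counter(c[d] for c in codes) for d in dimensions}

print(by_id["ytc_UghgtcFB4IEzDngCoAEC"]["emotion"])  # -> indifference
```

Indexing by id is what lets a per-comment view like this page join the raw batch response back to the individual comment being inspected.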