Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
but here's the question. why would any human-like AI be put in anything other than a robo-friend. a toaster has one use, to toast bread. to attempt to allow it rights based on the fact that a futuristic toaster can make different kinds of toast based on predictions rather than user input is absurd. tools are the backbone of what makes us human. should we be careful to not offend our clothes before we put them on? should we ask permission to take a sip from our coffee cups? these are more extreme examples, but what is a machine other than a tool? it's created to achieve a goal. like you said, if there is no injustice programmed, there is no need for rights. the miner robot doesn't need to be motivated to work by simulated torture, it can be motivated to work by it's programming. Machines do not deserve rights because that would be a major step back in technological advances. this all seems very much what an extremist liberal would say, and I'm fearful they try and push this. thankfully if need-be, I can program an AI that expresses love at anything happening to it, so that I can show that any discomfort, and injustice, is only felt by a third party.
YouTube | AI Moral Status | 2017-02-24T15:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_UgivVbsw4Xna5HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugj_ypQQgCMlY3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgimjZIms5XFuHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgjyZ86QaI4isngCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugj_RptSNZh00XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgjzbQdL9YKm8HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UggfrijmLIHsQ3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugi7_U1vyGWcxngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgiBd7HdCU2ShHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UggTp276p_PxP3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
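A likely reason the coding result above shows every dimension as "unclear" is that the raw response did not parse: the array originally terminated with ")" instead of "]", so a strict json.loads would reject the whole batch. The sketch below is a minimal, hedged illustration of how such a response could be parsed with a fallback repair for a mangled closing bracket; the function name and repair heuristic are assumptions for illustration, not part of any specific pipeline.

```python
import json

def parse_codings(raw: str) -> list[dict]:
    """Parse an LLM coding response; tolerate a mangled closing bracket."""
    raw = raw.strip()
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        # Heuristic repair (an assumption for this sketch): if the array
        # opened with "[" but the final "]" was garbled, truncate at the
        # last complete object and re-close the array.
        if raw.startswith("[") and not raw.endswith("]"):
            return json.loads(raw[: raw.rfind("}") + 1] + "]")
        raise

# Abbreviated example mirroring the malformed response above
# (note the stray ")" where "]" belongs):
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"})')
records = parse_codings(raw)
print(records[0]["emotion"])  # indifference
```

With a repair step like this, the per-comment codes in the raw response could be matched back to each comment by "id" instead of falling through to "unclear".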