Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Robot rights would be necessary if they can feel, but why would we make robots that can feel? The answer: self-sustainability of robots. Humans have rights based on their feelings, which they have in order to "ensure" survival. Robot survival could work in a similar way. All thoughts and "consciousness" of a robot are stored in some sort of database. This database is stored on a piece of hardware in the robot. In order to protect this hardware the robot will need to be programmed to do so. It will need to be able to detect danger that might destroy it (comparable to human pain and instinct), it will need to know how to repair itself in case of damage to any components that aren't vital 100% of the time, and it will need to store and gain energy (source is variable). In order to give a robot these necessities it will need to be able to plan ahead to ensure it always has these, thus recreating some sort of consciousness.

But why would we want robots to be self-sustainable? A: because it's really cool, B: if humans don't live forever, our living creations enduring might be the next best thing, especially if they can reproduce, C: robots will be able to go where humans can't, thus being highly efficient for exploration if they see any use in aiding us.

But why would they need to protect themselves if their consciousness can just be uploaded from their body and placed in a new one? Because not every area has a wifi connection. Think of caves, dense forests or outer space. In their experience civilized places will be safe-zones where you can't die but when you step outside your "soul" will be eternally lost upon death.

Hope this comments interests anyone. Huge fan of the channel. ^_^
youtube AI Moral Status 2017-02-23T19:5… ♥ 1
Coding Result
Dimension: Value
Responsibility: unclear
Reasoning: unclear
Policy: unclear
Emotion: unclear
Coded at: 2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_UgjpHbD1cb_bGHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgiwpEgnkVIjz3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
 {"id":"ytc_Ugj3v0gqenbpS3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgiMAV2WUQbo3HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgjPO1aWk3kRM3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgiaWn-BMIFxdHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugjesjn2d2Is3XgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgjGnJ_vguQsu3gCoAEC","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"unclear"},
 {"id":"ytc_Uggz-8DSC64i2XgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"indifference"},
 {"id":"ytc_UghST1ICt0Ozk3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}]
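A raw response like the one above can be validated before it feeds a Coding Result table. The sketch below, in Python, parses the JSON and drops any record whose values fall outside the dimension vocabularies. The allowed-value sets are assumptions inferred from the values visible in this export (the actual codebook may include more codes), and the function name `parse_codings` is illustrative:

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# values that appear in this export; the real codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "developer", "government", "unclear"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate", "liability", "none"},
    "emotion": {"indifference", "approval", "mixed", "unclear", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only records whose
    dimension values are in the allowed vocabularies."""
    records = json.loads(raw)  # raises ValueError on malformed JSON,
                               # e.g. a ')' where ']' was expected
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

raw = ('[{"id":"ytc_UgjpHbD1cb_bGHgCoAEC","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
print(parse_codings(raw)[0]["emotion"])  # indifference
```

Failing loudly on malformed JSON (rather than silently skipping the batch) makes transcription errors like a stray closing parenthesis surface at coding time instead of in downstream aggregates.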