Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
+Bolby Ballinger, robots might not appear to have limits, but they do. We have biological limitations, they have energy and space constraints. They also have reliability issues and the inability to repair themselves. Also, I do not believe it will be upset at not having received payment in the same way a slave is not upset that they aren't being paid (they probably care more about their freedom). At this point, it's impossible to say what an AI will care about as we can't anticipate which AI will develop and how. For example, you can program a set of functions to learn things such as how to walk with "falling = bad" and "up=good". Eventually the robot will teach itself to walk through trial and error. There are videos of this on YouTube already. But this is a very limited structure to begin with. Similarly, cats and dogs have very advanced intelligence. However, they don't seek money, revenge on their masters, or any of that stuff. So it is possible, within the right framework, that an AI could develop that would not have a negative connotation.
youtube AI Moral Status 2016-11-05T04:4…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_Ugico5LpRpGANngCoAEC.8L1QFU5Bk6v8L3pUG3RhjH","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugj-Pbakn7exB3gCoAEC.8KzSQwZh2bJ8LK_hlDTWNP","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgiwLipMoGnvzHgCoAEC.8KvqafHEuQZ8KyfxtleBu2","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytr_UggGWRbW-RsxW3gCoAEC.8KuuexCbPMr8L4mDTg_I6G","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytr_UgiulyL63POSLXgCoAEC.8KuniCG3liy8L67UbJ2zfh","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgiEZUx0Er76jXgCoAEC.8KnkpFrh0lq8KrLhmxuZkP","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytr_UgiEZUx0Er76jXgCoAEC.8KnkpFrh0lq8Kt5YMADtYz","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgiEZUx0Er76jXgCoAEC.8KnkpFrh0lq8Kt6O2q787C","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgiEZUx0Er76jXgCoAEC.8KnkpFrh0lq8Kue1Nma3eI","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgiS952mNV0w2XgCoAEC.8Khyerca2BV8ME9Y3YkNeg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
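A raw response like the one above can be parsed into per-comment coding records and checked for well-formedness before it is stored. The sketch below is a minimal illustration, not the tool's actual pipeline: the `parse_codings` helper is hypothetical, and the allowed-value sets are inferred from this single sample, so the real codebook likely contains more categories.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred
# from the sample response shown above; the full codebook may differ.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "mixed", "outrage", "fear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept if its id looks like a YouTube-reply key
    ("ytr_" prefix, as in the sample) and every coding dimension
    holds a recognized value.
    """
    records = json.loads(raw)  # raises ValueError on malformed JSON
    valid = []
    for rec in records:
        if not rec.get("id", "").startswith("ytr_"):
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Filtering rather than raising on a bad record lets one malformed coding be logged and re-queued without discarding the rest of the batch.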