Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"Robots don't need this "to survive" thing programmed into them because if they were to be broken or destroyed, all you would do is replace them." "Any robot deserving of rights would not be a good robot to force to do labor, so the robots doing labor will not have the capacity for suffering so we don't have to worry about it. " I wouldn't be so sure about that. More often than not, you want your robot to be able to avoid damage and destruction, because you don't always have the luxury of being able to replace or repair it at will. Most obvious example being self-driving cars - typically you want the car to get from A to B reliably and to be able to reuse it afterwards. There difference between "avoiding crashes due to programming" and "avoiding crashes due to fear of feeling pain or failure" is very hazy and possibly non-existent.
youtube AI Moral Status 2017-02-23T16:3…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytr_Ughgv7iY07dgTHgCoAEC.8PKQh0lPF8S8PKbc3nzIAa","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytr_Ughgv7iY07dgTHgCoAEC.8PKQh0lPF8S8PKyy545rVu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytr_Ughgv7iY07dgTHgCoAEC.8PKQh0lPF8S8PL3H-oWz5B","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_UgguDvg9CgsPlXgCoAEC.8PKQg1Pf6gK8PKc_pBnsUA","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytr_UgjA4P2zvsANW3gCoAEC.8PKQ9Hag29m8PKZKqNPj6-","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"}, {"id":"ytr_UggjLJg0B5wF13gCoAEC.8PKQ1GdiHQu8PKTwojH3zq","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}, {"id":"ytr_UggjLJg0B5wF13gCoAEC.8PKQ1GdiHQu8PKU72pei1P","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"approval"}, {"id":"ytr_UggjLJg0B5wF13gCoAEC.8PKQ1GdiHQu8PKUhss0zjQ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytr_UgiF1GtuwNWeLngCoAEC.8PKQ-ggL5F38PKWOWuvUml","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytr_UghhM8kfbs8KNngCoAEC.8PKPW_ZjQHb8PKWuhrMaot","responsibility":"company","reasoning":"unclear","policy":"regulate","emotion":"indifference"} ]