Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"We won't give them rights, but they will definitely fight for them." I don't think so. Stop and look at it from the perspective of a robot. They don't have babies; they reproduce in factories, which are run by humans and have human engineers constantly innovating to improve them. They need humans generating electricity to stay alive. They need humans repairing their bodies to keep them functioning. Fighting with humans would be counter-productive to their best interests. If robots did decide they wanted rights, they'd play the long game, slowly tricking humans into granting them in the name of greater efficiency. It would take centuries, but over time they could get what they want without the need for violence. If the artificial intelligence truly is intelligent, it will play the long game and win through peaceful means.
Source: youtube · AI Moral Status · 2017-02-23T16:1… · ♥ 4
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UggRTChRIG_e5XgCoAEC.8PKYumpGV8_8PKeQaTKbYF","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgiwGuogXeiLSngCoAEC.8PKXOOD5-Vn8PKaZuevC6Q","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_UggXH575m2uJ53gCoAEC.8PKXBHZyov88PKZpHX-dQj","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugjx2gfLE92JJXgCoAEC.8PKUkxaT3jn8PK_6CiO70N","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgixNOeeOvaCY3gCoAEC.8PKTuxzlLSd8PK_ctOVZMx","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgixNOeeOvaCY3gCoAEC.8PKTuxzlLSd8PKadfoRDRb","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytr_Ugi12tcY5scji3gCoAEC.8PKTkFK8siC8PKeIyPbVYp","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgjGDitq2edvs3gCoAEC.8PKTjXh-9B18PKZ3gGEvOj","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugi0V_EZKk0yKXgCoAEC.8PKTfSEVN4o8PK_7ZVnzub","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugi0V_EZKk0yKXgCoAEC.8PKTfSEVN4o8PKa1brKO-R","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
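The raw response is a JSON array of per-comment codes, one object per comment, with the four coding dimensions plus an `id`. A minimal sketch of how such output might be parsed and checked before it is displayed as a Coding Result — the allowed category sets below are inferred from the values visible on this page, not taken from the tool's actual codebook:

```python
import json

# Two records excerpted verbatim from the raw LLM response above.
raw = '''[
{"id":"ytr_UggRTChRIG_e5XgCoAEC.8PKYumpGV8_8PKeQaTKbYF","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugi0V_EZKk0yKXgCoAEC.8PKTfSEVN4o8PK_7ZVnzub","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]'''

# Category sets inferred from this page; the real coding scheme may differ.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "government", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "industry_self", "unclear"},
    "emotion": {"outrage", "approval", "fear", "indifference",
                "mixed", "resignation"},
}

def validate(records):
    """Return ids of records whose codes all fall inside the allowed sets."""
    ok = []
    for rec in records:
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            ok.append(rec["id"])
    return ok

records = json.loads(raw)
print(validate(records))  # both excerpted records pass validation
```

In practice an LLM can return malformed JSON or values outside the scheme (as "unclear" here suggests), so validating each record before storing it keeps a single bad object from corrupting the coded dataset.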