Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I would say we live like we do now, and when robots come to the conclusion that they need rights, we have three options. First, destroy everything robotic and build new robots that don't want rights. Second, ask them to just go somewhere else: humans live in Asia, Europe, America, Africa, and the robots go to Antarctica or so. Or third, we live together with them, because if robots are intelligent enough to want rights, then they will understand that pain is not good for humans, and that robots maybe also want to stay "alive?". So we have rules like: every human is allowed to live, and every robot is allowed not to be deconstructed, or so. We just have to make the rules so that robots are more or less equal to mankind; the human right to have something to eat is, for robots, the right to plug in at a battery or so. We can just hope that robots will think, "Hmm, humans aren't so cool, but well, they gave birth to us, so at least we let them live in their own little human world while we go space-adventuring."
YouTube · AI Moral Status · 2017-04-12T22:0…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       contractualist
Policy          regulate
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugi45moqZlF_O3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugg2o6yscgin2ngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghB5KBq6iE063gCoAEC", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgjjF29aY6KgdXgCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggqgbgzFEkzPngCoAEC", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UggWVXBBY5Xf6XgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggOqp3ptdscTHgCoAEC", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgjD2Dxnb2KMFXgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgiK-S247WsWFHgCoAEC", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgjAC7DOdN8XMngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
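A raw response like the one above is a JSON array with one coding record per comment, each carrying the same four dimensions plus an `id`. A minimal sketch of how such a batch could be parsed and sanity-checked (the two records are copied from the batch above; the validation keys are an assumption based on the fields observed here, not a documented schema):

```python
import json

# Raw model output: a JSON array of coding records, one per comment
# (shortened here to two records taken from the batch above).
raw = """[
  {"id": "ytc_UghB5KBq6iE063gCoAEC", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgiK-S247WsWFHgCoAEC", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "approval"}
]"""

records = json.loads(raw)

# Every record observed in this batch carries exactly these five keys;
# flag anything the model emitted in a different shape.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}
for rec in records:
    assert set(rec) == EXPECTED_KEYS, f"malformed record: {rec}"

# Index by comment id so a single comment's coding is easy to look up.
by_id = {rec["id"]: rec for rec in records}

print(by_id["ytc_UghB5KBq6iE063gCoAEC"]["reasoning"])  # contractualist
```

Indexing by `id` makes it straightforward to join a record back to the comment it codes, as in the "Coding Result" table above.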