Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
renge9909 i see what you mean, but i dont believe it to be the case. the reason we evolved our emotions was reproduction, robots wouldnt reproduce.. our programming is not simple at all, we just tend to generalize saying that we just seek dopamine and serotonin release on the brain while minimizing time and effort. altough it is not the only thing we do, this is the base, but behind that we have a very complex system of interconnecting emotions and feelings wants and not wants. sometimes we want something but cant even make ourselves do it, it is extremely complex. Altough we can, without a doubt, know that smoking is bad, most people who are addicted to smoking dont find it sufficient to go against the body needs and thus they keep smoking. some stop but it is always a mental calculus we do on our heads (utility/cost). what would be the cost the robots would have working for us? none. they would just win happiness. unless we give them sufficient perception to alter their own chemicals of happiness, but still, every robot, unlike us, would link work with a successfull life. and if theyd react as humans, they would just rationalize those emotions they felt saying they do it to help us, and robots who wouldnt work would be considered as outsiders or something xD.
Platform: youtube · Video: AI Moral Status · Posted: 2017-03-02T07:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytr_UgjLYJhHPMsUEHgCoAEC.8PtuUTIQEvX8PwSVSJcIhI","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytr_UgiVwjkV9NIQk3gCoAEC.8PrjymS4JXo8PsaiuqXSnb","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytr_UgiVwjkV9NIQk3gCoAEC.8PrjymS4JXo8PtiwpvP6H8","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytr_Ugi318Nm44FAC3gCoAEC.8PqhuLdkTCr8QUrkX658J7","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"}, {"id":"ytr_UghZs-vx_DY4WngCoAEC.8Ppcp7Bn9RL8Pq9acEMmtW","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytr_UghZs-vx_DY4WngCoAEC.8Ppcp7Bn9RL8PqvLGRI204","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}, {"id":"ytr_Ugh4-V51At2SPngCoAEC.8PouRlAd_Sn8Q_7dKAoD2q","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}, {"id":"ytr_UgggUPfJ9n2pw3gCoAEC.8PnRiYml4Pl8Q03__JyheV","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}, {"id":"ytr_Uggczad5RakHtngCoAEC.8P_l9quOfj68PacjLfPG-g","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytr_UgirsstFTRgqcHgCoAEC.8PWp2MxMP0z8PWqqddcvWJ","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"} ]