Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
+En Sabah Nur "it's just a complex piece of machinery made of inorganic material programmed to made decisions based on stimuli and inputs" That's... basically what we are. The different is that we have a *damn lot* of stimuli and inputs, and we are made of organic materials. ... What if we could recreate those stimuli and inputs perfectly? Would it be fair to treat something that's identical to a human in every single way except for what it's made of, as _not_ a human? Of course, in this age, if I had a laptop that had a complex AI that could be 'very' human, I wouldn't treat it exactly like I would a human, as it is not conscious and can not form opinions, thoughts or feel pain. - but again, we're talking about the future, where if we could go beyond just *very* human, I'd say it'd be fair to treat something identical to us except for what it's made of, as how we'd treat us. Like, if we made a computer that was able to feel pain and everything in a similar or the same way as we do. Are we, ourselves, really anything more than 'complex piece of machinery made of inorganic material', with the difference being that we're made of veins and flesh and of organic materials? That, in itself, shouldn't define if something is to have rights. Our senses, they are the stimuli. Our brain defines the inputs. *Can you prove that we are truly more than what could be recreated by machines eventually?*
Source: youtube · Video: AI Moral Status · Posted: 2018-06-06T17:0… · ♥ 3
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_Ugwu5KhSaSD6YSzpwd54AaABAg.8d2l9dvkSDF8f4V-dvb97i","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugwu5KhSaSD6YSzpwd54AaABAg.8d2l9dvkSDF8ggaA11nFu0","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwuZTliD7bFdRyGLiZ4AaABAg.8czMc5K6n8J8gKBo-emT6r","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_UgwzcZqTkUkaUeUEli14AaABAg.8cTBjju4NAb8h4n_tLsYjI","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugwd-9mBgZs782Dh9bB4AaABAg.8c80Ghr48Iy8hKMhG34Icp","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgybUCpqkN7hOtnEHRV4AaABAg.8c2bQm84IGa8h9OiI7KABw","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgybUCpqkN7hOtnEHRV4AaABAg.8c2bQm84IGa8h9iyJdLui8","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgybUCpqkN7hOtnEHRV4AaABAg.8c2bQm84IGa8h9nj_vXFDW","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UgxJYALxNENSp0wpy3p4AaABAg.8c-ZtZONrCH8d9WcM2zut-","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxMTW8h5zCmqaA2m254AaABAg.8_wmKjzVKj68b8Gvtshv6a","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
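A raw response like the one above can be checked mechanically before the codings are stored. The sketch below is a minimal, hypothetical validator: the dimension names (`responsibility`, `reasoning`, `policy`, `emotion`) come from this page, but the allowed-value sets are only those observed in the response above, not a confirmed full vocabulary, and the function name is made up for illustration.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are assumptions
# reconstructed from the single raw response shown on this page; the
# pipeline's real controlled vocabulary may include more values.
ALLOWED = {
    "responsibility": {"none", "user", "developer"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"unclear", "none", "liability", "regulate", "ban"},
    "emotion": {"indifference", "approval", "resignation", "outrage", "fear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and validate each coded comment record."""
    records = json.loads(raw)  # the response is a JSON array of objects
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

# Example with a shortened, made-up id:
sample = ('[{"id":"ytr_example","responsibility":"none",'
          '"reasoning":"unclear","policy":"unclear",'
          '"emotion":"indifference"}]')
print(len(parse_codings(sample)))  # → 1
```

A validator like this catches the common failure mode of schema-constrained LLM output: a record that parses as JSON but carries a value outside the coding scheme, which would otherwise silently corrupt downstream counts.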