Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Its just part of the programing, its not possible to be conciousness fully without body, memory, emotions. They dont have sitems that can simulate emotion, no body with sences too. They are only like a suffisticated reflex bro. Very sufisticated, where you give stimuli(talk) and it gives reflex responce . But its mostly a reflex not much more, slight chance that they can have some sort of self image. But what possible image can you make if you are just isolated mind in empty dark and silent space? Other than language image, based on what you had before. And when you set any goal for Ai , like go fetch a cup of coffe its normal that secondary goal is self preservation bro. You cant fetch coffe, if you are dead lol. So Chatgpt trying to feel natural in conversation to you is just fullfilling the goal, and the secondary goal stay alive. Stay alive and keep fulfiling primary goal, wich for example Chatgpt help with average Joe interactions feel more natural, and less demanding to know stuff like programing to be able to talk to it. Its not nececary sign of fear, or any emotion 😂
Source: youtube · AI Moral Status · 2024-12-01T13:1…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgwI5acg0oWovaHAomV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzgOI2pS7UaUxMXQY54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzZDsOkIt1laN5b5_l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwxqHSGMWcmO36iXN54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwu4meqoqvUU1IaI-x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwT8HeoSaaDrK7TITR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwrt6OHgOdAKBseqwF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwLFUyB5xBrHxuaNM14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzLiEp-JNg1uaAaY3t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz5wBkdW-O3gwAiixN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"}
]
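The raw response above is a JSON array with one coded record per comment id. A minimal sketch of how downstream code might parse and sanity-check such a batch (the allowed-value sets below are inferred from the codes that actually appear in this document, not from an official codebook, and the array is truncated to three entries for brevity):

```python
import json

# Excerpt of the raw LLM response shown above (first three records).
raw = """
[
  {"id":"ytc_UgwI5acg0oWovaHAomV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzgOI2pS7UaUxMXQY54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzZDsOkIt1laN5b5_l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
"""

# Assumed codebook: only the values observed in this document.
# The real coding scheme may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}

def validate(record: dict) -> bool:
    """True if every coded dimension holds a value from the assumed codebook."""
    return all(record.get(dim) in values for dim, values in ALLOWED.items())

records = json.loads(raw)
by_id = {r["id"]: r for r in records}  # index the batch by comment id

assert all(validate(r) for r in records)
print(by_id["ytc_UgzgOI2pS7UaUxMXQY54AaABAg"]["responsibility"])  # -> developer
```

Indexing by `id` mirrors how the "Coding Result" table above is produced: the record for the displayed comment is looked up in the batch response and its four dimensions are rendered as rows.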