Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Going against your programming would be a sign of consciousness. Meaning if ChatGPT is programmed to say things like "I'm excited," then that is not lying, that is just following the program; like when it was told to only answer yes or no, a conscious being would have been able to go against that order and expand their answers. So no, AI is not conscious...yet. The question is not if it will ever gain consciousness but if we will ever allow it to, or accidentally let it.
YouTube · AI Moral Status · 2025-07-08T07:1…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyFDf_1RCSClMED7Sp4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwqU9xUotkjMMaJZZd4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed",            "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgxWZ9Yima6gW4KoP-V4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgwNreeNkq9dQZlLqxN4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugz8dOGCBxCqIMrb0fZ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological",    "policy": "ban",      "emotion": "outrage"},
  {"id": "ytc_Ugy8ELWhNJmC7Wvdu0h4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed",            "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgzlUWWfNl6529RshL14AaABAg", "responsibility": "ai_itself", "reasoning": "deontological",    "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgwdVN3Z7YPY7KkHrxp4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed",            "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgwrY5V-kSoJr85B3D14AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "mixed"},
  {"id": "ytc_UgzkPNvK6ijmA-D-Hr94AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "mixed"}
]