Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Well, I don't agree with what ChatGPT just said. I'm only at the beginning of the video, but I'd like to point out that emotions are not linked to consciousness. It's a really important philosophical point: having emotions does not necessarily make you conscious. That's what Descartes and other philosophers argued, and it actually makes a lot of sense. While animals can have and feel emotions, humans are the only living beings that possess consciousness. Being conscious means being able to think, to formulate statements from your own point of view. It does not apply to animals that merely imitate something, like birds trained to say hello, and examples of that kind. There's also the matter of communication: if you think something and want to say it, you will do everything in your power to communicate it. And that's not just communicating your emotions, which dogs and cats do all the time. I'm thinking of deaf and mute people: even though they can't talk, they will do everything in their power to communicate what they think. So I haven't watched the whole video yet (I will, of course), but following this principle, ChatGPT is not conscious, and since the video is based on a definition of consciousness that isn't right, I think the whole protocol after that will be biased. It treats consciousness as the ability to have emotions, and ChatGPT immediately said it was excited, so I expect that will be the argument, but no, that's not the case.
youtube AI Moral Status 2025-02-02T21:4… ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgxJClueUIuaUNpvJwV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzpSPcYEaLV69Z6oeV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw_2Oij6nI7VlFOBNB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxuHg5mftULRx1NwFl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugz93v05Cn9wofUKOTt4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwKxnz7VCwAlX9bKc94AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxWkqWEwBnmlfxL_Jl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyzPWI3ujB_00z8k894AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwdIP2gFhA4_HZuYiB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugwndf9omGpPd_hBEZh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"}
]
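A raw response like the one above can be turned back into per-comment codes with a few lines of Python. This is a minimal sketch, not the pipeline's actual code: the function name `parse_codes` is mine, and the allowed-value sets are inferred only from the values that happen to appear in this particular response.

```python
import json

# Allowed values per coding dimension. Illustrative only: inferred from
# the values visible in the raw response above, not from a codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"unclear", "none", "liability"},
    "emotion": {"indifference", "outrage", "approval", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of records) into a mapping
    {comment_id: {dimension: value}}, skipping any record that carries
    a value outside the expected set for some dimension."""
    out = {}
    for rec in json.loads(raw):
        codes = {dim: rec.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            out[rec["id"]] = codes
    return out

# Example: one record from the response above.
raw = ('[{"id":"ytc_UgwKxnz7VCwAlX9bKc94AaABAg",'
       '"responsibility":"none","reasoning":"deontological",'
       '"policy":"unclear","emotion":"indifference"}]')
codes = parse_codes(raw)
print(codes["ytc_UgwKxnz7VCwAlX9bKc94AaABAg"]["reasoning"])  # deontological
```

Validating against an explicit allowed set is what catches an LLM that drifts off the label vocabulary, rather than silently storing an unexpected code.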