Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If we take the "What Is It Like to Be a Bat?" definition of consciousness, then the best we can do is ask ourselves, "What does it feel like to be ChatGPT?" What does it experience? The language models receive some text as input, and give back text as output. Does it "hear" the text? Does it "see" the text in a black screen and then think about it in words? When "thinking" of the output, does it hear it again? How does time pass for this AI? Is it continuous even when it's not working, or is it only when it receives an input? As far as I can tell, there is no computer program that can make a computer conscious, because all that a computer can do is work with bytes. At the end of the day, it's always just a processor shuffling bytes in memory. It doesn't matter how complex your software is, the hardware is still incapable of consciousness. Even if you had a computer that perfectly mimics what a human would do (with cameras for eyes, and microphones instead of ears), we could still affirm that it isn't conscious, because the data that it's processing is still discrete. This is more or less what John Searle was saying in the paper that coined the Chinese Room argument, that you need more than behavior if you want to assert that something is conscious.
Source: youtube · AI Moral Status · 2023-08-24T23:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference

Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_UgyZMUEYPkEMBOu1PLh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgxsexWjYwaKvBWX0KF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"}, {"id":"ytc_Ugztb5F7S8mkBmMTctV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}, {"id":"ytc_Ugy1zofO29LOVjSZHXJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_Ugwia89r2Fd87LS9fLN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugy9B68jrdzL4K8Q0T14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgwUVoPfJVEd972_w4N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgyiOaMGEhKnq7rQAeZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgwVWJQA1dQ6vm3UeuF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"}, {"id":"ytc_Ugxwta0vfDgqfdc3xbx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"} ]