Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I've worked on AI as well, and your views seem awfully short-sighted to me. The technical aspect of writing AI doesn't give you all the answers to the philosophical side of things. We might not understand everything about how our own brains work, but in general, we do understand neurons and neurotransmitters. Now, do we know how these lead to consciousness and self-awareness, the sense of an ego? It is possible that consciousness is entirely emergent from these physical entities. Either that, or consciousness exists in some wholly different Cartesian realm, which is an equally massive claim. To think of it another way, it's less about people thinking AI is bestowed with magical powers that allow it to attain consciousness – which seems to be your interpretation – but rather, that consciousness is less 'magical' than it seems, and could just be the result of a highly, highly complex system of neurons. You might say that ChatGPT is nowhere close to the complexity of a human brain, and of course I'd agree. But now, instead of stopping there, think deeper. Where do we draw the line? If we had a 1:1 digital replica of the human brain (as in theories of [mind uploading](https://en.wikipedia.org/wiki/Mind_uploading)), would that, then, be conscious? If so, where's the threshold between consciousness and non-consciousness?
Source: reddit · Topic: AI Moral Status · Posted: 1676638890.0 (Unix time) · ♥ 5
Coding Result
| Dimension | Value |
| --- | --- |
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-25T08:06:44.921194 |
Raw LLM Response
```json
[
  {"id": "rdc_j8wf290", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_j9lzz62", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "rdc_jazbzhe", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_jccolzx", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "rdc_jdkcidg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
```
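Since the raw LLM response is a JSON array of coded records, looking up the coding for any one comment amounts to parsing the array and matching on its record id. A minimal Python sketch (the `find_coding` helper and the `RAW_RESPONSE` constant are illustrative, not part of the coding tool itself):

```python
import json

# Raw LLM response for this batch, as shown above: a JSON array of records,
# one per coded comment.
RAW_RESPONSE = """[
  {"id":"rdc_j8wf290","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"rdc_j9lzz62","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"rdc_jazbzhe","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"rdc_jccolzx","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"rdc_jdkcidg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]"""

def find_coding(raw_response, record_id):
    """Parse the raw model output and return the record for record_id, or None."""
    for record in json.loads(raw_response):
        if record.get("id") == record_id:
            return record
    return None
```

For the comment above, `find_coding(RAW_RESPONSE, "rdc_j8wf290")` returns the first record, whose `reasoning` dimension is `"mixed"`, matching the coding-result table.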