Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Also, while we haven't worked it out, it seems to me that most of our major theories are in some way guessing that consciousness is likely an emergent phenomena. Most theories are basically a variation of because neurons come together in a special way, this weird thing emerges that wasn't necessarily in the neurons. So, if consciousness is something that emerges out of a complex system, then the arguments against AI consciousness on the basis of how the next token prediction is working between simulated neurons is arguing at the wrong level (sure, yeah it looks like just a statistical process at that lower level). It's like trying to say families don't exist because you don't see it within a single individual, or that geopolitics isn't a thing because where is a country in an individual brain, or that human consciousness doesn't exist because neurons aren't conscious. That's not to say that AI consciousness has already emerged or not. It's just that we should be arguing at the right level of analysis. We need to be asking whether consciousness is possible as an emergent property.
Source: reddit · Thread: AI Moral Status · Posted: 1743829345.0 (Unix epoch) · ♥ 32
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_mljjj2n", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_mlhi5o2", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "rdc_mlhsa3t", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "rdc_mlhv02z", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_mlhpmqk", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
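The raw response covers a whole coding batch: a JSON array in which each element carries a comment id plus the four coded dimensions shown in the table above. A minimal sketch of extracting and validating those dimensions from such a response (the names `parse_batch` and `DIMENSIONS` are illustrative, not part of the tool):

```python
import json

# The four coded dimensions expected in every record (per the table above).
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_batch(raw: str) -> dict:
    """Parse a raw batched coding response into {comment_id: {dimension: value}}.

    Raises ValueError if any record lacks one of the expected dimensions,
    so malformed model output is caught before it reaches the database.
    """
    coded = {}
    for rec in json.loads(raw):
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {missing}")
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

# One record from the batch above, as a smoke test.
raw = ('[{"id":"rdc_mljjj2n","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"indifference"}]')
print(parse_batch(raw)["rdc_mljjj2n"]["emotion"])  # indifference
```

Validating per record rather than trusting the model keeps a single malformed element from silently dropping the rest of the batch.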