Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
All summaries are lossy, but here you go :) We cannot ask *why* we experience without asking *who* is experiencing. A convincing theory of consciousness that hopes to explain **experience** must also explain who is **experiencing**, and how this **experiencer** is put together. Consciousness is the constellation of *past* experiences experiencing the *present*, assimilating it to act and prepare for *future* opportunities. Quite a few implications of this rephrasing of consciousness, but the one that might have the biggest significance for humanity is this: It potentially shows what is needed for AI to become conscious (they are not, right now) and how that even might be inevitable. And if that is true, we better start thinking through all the implications carefully.
Source: reddit · Topic: AI Moral Status · Posted: 1663141784.0 (Unix timestamp) · ♥ 14
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_ioda5zp", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_ioijtq3", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "rdc_iodokdp", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_ioeir86", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "rdc_iodxez9", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
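Since the raw response is a JSON array of coded records, the coding for any single comment can be looked up by its `id` with standard JSON tooling. A minimal sketch, using two records from the response above; the `find_record` helper is hypothetical, not part of the coding pipeline itself:

```python
import json

# Excerpt of a raw LLM response: one coded record per comment id.
raw = """[
  {"id": "rdc_ioda5zp", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_ioijtq3", "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "outrage"}
]"""

def find_record(raw_response, comment_id):
    """Parse the raw response and return the coding dict for one comment id,
    or None if the model did not emit a record for that id."""
    records = json.loads(raw_response)
    return next((r for r in records if r["id"] == comment_id), None)

rec = find_record(raw, "rdc_ioda5zp")
print(rec["emotion"])  # indifference
```

Guarding against a missing `id` (the `None` return) matters here because the model is not guaranteed to emit one record per input comment.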