Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's possible, and it shouldn't be surprising or depressing if it's true. There's a view in Neuroscience that the brain has a lot of different areas that are constantly generating signals in a kind of latent language. Some of these areas are responsible for combining these signals and redirecting them to motor outputs, and some area is responsible for combining everything into a conscious sense of experience. But in this model there is no conscious entity, just a whole room of unconscious zombies yammering about various topics. The collective behavior simply appears conscious. Now you could model each of these areas with a large language model. We'd have the memory LLM and the "seek food" LLM and the "Make decisions" LLM and the "Seek sex LLM", and they're all wired together by a "Feel like a human" LLM that generates the conscious experience. That might be all we are. And that entity might act just like you or me. But finding this out would be amazing, since it would bring us closer to curing mental illnesses and understanding human suffering.
reddit · AI Moral Status · 1674694116.0 · ♥ 3
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_j5xac3a", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_j5wdifv", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_j5xahme", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "rdc_j5vqkk8", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_j5w3drg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"}
]
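The raw response above is a single JSON array in which each entry carries a comment id plus the four coding dimensions. A minimal sketch of how such a batch response could be parsed back into a per-comment coding result (the helper `coding_for` is hypothetical, not part of any pipeline shown here):

```python
import json

# A raw batch response shaped like the one above: a JSON array whose
# entries each hold an "id" plus the four coding dimensions.
raw = (
    '[{"id":"rdc_j5xac3a","responsibility":"none","reasoning":"unclear",'
    '"policy":"unclear","emotion":"indifference"},'
    '{"id":"rdc_j5xahme","responsibility":"none","reasoning":"unclear",'
    '"policy":"unclear","emotion":"approval"}]'
)

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(records, comment_id):
    """Return the dimension/value pairs coded for one comment id."""
    for rec in records:
        if rec["id"] == comment_id:
            return {dim: rec[dim] for dim in DIMENSIONS}
    raise KeyError(comment_id)

records = json.loads(raw)
print(coding_for(records, "rdc_j5xac3a"))
# {'responsibility': 'none', 'reasoning': 'unclear', 'policy': 'unclear', 'emotion': 'indifference'}
```

Looking up the coded entry by id, rather than by array position, keeps the result stable even if the model returns the records in a different order than the comments were submitted.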