Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
​@TheRatsintheWalls The problem is that we de facto have very small knowledge about how does the human consciousness work. We barely can understand that phenomenon, so simplifying it to just a "neuron that either fires or it doesn't" doesn't describe the reality really well, because the brain is not just a binary meat computer. If it would be, you could (theoretically) create consciousness even from the analog wooden computer. The theory where you assume that the human consciousness is absolutely algorithmic creates a lot of other paradoxes anyway. Instead, I find Penrose's quantum consciousness theory much more probable. He described it more in his book "The Emperor's New Mind". This theory assumes that human consciousness can not be algorithmic, and to "work" it uses quantum phenomena that are happening in our brains (especially in microtubules). That is why I think, that if we'll create a real AI, that can actually think, we will have to use some sort of computer, that is based on quantum information science, but not a conventional one
youtube AI Moral Status 2021-04-22T13:5… ♥ 6
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgzWimeXE844iE4Jvyp4AaABAg.9KYXHMB4Xrz9MRcW1RF4H4", "responsibility":"none", "reasoning":"unclear", "policy":"unclear", "emotion":"indifference"},
  {"id":"ytr_UgzWimeXE844iE4Jvyp4AaABAg.9KYXHMB4Xrz9MSFnbR3c3_", "responsibility":"none", "reasoning":"unclear", "policy":"unclear", "emotion":"indifference"},
  {"id":"ytr_UgzMQ0II6vtFlBModZ94AaABAg.9KLmzoDbv-69LFG5TACHZR", "responsibility":"none", "reasoning":"unclear", "policy":"unclear", "emotion":"indifference"},
  {"id":"ytr_UgzMQ0II6vtFlBModZ94AaABAg.9KLmzoDbv-69LFKXl_-iV-", "responsibility":"ai_itself", "reasoning":"deontological", "policy":"none", "emotion":"approval"},
  {"id":"ytr_UgzMQ0II6vtFlBModZ94AaABAg.9KLmzoDbv-69M85Ubjwu14", "responsibility":"ai_itself", "reasoning":"deontological", "policy":"unclear", "emotion":"approval"},
  {"id":"ytr_Ugwj362hz3hdF_vKqQN4AaABAg.9K6EGjJIiGu9KzZKMJ7QbQ", "responsibility":"none", "reasoning":"deontological", "policy":"none", "emotion":"indifference"},
  {"id":"ytr_Ugwj362hz3hdF_vKqQN4AaABAg.9K6EGjJIiGu9LF50aWjru4", "responsibility":"none", "reasoning":"consequentialist", "policy":"unclear", "emotion":"indifference"},
  {"id":"ytr_Ugwj362hz3hdF_vKqQN4AaABAg.9K6EGjJIiGu9LFGtiAqvee", "responsibility":"none", "reasoning":"deontological", "policy":"none", "emotion":"indifference"},
  {"id":"ytr_UgxcfW4fyyasqr0L2kB4AaABAg.9JmYvajVNST9LF9YwRlx8k", "responsibility":"none", "reasoning":"unclear", "policy":"unclear", "emotion":"indifference"},
  {"id":"ytr_Ugyc7iVFUnlvOXBGU6l4AaABAg.9Jg5FXNq8z19ag4QJDbwDv", "responsibility":"none", "reasoning":"consequentialist", "policy":"unclear", "emotion":"fear"}
]
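A raw response like the one above can be parsed into per-comment coding records and sanity-checked before use. The sketch below is a minimal illustration, not the pipeline's actual code; the allowed label sets are inferred only from the values visible in this log (the full codebook may define more categories), and the `flags` field is a hypothetical addition for marking out-of-vocabulary values.

```python
import json

# Label sets observed in this log for each coded dimension.
# Assumption: the real codebook may include categories not seen here.
ALLOWED = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "unclear"},
    "emotion": {"indifference", "approval", "fear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coding objects) and
    flag any dimension whose value falls outside the observed labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                # Record which dimensions need manual review.
                rec.setdefault("flags", []).append(dim)
    return records

raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
for rec in parse_codings(raw):
    print(rec["id"], rec.get("flags", "ok"))
```

In practice the `id` values (e.g. `ytr_UgzWimeXE844iE4Jvyp4AaABAg.…`) would be joined back to the original comments so each coding can be inspected next to its source text, as this page does.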