Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think reality is an illusion. The reason I think that is because subjective experience can't be explained in terms of non-subjective experience. You can't do reductionism with it. You can only have 1 mind. 0.5 minds don't make sense. But you can have 0.5 brains. You can never point to a single neuron or a single object and say "here is the consciousness". A mind is a 0-dimensional point in time and space. But no such thing exists in our universe that displays intelligence. Which is why I think brains are not the reason behind consciousness.

I think that our minds create reality. When we look at each other, we see each other's brains and bodies. But we are just seeing the gadgets of the other person. We use physical bodies when interacting with the physical world. Even if a person's soul was at the same place as their body, we couldn’t observe their soul because our bodies don’t interact with other people’s souls. Though the psychic staring effect suggests that there may be some spillover.

As for whether AI is conscious: we should make a double-slit experiment. If the AI collapses the wave into a particle, we can say that there is an observer there, therefore it is conscious. The reason why I think unobserved stuff exists in waves is because of Taylor polynomials. When you’re not looking into something, it doesn’t exist. Only you exist; the stuff around you only exists to the degree that it affects you. You can take a derivative at your current location and predict what is happening in another part of the universe. But the other parts of the universe can’t collapse the wave function, because every collapse is a decision and only minds can make decisions. Minds that have brains have great control over reality. You can change a single neuron's signal and make Napoleon conquer Europe. If you can only change the path of a single atom in the air, you can't do anything.
YouTube · AI Moral Status · 2025-04-06T23:2…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        mixed
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugz3DNwDbJ3Hvw25S7p4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxkYacp5TeY7HiEXeh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzK7DkBvD9HT6a8RvB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwPYhnO4vBkNyPTp1B4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzqA70FX9le0VnRGz94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwDo0b8uPcUUhqGQ_l4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyHCus9geBRZapsM154AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyVj3hPAqSZUYXgw_J4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwZ1OO2Vnu9cuA_AjR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugyekf2NV9bau8DqOLR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
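When inspecting a raw response like the one above, it can help to check each record programmatically against the coding schema. The sketch below is a hypothetical validator, not part of the tool itself; the allowed-value sets are inferred only from the values visible in this export and may be incomplete.

```python
import json

# Allowed values per coding dimension, inferred from this export
# (likely incomplete -- extend as the real codebook dictates).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "none", "unclear"},
    "reasoning": {"mixed", "consequentialist", "deontological"},
    "policy": {"unclear", "none"},
    "emotion": {"approval", "resignation", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM response and return the ids of schema-valid records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # every record must carry the comment id
        # keep only records whose every dimension holds an allowed value
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec["id"])
    return valid

sample = '[{"id":"ytc_Ugz3DNwDbJ3Hvw25S7p4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"}]'
print(validate_batch(sample))  # → ['ytc_Ugz3DNwDbJ3Hvw25S7p4AaABAg']
```

Running this over the full batch would flag any record where the model drifted outside the codebook (e.g. an unexpected emotion label) before the codes are tallied.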