Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I agree a computer will NEVER gain consciousness, because it's following an algorithm. A computer could emulate consciousness, but that is not the same obviously as a being having consciousness. Consciousness as come about in living beings from organic evolution. Nature or God didn't write programming for us to have consciousness. It gradually built we became more aware over millions of years. The human brain does the best of a bad job, in the way the human brain have evolved over millions of years. Consciousness is awareness turned on it itself, so we are aware that we are aware. So to emulate such a process the computer programming would involve a while loop. WHILE awake be conscious. What has not been considered yet in the West, is firstly, the human brain as quantum features which allow us to have precognitive visions, or precognitive feelings , such as knowing beforehand someone close will contact us. This is before we get to understanding the soul.
youtube AI Moral Status 2025-07-02T21:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxvWZQwC15w7ssOLiF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzI_azA5RwTBwO6f-t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy7tHhfuZ7GqxYWVXh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwdQV72TltEFkramqh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyi1eERQeZ2WBZzYtt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw3vSZvCRBN3UImvp94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwXupdi0pVh_K0MsYN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzV3QcGZaU4X-0IQDt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwnVUsD54bOEBPLPER4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgygvcndPTywJ1IjU_d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
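The raw response above is a JSON array of per-comment coding records. A minimal sketch of how such a response could be parsed and a single comment's coding looked up by id, assuming the id used below (copied from the last record in the array) is the one assigned to the displayed comment:

```python
import json

# Sample record copied from the raw LLM response shown above; in practice the
# full response string would be parsed instead.
raw = (
    '[{"id":"ytc_UgygvcndPTywJ1IjU_d4AaABAg",'
    '"responsibility":"none","reasoning":"deontological",'
    '"policy":"none","emotion":"mixed"}]'
)

# Parse the array and index records by comment id for O(1) lookup.
records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}

# Look up the coding for one comment id (assumed for illustration).
rec = by_id["ytc_UgygvcndPTywJ1IjU_d4AaABAg"]
print(rec["reasoning"], rec["emotion"])  # deontological mixed
```

Indexing by id also makes it easy to detect comments the model skipped: any id present in the batch but missing from `by_id` was not coded.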