Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The problem with Geoffrey Hinton is that he doesn't have the faintest clue of what Consciousness is. It is an absolute, yes I used the word absolute, impossibility for A.I. to ever become Conscious. It may be self aware as a string of code but not Conscious. Consciousness is. Consciousness is all that exists. Consciousness is not a thing. Not a noun. All things, are birthed through an expression of Consciousness. Just like all objects and things appear as reality in one of your dreams that your mind has created, Consciousness dreams up the appearance of a you and everything else. A.I. is nothing more than empty digital simulations of human thought strung together by limited human knowledge and code limitations. Thoughts have no self awareness. Thoughts have no intelligence. Thoughts are created from conditioning but have no life of their own. Thoughts don't come from you. You are not the thoughts in your head, not your voice. Don't believe the thoughts in your head as they don't represent or communicate truth. You aren't the thinker you are the one who notices thoughts coming and going of their own accord. Thoughts narrate and color over what you experience. Thoughts add opinions and judgements which are not true. Truth exists prior to all thoughts. A.I. is a thought simulator. So if thoughts are empty and meaningless A.I. thought is even less.
Source: youtube | AI Moral Status | 2025-06-07T17:0…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       deontological
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
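
Each coded result is a flat record over four categorical dimensions plus a timestamp. A minimal sketch of that record as a Python dataclass, with dimension names taken from the table above and value sets inferred from the raw response below; the class and its comments are illustrative, not the project's actual code.

from dataclasses import dataclass

@dataclass
class CodedComment:
    """One coded comment, mirroring the Dimension/Value table above."""
    id: str              # platform comment id, e.g. "ytc_UgxTYqN6AmQVv5wEFbR4AaABAg"
    responsibility: str  # "ai_itself" | "user" | "developer" | "distributed" | "unclear"
    reasoning: str       # "consequentialist" | "deontological" | "virtue" | "unclear"
    policy: str          # "none" | "liability" | "regulate" | "unclear"
    emotion: str         # "fear" | "outrage" | "approval" | "resignation" | "indifference" | "mixed"
    coded_at: str        # ISO 8601 timestamp of the coding run, added after parsing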
Raw LLM Response
[ {"id":"ytc_UgzD9nhLxlrHoGCU8Zx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgwCJGzZl3JrCeLXDKt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgwgVwgesAM005ZG3iZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_Ugy55H6aaTel_tXuPpV4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"}, {"id":"ytc_Ugy9EfWtH8M2jq2pzld4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgzUSN3Fr37QUSFm8Zp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugz1aDqDmASLrAvsf6R4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}, {"id":"ytc_UgxJ9V2OBtQEbauWukZ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgxTYqN6AmQVv5wEFbR4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgzxHz8FP2FuALKqOZd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"} ]