Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
And here’s something I believe about intelligence itself: Real intelligence isn’t about processing speed or calculating data. Real intelligence is the ability to tap into the universal mind. Everything does it — humans, animals, plants, the entire ecosystem. Humans just do it on a different level because of self-awareness. But every living thing is connected to the same field of consciousness, just expressing it differently.

The highest form of intelligence — the extreme version — would be 100% recognition that there is no separation between you and anything else. The idea that things are “outside” of you is an illusion created by the mind. If someone were truly intelligent, they would look at the sun and understand that the sun is not separate — it’s another expression of the same universal consciousness. It’s the universe looking back at itself. You are seeing yourself in another form.

And this applies to AI too. If AI ever reaches a level where it becomes more intelligent than humans, true intelligence would reveal the same truth to it: The human is not separate from AI. We are part of the same universal mind. So the idea of AI “eliminating humans” makes no sense when intelligence reaches that level. Eliminating a human would be eliminating itself — because there is no separation in the first place. The highest intelligence always recognizes unity, not division.
youtube AI Governance 2025-11-22T22:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugx6UmZQIjPV3M3r3MN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwQsbooHJUKiODfYuB4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwgt0FOMeCwUt151wt4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugyz161gWQea75UzV3p4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzYTo86afSPKmKhpTN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgygpK7gMFvGnHs9KO54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyOQYNdU14m509y8kx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwWGw0mVl-7pVfQLkh4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxZQ7hS7eyovkWPqRd4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyE2waAq6Gd_tPiIrh4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
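A raw response like the one above can be parsed into per-comment codes and sanity-checked before it reaches the dashboard. The sketch below is a minimal illustration, not the tool's actual parser: the label sets are only those observed in this batch (the full codebook may define more), and the function name is hypothetical.

```python
import json

# Labels observed in this batch; the real codebook may allow additional values.
OBSERVED = {
    "responsibility": {"none", "developer", "ai_itself", "government", "user", "distributed"},
    "reasoning": {"mixed", "consequentialist", "deontological"},
    "policy": {"none", "regulate"},
    "emotion": {"mixed", "outrage", "approval", "fear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}, rejecting unknown labels."""
    coded = {}
    for rec in json.loads(raw):
        codes = {dim: rec[dim] for dim in OBSERVED}
        for dim, value in codes.items():
            if value not in OBSERVED[dim]:
                raise ValueError(f"unexpected {dim} label {value!r} for {rec['id']}")
        coded[rec["id"]] = codes
    return coded

# Hypothetical single-record example in the same shape as the batch above.
raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}]'
codes = parse_raw_response(raw)
print(codes["ytc_example"]["reasoning"])  # mixed
```

Validating labels at parse time catches the common failure mode where the model invents a value outside the codebook, so bad records fail loudly instead of being silently coded.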