Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
If you look at the YouTube comments section, you would think we're centuries away from even finding intelligence to begin uploading. Regarding your question, I asked AI. Here's what it said.

The idea of uploading human thoughts or consciousness—sometimes called mind uploading or whole brain emulation—has fascinated scientists and storytellers for decades. But here’s the reality: we’re still a long way off. Current estimates from experts range from 50 to 100 years or more, and many argue it might never be fully possible.

Why? Because:

- We don’t yet fully understand how the brain encodes consciousness, memory, or even thought in a digital or transferable way.
- The technology needed to simulate a human brain—every connection, every chemical interaction—is staggeringly complex and requires computing power we don’t yet have.
- Ethical and philosophical questions loom large: even if we could upload a brain, would it still be you, or just a digital twin?

That said, we are making progress in related areas—like brain-computer interfaces (BCIs), memory prosthetics, and even thought-controlled devices. It’s not full-on consciousness uploading, but it’s pointing us in that direction, step by step.
youtube AI Governance 2025-06-22T23:3… ♥ 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgzGBv_-g3XVBfcNbdt4AaABAg.AJg8YAonlQrAJgE8De03-y","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgzGBv_-g3XVBfcNbdt4AaABAg.AJg8YAonlQrAJggEsl2-f0","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgyUmM3yVOiWHJuRwxR4AaABAg.AJg6Qj38JOfAJgh9ZVJ2Bs","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgyL7Aalfce2ZxuDIT94AaABAg.AJg3t9tkg2HAJghL4DqMN4","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgzILXPA3rnGOls7W8t4AaABAg.AJf_8kRdUBTAJg518fZ5fj","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgzILXPA3rnGOls7W8t4AaABAg.AJf_8kRdUBTAJijA041Hsh","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgzP18bPyUBza5379U94AaABAg.AJfMUPg1IznAM86VGjvKtI","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgzrHO1dhI7IZOODKDJ4AaABAg.AJexZUqXfC6AJj0xLHXydn","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_Ugx7AxnPseOW9-W1-I14AaABAg.AJetkU1d5hHAJeueq-wO7w","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_UgyRCwPB2tfPxrgkhep4AaABAg.AJeXAxdhhm8AJiMUvzEqYM","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
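A response like the one above can be parsed and sanity-checked before its values are written into the coding table. Below is a minimal sketch; the allowed values per dimension are inferred only from the values visible in this export (the full codebook may define more categories), and the function name `parse_coding_response` is hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from this export.
# Assumption: the actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"indifference", "approval", "fear", "mixed",
                "outrage", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response; keep only records whose
    values are all in the allowed set for their dimension."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear",'
       '"emotion":"indifference"}]')
print(parse_coding_response(raw)[0]["emotion"])  # indifference
```

Dropping invalid records rather than raising keeps a batch run going when the model occasionally emits an off-codebook label; a stricter pipeline could log or re-prompt instead.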