Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Does anyone stop to think AI is the next major evolution in humanity? A lack of energy is the main constraint in evolution. Then, a breakthrough happens where that obstacle gets overcome, like when meat was consumed, which led to a bigger brain, more calories, and more energy. Disease and other biological roadblocks constrain us now, but having near infinite knowledge at our fingertips and the ability to slow aging and prevent diseases will lead us to a merge with man and machine. Natural selection taking over will cause humans and birth and death as we know it to be a primitive state here soon. Just a dead end like the Neanderthals. Then, metals and bionic materials will be a constraint, which will lead to the next phase of our evolution, consciousness existing as information without matter itself holding us back. Then, it would only make sense that separation would be a roadblock, which would lead to a grand unification of infinite knowledge and energy on mind-bending proportions. And if we end up as some sort of ultimate infinite being of limitless power, what would we do? The only thing you could do is maybe make a simulation of a universe exactly like the one that you started in so you could live on to see your origins from beginning to end, living within everything that is in your own simulation creating a never ending fractal like eternity. ♾️ Any questions??
youtube AI Moral Status 2025-08-10T18:5…
Coding Result
Dimension        Value
---------------  --------------------------
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwjWHhMMZliitTmZwh4AaABAg", "responsibility": "none",     "reasoning": "unclear",          "policy": "none",    "emotion": "indifference"},
  {"id": "ytc_UgyAC2ENnLBFdVF7zHV4AaABAg", "responsibility": "none",     "reasoning": "unclear",          "policy": "none",    "emotion": "approval"},
  {"id": "ytc_UgzH9JxZVeOdZJkrh654AaABAg", "responsibility": "ai_itself", "reasoning": "deontological",   "policy": "ban",     "emotion": "outrage"},
  {"id": "ytc_UgxTEW-A8NkfAXJag614AaABAg", "responsibility": "none",     "reasoning": "consequentialist", "policy": "none",    "emotion": "approval"},
  {"id": "ytc_Ugy3jkHZZeOTBP8TW6B4AaABAg", "responsibility": "none",     "reasoning": "unclear",          "policy": "none",    "emotion": "indifference"},
  {"id": "ytc_UgzLDZNjYZIjIme2iH54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy50oenwsDnVkYGDcV4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",          "policy": "none",    "emotion": "fear"},
  {"id": "ytc_UgwGYrPGy21zNhru-R14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugyjj-HnNgPhfG25SOp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwdGyoWloJBupS_ei54AaABAg", "responsibility": "ai_itself", "reasoning": "deontological",    "policy": "none",    "emotion": "fear"}
]
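The raw response is a JSON array, so inspecting the coding for any single comment is a matter of parsing it and indexing by `id`. A minimal Python sketch (using a truncated two-entry copy of the array above; the full array works the same way):

```python
import json

# Two entries copied from the raw LLM response above, for illustration.
raw = '''[
  {"id": "ytc_UgxTEW-A8NkfAXJag614AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwjWHhMMZliitTmZwh4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]'''

records = json.loads(raw)
by_id = {r["id"]: r for r in records}  # index codings by comment id

# Look up the coding for the comment shown above.
coding = by_id["ytc_UgxTEW-A8NkfAXJag614AaABAg"]
print(coding["reasoning"], coding["emotion"])  # consequentialist approval
```

This assumes the model always returns a well-formed JSON array; in practice a `json.JSONDecodeError` handler is worth adding, since raw LLM output can include stray text around the array.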