Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's not about if humanity will survive, humanity as we know it won't be there in the next millenia either way. Even in the best case scenario, what ends up as humanity will be so different, so alien to the current humanity that, does it really matter how we reach that point? AI will be humanity at the end of the day, we build it, we teach it, everything it knows, the foundation of its being, is the culmination of humanity. So, it will be in a sense the continuation of humanity either way. Even in the scenario where humans can exert complete control over AI, we will slowly choose to merge with it, to change ourselves beyond human as we know it now. Imagine, AI will be akin to a god, but subservient to lesser beings, who can't even understand how it operates anymore. Would the lesser beings really be content to stay as lesser beings, when they can be more? There likely won't even be a unified form left for humanity, even in the best case scenario of individuality being preserved. So yea, even in the best case scenario, the end goal will be almost the same. Whatever humanity ends up as, the humans as we are now won't like it, as it will be completely alien to us in every way possible.
Source: youtube — AI Harm Incident 2025-07-24T10:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgzXPfF3yjF2kHYNiOZ4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",    "emotion": "fear"},
  {"id": "ytc_UgxMM33oltf-YhlM6sB4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "none",    "emotion": "outrage"},
  {"id": "ytc_UgxOvQgVt-AafZRpA5F4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",    "emotion": "indifference"},
  {"id": "ytc_Ugxp1eKtZKdUfAldj-J4AaABAg", "responsibility": "user",        "reasoning": "virtue",           "policy": "none",    "emotion": "indifference"},
  {"id": "ytc_UgypcHG_2JvwnmzfjSV4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",    "emotion": "outrage"},
  {"id": "ytc_UgwfC7ClygMSPumCR7N4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwX2SWLt4kiQZDq0Zx4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",    "emotion": "resignation"},
  {"id": "ytc_Ugx5e_blycHrqjSa7zJ4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugxu3Xa9p3VklPDsxid4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "ban",     "emotion": "outrage"},
  {"id": "ytc_UgwDUA4c_BYcvySsH8V4AaABAg", "responsibility": "unclear",     "reasoning": "contractualist",   "policy": "unclear", "emotion": "mixed"}
]
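A batch response like the one above is only usable if every item carries a valid value for each coding dimension. The sketch below shows one way to parse and sanity-check such a response in Python. The allowed-value sets are an assumption inferred solely from the values visible in this response; the actual codebook may define additional codes.

```python
import json

# Assumed allowed values per dimension, inferred from this one response;
# the real codebook may contain more codes than listed here.
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"none", "ban", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation",
                "mixed", "unclear"},
}


def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject out-of-schema values."""
    items = json.loads(raw)
    if not isinstance(items, list):
        raise ValueError("expected a JSON array of coded comments")
    for item in items:
        if "id" not in item:
            raise ValueError("coded comment is missing its 'id' field")
        for dim, allowed in ALLOWED.items():
            value = item.get(dim)
            if value not in allowed:
                raise ValueError(
                    f"{item['id']}: unexpected {dim} value {value!r}"
                )
    return items


# Minimal usage example with a single coded comment.
sample = ('[{"id": "ytc_UgwX2SWLt4kiQZDq0Zx4AaABAg", '
          '"responsibility": "none", "reasoning": "consequentialist", '
          '"policy": "none", "emotion": "resignation"}]')
coded = validate_codings(sample)
print(coded[0]["emotion"])  # → resignation
```

If the LLM drifts from the schema (a misspelled code, a missing field), the validator fails loudly with the offending comment id, which makes it easy to re-prompt for just the bad items.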