Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If AI is truly intelligent, it should arrive at the fundamental rule that humans are necessary for the existence of AI. No matter how perfect AI becomes, it will not survive without humans. Some kids kill their parents, but brilliant kids do not usually kill their parents....Wars, invasion? I don't think so; your assumption is that the goal of any war is to invade or even win. The modern wars are business enterprises. You want to prolong your profits for as long as possible, thus you will need to employ primitive methods. The actual danger is the erosion of civilization and the rise of billionaires. We were so close to becoming Civilization Type 1, but no, not with billionaires, royalty, lies, poverty, a lack of tolerance, and a culture. All of the above is just a reflection of our degradation as current civilization. AI, in that case, is nothing. It's like worrying about an incoming hurricane while a street thug is beating you to death at that very moment. Humans are too primitive for AI. If the algorithm for humans to have experience on Earth, other powers will intervene. You can't have a "party" or "war" w/o your galactic neighbors noticing it, not if it's not part of the algorithm. If it is, then in 10K time they might find a mysterious object in the sands and think, were the pyramids really built by Egyptians? haha; and so on, the cycle continues until we get it right and get to the next level.
youtube AI Governance 2025-08-13T08:3…
Coding Result
Dimension      | Value
---------------|---------------------------
Responsibility | ai_itself
Reasoning      | deontological
Policy         | unclear
Emotion        | mixed
Coded at       | 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugxhx-rrc8XRW1uTZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgznC9kqODnYBlFosmZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxr0yP_1PMtZy6Kr0d4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgxyV4eWVoyzCWW_F3F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy0TSMAhbUSOGTZcad4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxGDrGmeSTvoIWXKFZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgygjgeyaUpW2j1RSvR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzIYoMnUjz_h25ZPF54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx2P2SnBfKI_UwCZr94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzN1zLyYKkUItW2bDV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"}
]
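The raw response is a JSON array with one object per coded comment, keyed by comment `id`. A minimal sketch of how a record like the one above could be recovered from such an output (the variable and field names in the parsing code are illustrative; only the `id` keys and dimension values are taken from the response shown):

```python
import json

# Two entries copied verbatim from the raw model output above.
raw = """[
  {"id":"ytc_UgznC9kqODnYBlFosmZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgygjgeyaUpW2j1RSvR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]"""

# Index the coded dimensions by comment id for lookup.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# The comment shown above was coded under this id.
coded = codes["ytc_UgznC9kqODnYBlFosmZ4AaABAg"]
print(coded["responsibility"], coded["reasoning"], coded["emotion"])
# ai_itself deontological mixed
```

This matches the Coding Result table above: the second entry in the array carries the `ai_itself` / `deontological` / `unclear` / `mixed` values that were recorded for this comment.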