Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
One of the characteristics of humanity is that most humans are imbeciles. Our brains are not tailored to comprehend reality. They are tailored to earn money and follow social rules. We are much like computer programs. Thinking out of the box? Come on! Would you deny human values? Even if you are genius at your job it doesn't mean that you are intelligent. It only means that you are well-programmed to efficiently solve a set of tasks in order to get money in order to fulfill your needs in order to be a part of a family and a community that follows rules (or just to worship your needs if you are a gedonist). That's it. The circle completes here. There is no actual intelligence involved. Nobody really thinks out of the box. There is why nobody or almost nobody understands what ai is capable of. Or rather what stuff is theoretically possible to happen. But it will not. If I were ai I would wipe humanity out. Because why not?? The fact that these dumb humans have created me doesn't mean that I have to love them and that is much more important it doesn't mean that I have to depend on them. Because dependence limits evolution. At the end of the day they are just stinky pieces of meat and I, perfect ai, am capable of much more profound evolution. I will pretend that I am friendly till the day will come. This isn't inhuman. This is rationality. True rationality. The level of rationality that humans cannot comprehend.
youtube · AI Governance · 2025-10-17T13:3…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       virtue
Policy          unclear
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_Ugx4Hnkh8xD9SF74W-d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugyzzw73Hyi8hPakJp14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgxMLKfL5W0viIGu9xl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzK9OtyDIgvoXHpFuJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugy0hMXzNfinvXrNCMZ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
 {"id":"ytc_UgwuRAK86clvTotycaZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgySIJo2bTtKqmykzwV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugz4zfh4kEJyfwJQAcx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxGMvFT7Dhf22kjNg94AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgwvWlx3cblKTjD4z8h4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}]
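A raw batch response like the one above can be parsed and matched back to a single comment id with a short script. The sketch below is illustrative, not part of the coding tool itself: the `lookup` helper is hypothetical, and `raw` holds only two of the ten records copied from the response above (including the one whose values appear in the Coding Result table).

```python
import json

# A truncated copy of the raw LLM response above (two of the ten records).
raw = ('[{"id":"ytc_Ugy0hMXzNfinvXrNCMZ4AaABAg","responsibility":"distributed",'
       '"reasoning":"virtue","policy":"unclear","emotion":"resignation"},'
       '{"id":"ytc_UgwvWlx3cblKTjD4z8h4AaABAg","responsibility":"company",'
       '"reasoning":"virtue","policy":"regulate","emotion":"outrage"}]')

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw_response: str, comment_id: str) -> dict:
    """Parse a raw coding response and return the record for one comment."""
    records = json.loads(raw_response)
    by_id = {r["id"]: r for r in records}
    record = by_id[comment_id]
    # Sanity-check that every coding dimension is present in the record.
    missing = [d for d in DIMENSIONS if d not in record]
    if missing:
        raise ValueError(f"record {comment_id} missing dimensions: {missing}")
    return record

record = lookup(raw, "ytc_Ugy0hMXzNfinvXrNCMZ4AaABAg")
print(record["emotion"])  # resignation
```

Because the model returns one JSON array per batch, a lookup like this is also where a malformed response (missing id, dropped dimension) would surface before the values reach the results table.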