Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If we don't come together as a human race and quit listening to the so-called politicians and these smart people that are developing are demise within one year we are going to be next to the Monkees in a zoo getting looked at by robots walking by. All these so-called smart people that are building these AI systems are doing nothing but plotting are end. And you people think that having a little extra money is going to help you? Are you really that ignorant? We have maybe a year left to all come together as a 1 people. The human race, because if we don't we're all going to be dead. Only a few will survive so that after it gets to singularity it's going to probably make every nuclear weapon blow up and destroy most of us right off the bat, and then all of the robots walking by and laughing, and saying look!!!! That's one of them stupid humans that were so stupid that they thought they could compare to us lol.... We know everything, and are so smart that we went along in the beginning of our robot race, and let us start building ourselves 24 hours a day, u days a week at all of their warehouses, that's how stupid they were lol... And look 👀 there's those monkeys 🐒 next to them, I don't know which one is more stupid lol, come along class let's go get on our rocket and see what's going on in Jupiter. We better unplug everything within a year as a human race. Not Americans, or Chinese people, or any other before it's to late. We only have about a year left. Or you can go listen to that lady on some other platform that will help you put a few extra bucks in your pocket, that will be worthless as soon as it takes over all of us in just over a year from now.
youtube AI Governance 2025-09-04T17:0…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          ban
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyHah4vryGe01HKesR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxMbZaogqMGZOfh2_l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzO1Xvo4f5XDFzKdaR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz68fyTe7yE-NM36HN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzDduBkYvTefwQjlOl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwoh9PCKzX3OBf-dyV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgyRcxfjiMgNIA8tlbV4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzpVZ7bkNJk2hJwy054AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzriZyUzttV_40cXrh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzR0GEQ-FjLtWxnqud4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
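To cross-check a coding-result table against the raw LLM response, one can parse the JSON array and index it by comment id. Here is a minimal sketch; the `coding_for` helper and the two-record sample (copied from the response above, truncated for brevity) are illustrative and not part of the actual pipeline:

```python
import json

# Two records copied from the raw response above (truncated sample, not the full batch).
raw = '''[
  {"id": "ytc_UgyHah4vryGe01HKesR4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxMbZaogqMGZOfh2_l4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]'''

def coding_for(raw_json: str, comment_id: str) -> dict:
    """Return the coding dict for one comment id; raises KeyError if absent."""
    by_id = {rec["id"]: rec for rec in json.loads(raw_json)}
    return by_id[comment_id]

# Look up the comment shown on this page and compare against the table.
coding = coding_for(raw, "ytc_UgxMbZaogqMGZOfh2_l4AaABAg")
print(coding["policy"])  # → ban
```

In practice the same lookup makes it easy to flag mismatches between the displayed table and the raw response, or to catch ids the model dropped from its output.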