Raw LLM Responses

Use this view to inspect the exact model output for any coded comment.

Comment
So the very first country that gets this AI thing really going low on the whole world. Basically they will be person that takes over the entire world because they will be able to do that if you don't have. It's just like nuclear weapons. Like you said, I mean it's like we got to hurry. We got to hurry to be so that China doesn't do it and then because if China does it, they're going to anybody reaches that level. First, we won't have any machines because we won't have any robots. They won't allow us to have robots so you know it's like nobody wins. There's no winning no matter who gets there first. All of humanity is going to lose but then you look at Trump. You know if we had a even a smidge of a smart person in the white house right now this would not be going on. I'm pretty sure I could be wrong but even me okay. I have a dI have a degree in teaching special ed and so forth. I'm retired so I'm not an idiot. You know I went to college for 6 years. I'm good but even I can figure this it's like tech technology race or the person who perfects this from any part of the world if they are going for blood or using this new power to hurt others and every country does every country that way pretty much everybody wants to be number one. I want to be number one. The race is really not about who can come up with the best. You know super intelligence. The race is because it's just like nuclear weapons. The one that has the most or the one that has the biggest or the one that has it first this AI thing you know they're going to rule the world and it's going to be. We will be doing okay. They live in a mansion because you know now the humans clean it in their house cuz they're not going to clean the house. DJI monsters are not going to clean. They know it sucks if they're smart. They're going to know it sucks so they'll hire a human to do it. This is ridiculous l
youtube AI Governance 2025-12-25T13:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  government
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxhPPKZY2KTp88jVPN4AaABAg", "responsibility": "developer",  "reasoning": "deontological",   "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_Ugy8sfFtHfTKqK-gO4h4AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "ban",      "emotion": "fear"},
  {"id": "ytc_Ugzs1TP6JhqHwmMW4gN4AaABAg", "responsibility": "ai_itself",  "reasoning": "mixed",            "policy": "none",     "emotion": "resignation"},
  {"id": "ytc_UgzNobJa9Q2YPODre954AaABAg", "responsibility": "developer",  "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxKSwSo7KesN6UouIJ4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "unclear",  "emotion": "fear"},
  {"id": "ytc_UgzYzV7CHoJr0Vq34YR4AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy10c9rFGpEIWyZn994AaABAg", "responsibility": "unclear",    "reasoning": "mixed",            "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgyJZoZKzRL647WxmZB4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyQkMsvgOPuFd9HUR14AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyxwntR-cQyxP8esnR4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"}
]
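A raw batch response like the one above can be turned into per-comment coding rows by parsing the JSON array and indexing on `id`. The sketch below is a hypothetical helper, not part of the coding tool; the comment id, the `parse_codings` name, and the key set are assumptions based on the fields visible in the response.

```python
import json

# Truncated example payload in the same shape as the raw LLM response above;
# the id "ytc_abc" is a placeholder, not a real comment id.
RAW = '[{"id":"ytc_abc","responsibility":"government",' \
      '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'

# The five fields every coding row appears to carry.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> dict:
    """Parse the JSON array and index codings by comment id, checking keys."""
    out = {}
    for row in json.loads(raw):
        missing = EXPECTED_KEYS - row.keys()
        if missing:
            raise ValueError(f"coding {row.get('id')!r} missing keys: {missing}")
        out[row["id"]] = row
    return out

codings = parse_codings(RAW)
print(codings["ytc_abc"]["policy"])  # → regulate
```

Validating the key set up front catches the common failure mode where the model drops or renames a dimension, instead of letting a `KeyError` surface later during analysis.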