Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
THANK YOU. And you other two commenters are totally unaware lol. Of course it's business. Look at the world around you and people that interact with AI, either they hate it or are mildly curious, few everyday people are excited about it, but companies? Oh fuck yea. AI is smarter, it can't claim IP, it won't sue, it doesn't get hurt, overworked, annoyed, it doesn't have to learn to work with other people, it doesn't have a life outside of whatever it's doing, it's what they crave- unpaid, good labor 24/7. I promise you anyone paying for AI to be developed(except the billions of our money our government is paying these companies in the name of "progress") is thinking about what AI can do for THEM personally, not about what it can do for US as the human race, that should concern you. Peter Thiel envisions technology in general as a "unilateral alternative to politics" i.e. technology (including AI) will decide how things work, and you'll just be along for the ride. Again, is that better for us? Fuck no, but it's better if you're someone at the top that likely won't be subject to whatever is implemented or are too stupid to understand that you would be. They're literally already pushing tons of services that are trying to take the jobs of entire companies not even within companies; HR, movies, art, writing, editing, analytics, design, and it'll creep in more and more if we don't decide we don't fucking need it and shut the shit off, but it has 12+ 0s attached to it in $ and that's all those people see
youtube AI Moral Status 2025-06-05T20:2…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           unclear
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgyGhVVJ-DBt2RcPDit4AaABAg.AIxKNAHQ3dIAIxu_dJn4HP", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgzzVMLE3jrabbB84hh4AaABAg.AIxK9dyfbZMAIxR9ikFakh", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgzzVMLE3jrabbB84hh4AaABAg.AIxK9dyfbZMAIxRrulGhSB", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgyfyEk62hotI59ygxp4AaABAg.AIxJybnVniOAIyctKKx1T3", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgxGJxUlMJYlDyNjQUJ4AaABAg.AIxJVNmGrZkAIyEt8RkbIs", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgyhrSQGBbH1SgOhc-B4AaABAg.AIxIrwWAWejAJZX0gZBx1T", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgwZxDOvQ9E5OQ5svi14AaABAg.AIxIDJvfh55AJ-aA0npjwy", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytr_Ugwi6nCJs16RmoTdltp4AaABAg.AIxIAxILB-zAIxucuR-3V5", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgzOrjyAz-1VraG9ZGh4AaABAg.AIxI4gPOMj0AIxgJCbTt91", "responsibility": "government", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgzOrjyAz-1VraG9ZGh4AaABAg.AIxI4gPOMj0AIzyIhIUGEl", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]
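Since the raw response is a JSON array of per-comment codes, it can be indexed by comment id to recover the codes for any individual comment. A minimal sketch in Python (using two entries copied from the response above; the variable and function names are illustrative, not part of any tool API):

```python
import json

# Raw LLM response: a JSON array of per-comment codes.
# Two entries shown here, copied verbatim from the response above.
raw = """
[
  {"id": "ytr_UgwZxDOvQ9E5OQ5svi14AaABAg.AIxIDJvfh55AJ-aA0npjwy",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "approval"},
  {"id": "ytr_UgzOrjyAz-1VraG9ZGh4AaABAg.AIxI4gPOMj0AIxgJCbTt91",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "unclear", "emotion": "mixed"}
]
"""

# Index the coded rows by comment id for direct lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up the codes assigned to the comment shown above.
row = codes["ytr_UgwZxDOvQ9E5OQ5svi14AaABAg.AIxIDJvfh55AJ-aA0npjwy"]
print(row["responsibility"], row["emotion"])  # company approval
```

This lookup reproduces the Coding Result table for the displayed comment: responsibility "company", reasoning "consequentialist", policy "unclear", emotion "approval".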