Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
*sigh*… I don’t think I made myself clear before, so I’ll make myself clear now… You cannot simply get rid of technology. Sure, you could delay the inevitable by making it more difficult to develop machines that could make art that surpasses human capabilities. But as I said, you’re waisting your time, you’re delaying the inevitable. As someone who makes art, I agree this is a problem. Bluntly it’s a problem that, once it’s started, it cannot be undone, it cannot be stopped, and it will not stop until humanity dies. “How will I get money?”… get a job. There’s so many of them out there right now that I’d be surprised it you didn’t qualify for even one job. And bluntly, it’s not the end of the world if AI art exceeds past human expectations. I hope the best to all of my fellow artists. And that we can learn that some problems cannot be stopped, Even if we began to violently protest and the government tries to do something about it (which they won’t, because our well being is not something they care that much about.) and before you call me an “ai bro.” I’m not, in fact I wish people would stop being lazy little fuck-tards and pick up a pencil, pen, paintbrush, or stylus for once in their useless lives. But they won’t. My dad being one of said “ai bros”. Good day to everyone who reads this, and to the AI-bros, PICK UP A FUCKING PAINTBRUSH!!!
Source: YouTube · Viral AI Reaction · 2025-03-31T21:1…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytc_UgzTc6eawMVe7z1m4ed4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}, {"id":"ytc_UgwGNVdRkjP0MKJj5Y14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}, {"id":"ytc_Ugy1lUPxIVJH5TZO57F4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgxSwYTvoC9-tuR5S454AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}, {"id":"ytc_UgweDA7SNFmOrrIIDGd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxBYRy6ZXRS9ugAQup4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_UgzYscp5jv9N2pmoind4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgwOJJkHYypo9CHjPap4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgwIhvnYO2wynGb0e4l4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_UgxNf9PhTE3jomSis-54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"} ]