Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I did read yours in its entirety. Here’s mine. AI is far more akin to tracing another’s drawing than what you described, a practice that IS heavily frowned upon if you don’t give credit (the same goes for using music without permission and many other things that people DO get rightfully angry about). While the idea that led to its creation (or, in the case of AI, the prompt) IS owned by its creator, the work of others, stitched together or traced over to fulfill that idea, is not. Taking that work without permission is different from drawing inspiration from something. When you take something as inspiration, you add many new elements to it, be it an idea for a scene, a quirk of your art style, or something else entirely. That’s the human element that allows art to evolve over time into completely new styles and concepts. AI doesn’t do that. It’s a complex algorithm designed to look at a prompt and use it as a guide to piece together something that fits whatever patterns it has seen in the gigantic data set it has access to. Anything “new” it adds is a bug of the program, and one that will act in a predictable, mechanical, and repeatable way. It’s an amazing technology that WILL become a tool of the future, but it’s not a replacement for the human process. By the way, what counts as permission? Well, it’s not always just the act of giving credit. Sometimes you have to get it explicitly. Sometimes it’s granted to the whole public at once, such as something being made public domain by its owner. And sometimes it’s simply given in the form of social norms.
The connecting thread is that in ALL of these cases, it is normal for the original publisher to be aware of the practice, to the point where saying you “didn’t know” isn’t seen as a valid excuse, and perhaps more importantly, to be able to say they DON’T WANT THEIR WORK TO BE USED IN THAT WAY, be it through laws, personally responding to an ask, or putting a statement in their bio, and have that be respected by the community, so much so that others will condemn violations of it. This type of permission is CLEAR. INFORMED. CONSENT. Sadly, AI does not often follow those guidelines. Artists are either not asked for consent, or the fact that they are giving it is hidden deep within pages of agreements, and the technology and practice have not been around nearly long enough for any sort of agreed-upon social norm to exist in support of it, hence why people are so angry. Why should people respect something that has barged into their community and respected none of the standard customs or manners that all the other members hold themselves to? I think that this entire thing is just as negative for AI as it is for artists. As I said before, AI is the tool of the future, and there are innumerable innovations that its use will lead to. But there are ethical ways to approach those innovations, including innovations in the world of art. Imagine a company creating a program, trained on its own set of images, that could instantly depict any pose you could imagine so you could have it for reference. All supporting unethical uses of AI does is push those futures farther away, as the public consciousness is made to see that AI has only two aspects: it’s bad for those whose fields it enters, and it’s a cheap way to cut labor. IT CAN BE SO MUCH MORE. But we need incentives to inspire people who don’t see that sort of future and instead just gravitate towards the extremes of “bad” and “good”. If you truly see a future in AI, then treat it right and make sure it grows up with some damn strong morals.
Now, with all that said, I do want to acknowledge that there is a gray area when it comes to using characters and settings from large media properties without their permission. It is something that should be explored, but as a topic separate from the problem of AI art, as there is a massive difference between the two: the power dynamic. Large companies can and DO take action when they believe that something has been appropriated to a harmful or simply unwanted extent. Artists often cannot. Aside from a few, they don’t have the financial means or the time to stand against these companies. In order to win a lawsuit, you have to have the money to hire a lawyer and the time to personally invest in winning that case as well. Time that would normally be spent on their day job and/or art career. The big company can just send a representative. Unless you’re incorporated, you have to go yourself. Imagine if every time a company faced a lawsuit it had to stop EVERYTHING and show up in court. Hell, even to get there you have to know that your stuff is being used without permission. Large companies spend millions of dollars to do exactly that, scrubbing YouTube, Twitter, and every other medium imaginable, and still miss a shit ton of stuff. An artist has to rely on themselves and their community, if they even have one, to spot a closed-source algorithm spitting out something similar enough to be legally recognizable, just to know they’re one of millions being taken advantage of.
Source: YouTube — “Viral AI Reaction” — posted 2025-04-01T23:4… — ♥ 1
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          liability
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgyohEC0pM16SB6qi9l4AaABAg.AALeKg5CVfKAALztUTzeby","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyohEC0pM16SB6qi9l4AaABAg.AALeKg5CVfKAAMKYbv40a8","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgwTBCZCSjP_39Cef2F4AaABAg.AAL91wu4r2WAGOZLv9r5Iy","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytr_UgyPwxIx0Qr9uIamXot4AaABAg.AAKsySTTlOmAAKvWprGSWg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgxmHrSomY-c9BsVWMV4AaABAg.AAK1HxsGKnOAAKB6Akwly6","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyN3cdiApxGfec-7dt4AaABAg.AAK1AFe5rCCAAKC0v2DSOW","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytr_UgzZWd1AR8jOcSIkWi54AaABAg.AAJekRar0PSAAJg5l63iRR","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgzZWd1AR8jOcSIkWi54AaABAg.AAJekRar0PSAAJoqU4c73J","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_UgzZWd1AR8jOcSIkWi54AaABAg.AAJekRar0PSAAJsEwVfVnI","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytr_UgxWbnuxjNKqr89ZU4J4AaABAg.AAJMUp_gRIuAAJNR-BkwMt","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
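The raw response is a JSON array of per-comment code records. A minimal Python sketch of how such a response could be parsed and validated — the dimension names and allowed code values below are inferred from the visible output, not from any documented schema, so treat the codebook as an assumption:

```python
import json

# Allowed values per coding dimension — ASSUMED from the codes visible in the
# raw response above; the real codebook may contain additional values.
CODEBOOK = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"indifference", "outrage", "mixed", "approval", "fear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse the model's JSON array, rejecting records with unknown code values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
    return records

# Hypothetical one-record response in the same shape as the output above.
raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"mixed"}]')
codes = parse_codes(raw)
print(codes[0]["policy"])  # liability
```

Validating against a fixed codebook catches the common failure mode of the model inventing a code outside the scheme, which would otherwise silently corrupt downstream tallies.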