Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This has to be the dumbest, most shallow-thinking set of arguments one could possibly come up with. The only way to arrive at these arguments would be to purposely not give this any thought or to not look for the slightest pushback. Drawing and painting are low-level skills that can, should and will be left behind (90% of our time spent on it at the very least, there are always corner cases and the need to do 'touch the wheel' here and there) in order for humans to keep moving upwards in our creative endeavors. Drawing and painting aren't even close to the most important sub-skills within the "illustration" meta-skill, but they do take an excessive amount of time relative to their importance because of how bad humans are at it. We are in this awkward phase where AI image creation is just starting to get good - you haven't even seen anything yet - it'll get 100x better (no joke, anyone with any visibility into what's coming knows this to a certainty) in the next year - maybe then people will stop fooling themselves into thinking they can compete with it at useless tasks like drawing or painting. The 'copying' is a silly misunderstanding that would be a waste of time to try to explain in depth here, but I would suggest getting educated on how these models work. They are forced to learn, you couldn't possibly fit the dataset into the model (just think about it for a second) - If you ask it to copy an image that is overfit in a database (can only happen with a very small set of images that are duplicated over-and-over with the same captions in the training set like the monalisa or afghan girl) it'll make a very similar image to it, yes, but this doesn't mean what you think it means, and it'll only get harder to do this as the databases start getting de-duplicated and cleaned up. The model is a lot more original than a human artist and pulls from a lot more learning than a person does, so it references individual pieces much less than a human does. 
Regarding copyright - there are no loopholes here. If something is publicly available to be looked at, it can be looked at, used as reference/inspiration and learned from. There are and should not be any laws that require me to pay someone for learning from their image, or deciding to use the color palette they have, or the shape language they used (and likely learned from another piece) to interpret the sunset I'm drawing. This happens every day in every studio. I make more money than 99.99% of the artists I reference when art directing other artists, yet I don't pay anyone I reference, and neither do you. Who should get paid or credited if I use a shot from the latest avatar movie in my moodboard? How recursive should it be? Should I include the catering crews of the 4 movies that inspired james cameron to direct the DP and team to do execute the shot that way? The empathy point I agree with, but the main problem is that the tantrum from the side of the anti-AI crowd (won't even call them the artists because 99.9% of the artists I know, myself included, are in favor of this) has been the most childish, bad faith, and poorly thought out whinefest I've seen in my entire life - it's so hard to extend any charity to someone who starts shitting on everyone else without using more than 5% of their brain. The infinite narcissism and entitlement from this crowd is staggering, and it's hard not to tell people that 'hey, maybe don't get good at washing dishes'.
youtube · Viral AI Reaction · 2022-12-29T14:2… · ♥ 4
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugxtm6XXukcpSQaaCCF4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgyO_tpFN5QMXmOVxMV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgxBaindssEHQ5v-AQ94AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugysx2ETRs2SPAH0dvt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban",       "emotion": "outrage"},
  {"id": "ytc_Ugwo_A257J6YHMP4eBB4AaABAg", "responsibility": "developer", "reasoning": "virtue",           "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgzgkX_yh-o5XN87ewN4AaABAg", "responsibility": "company",   "reasoning": "contractualist",   "policy": "regulate",  "emotion": "approval"},
  {"id": "ytc_UgyFw2goZgkI7o45Hst4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugx0s4QiBw4bx0Zaabt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgxIUF3R0IFsncnlalZ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological",    "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgwOiI16WCH3nI_qxMl4AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"}
]
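The Coding Result table above is presumably derived from this raw response by parsing the JSON array and matching on the comment's id. A minimal sketch of that extraction, assuming the response is valid JSON as shown (the `code_for` helper is hypothetical, not part of any tool; the id and field names are taken verbatim from the response above):

```python
import json

# Raw model output: a JSON array with one coding object per comment.
# Truncated here to the first object for illustration.
raw_response = '''
[
  {"id": "ytc_Ugxtm6XXukcpSQaaCCF4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
'''

def code_for(comment_id: str, raw: str) -> dict:
    """Return the coding dict for one comment id; raises KeyError if absent."""
    rows = {row["id"]: row for row in json.loads(raw)}
    return rows[comment_id]

coding = code_for("ytc_Ugxtm6XXukcpSQaaCCF4AaABAg", raw_response)
print(coding["reasoning"])  # consequentialist
print(coding["emotion"])    # indifference
```

Indexing by id rather than list position guards against the model returning codings in a different order than the input comments.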