Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `rdc_oi0p101`: "The model learned it from us. We say 'great question' to signal we're paying att…"
- `ytc_UgwX8-atW…`: "He's been building AI for over a decade and hasn't stopped making it seem more s…"
- `ytc_UgxYOUQP8…`: "Something I haven't really seen people touch on is the role art plays for the ri…"
- `ytc_UgxGI4GSA…`: "Humans need meaning in life and it is unfortunate, but a lot of people find it i…"
- `ytr_UgwMvwlag…`: "AI is dark is against humanity is all about the money but the money will never w…"
- `rdc_emo0duw`: "Ya'll should watch First Reformed. It's an incredibly powerful commentary on Cl…"
- `ytc_Ugzl_V3Vv…`: "All of these pieces that people made in protest used AI as a reference. The orig…"
- `ytc_Ugy7yESut…`: "Thank you so much for your simplicity of pricelessness that said AI can contribu…"
Comment
Low-insight take on AI. First, they complain about how everybody treats it like a black-and-white issue, and then immediately frame it in black and white (AI for art bad, AI for curing cancer good).
Second, they presume AI is unethical without actually making the case for specifically why. The AI is taking on what people had agreed was a legal and ethical human function. An artist could absolutely, perfectly legally learn from publicly available images, even imitating the style of other artists. Learning is fair use, and styles are not copyrightable. Argyle calls the learning an "infringing act," but a lot of courts are ruling that it's transformative, so it's fair use.
Sanderson's response to this is that "special protections for humans exist," but that's not an argument that training computers to do this is unethical. I mean, special protections for humans exist, I'm a human, so I would like the government to make competing with me illegal. And as a coder, my job will probably be replaced by AI. Does that mean no one should use AI code generation? You mentioned that curing cancer is a good use of AI. But what about job protections for the humans working on curing cancer? Protections for humans exist, so they should have them, too!
Argyle adds, "but it can do much more art than we can!" Again, that's not an argument that doing so is unethical, just that he doesn't like the outcome.
"But!" says Sanderson. "The process of me thinking through it makes it mine!" "There are happy accidents to the human process!" Bully for you. The fact that you enjoy a thing does not obligate people to pay for it, or to avoid finding cheaper ways of getting what they want. What they are arguing, without wanting to say it out loud, is this: "We're worried AI art will be a better product. We would like to keep our jobs despite not being able to compete with the output of AI art, so shut down AI art and give us money." They are narcissistically caught up in "why AI art is bad for artists."
Sure. It's bad for artists. It's also bad for coders and researchers as mentioned above. There ARE strong arguments for supporting human art, but you don't get to any of them by arguing that you enjoy the process of a thing, so people should pay you.
The real question they miss is, "Why is human art better for the reader/viewer?" Most artists don't have an answer to that, which is why they are afraid of AI. See, all this stuff about the work "being mine" or the "happy accidents" on the artist's journey, that's all for the artist. The viewer doesn't care UNLESS they have a relationship with the artist, or the community around the art. Only after there is a relationship do the artist's inspiration, their process, and their uniqueness matter to the reader. The real way to prepare for what's happening is changing your business model from art as a commodity to art as a community.
For example, I read Wind and Truth, and it was awful. I wanted a good reading experience, and it failed to deliver. It was a failed commodity. There was a certain "Sanderson-ness" about it... but the same thing could probably be said of Brandon's bowel movements, and I don't particularly value those. When art is a commodity, all the reader cares about is "Am I getting the aesthetic/emotional experience I wanted?" And if AI can deliver that, so much the better. I would much rather have read a decent AI novel than suffer through Wind and Truth, hoping that maybe its ending would justify all of the awful writing along the way, only to have the author assure me five or so times that the stupidest possible ending was actually genius. It's only when art is a relationship that the reader cares about the process Sanderson underwent while writing it. Presumably, the fans who rated the book highly did so, in part, because the book was symbolic of the end of a journey they undertook with Sanderson and his other super fans, one that included videos, wikis, conventions, and personally signed, limited-edition copies.
If AI can produce truly great work (and AI fiction is awful, AI art is middling currently), the challenge for artists will be switching from art as a commodity to art as a community. This will be hard for artists who are narcissistically concerned about the value of their own process to them. It will be awesome for people who love connecting and building communities.
youtube · 2025-07-06T01:4… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxIaQNTrmHg0d90KbJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyijcHO7KtQ-T9scVd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwDexS4REUkD7P1-954AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxbYPxhC7SCt12Q6rh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz8R_LP5uvDF_pUCPJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzCDz8pm5-snpwb1ul4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxruWq5CInmMLDQNsp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzS9rAUl4aMkYcMOLh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugy61UIXpiiyjX1RPdF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxcapzUEIYK_OQ9ued4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```