Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- 1. The idea that nothing will ever get done is already false by the way. Regulat… (ytr_UgxRl8gXu…)
- They're kicking our butt with agroecology, doctors per capita, and keeping citiz… (rdc_f9fcsl8)
- God his constant jumping to people who don't like AI use wanting them dead is so… (ytc_Ugxpjp-yL…)
- Sir Mr. Godfather of AI, with all due respect to your work. Humans are special a… (ytc_UgyiXRu2_…)
- It would become the cult classic everyone refers to as proof! 😂 Like whenever pe… (ytr_UgwZ8Bbek…)
- Definitely we will need to work on AI as it develops. Kids should be warned tha… (ytr_UgwnXDIr3…)
- We’ve been living with risk for decades. The nuclear bomb and moronic leaders. I… (ytc_UgyzKpcfA…)
- Using AI for reference is literally the dumbest thought I've heard of in YEARS. … (ytc_UgzMbXNLz…)
Comment
Point 1. The images cannot be copies if they are mostly imperfect; calling them copies is contradictory.
Point 2. Theft has a more direct definition. Theft is the act of taking another person's property or services without that person's permission or consent with the intent to deprive the rightful owner of it. If I were to go into the Louvre and steal the physical painting of the Mona Lisa, that would 100% be theft. Me taking a picture, studying the techniques, and then painting my own rendition of it would not be considered theft.
Point 3. You do not need the consent of the artist under fair use, even if the work is copyrighted. Fair use is a legal doctrine that promotes freedom of expression by permitting the unlicensed use of copyright-protected works in certain circumstances. (I'm primarily talking about the US; I'm not familiar with other countries, so I apologize.)
Point 4. The courts already take a subjective view on this topic, so they're not entirely useful.
Point 5. I don't want innovation to be stifled by absurd regulations imposed by people who aren't even familiar with fair use, or by courts that are not objective in their rulings.
Point 6. Ask who stood with the Luddites; you'll find their kinsmen, the people who would be affected by those changes. Not artists.
Point 7. People like what they like. Don't hate the player, hate the game.
As far as I can tell, these are the two concrete demands from artists.
1. You must ask permission and get consent from artists before using their art to train an AI.
2. You must compensate them each time their art is used to generate an image.
I'm going to break down these demands with realistic scenarios.
1. Let's say I'm the creator of an art AI, and you are an artist with a unique style. I ask for your permission, and you say you don't want me to use your art. Okay, fair enough; I'm not legally allowed to. Jack, on the other hand, is an average person using my AI, and he decides to screenshot your artwork and use it to train the AI; he then posts the resulting image. Who is at fault here? Me? Or Jack?
2. Okay, that was a terrible outcome. Let's try again and say that nobody except the owner of the AI can train it on new images. Jack then decides to learn your specific style and nails it. I ask him if he would be willing to let me use his own artwork to train my AI. Now you, the artist, are out of luck, and a very similar art style is being generated. You can't claim your art was stolen, because it was never used in the first place.
3. How do we even compensate the artist? A lot of it is a black box, much like the human mind: there are many inputs that generate an output, but it is never 1:1. How do I compensate an artist when I'm not even sure what percentage of their work is actually used? What if the training set is large and your art ends up being one piece out of five billion artworks? How do I give you 0.0000000002 of any form of profit?
I'm sure there are many other scenarios I can bring up when discussing this, but the only way you could ever prevent any of this, would be to kneecap an entirely new industry. You'd have better luck reversing the industrial revolution or the printing press.
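The share arithmetic in scenario 3 above can be checked directly; a minimal sketch of the naive pro-rata split the comment describes (the profit figure is hypothetical):

```python
# One artwork's naive pro-rata share of a five-billion-work training set.
share = 1 / 5_000_000_000
print(share)  # 2e-10, i.e. 0.0000000002

# Even on a hypothetical $1,000,000 of profit, that share pays out $0.0002.
payout = share * 1_000_000
print(f"${payout:.4f}")  # $0.0002
```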
Source: youtube
Posted: 2023-02-04T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytr_UgxPbZgoSZsPUduQO754AaABAg.9ljrnMSiA8Z9lkSnj9fLiV","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytr_Ugy7pDtq9gro0NBtqbh4AaABAg.9ljlw1Wi-2r9lpTEW7dGEA","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytr_UgxQ7FJFdRgLhRKp2JZ4AaABAg.9ljj_itfuNK9ljkAwkvLGN","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},{"id":"ytr_Ugw9Kd4xZfffJLZXRmR4AaABAg.9ljQCt2Mh3Y9lkTGnMXCfN","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},{"id":"ytr_Ugw9Kd4xZfffJLZXRmR4AaABAg.9ljQCt2Mh3Y9lkWJ9Unhj4","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytr_Ugw9Kd4xZfffJLZXRmR4AaABAg.9ljQCt2Mh3Y9lkzaLXfwjl","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},{"id":"ytr_Ugw-MJfdpdxflWKapBV4AaABAg.9lj-XKb-qbW9ljZWGmyFTw","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},{"id":"ytr_UgytsbYJQlPVyAujimB4AaABAg.9li_w8q2jQn9llfpdomyLN","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},{"id":"ytr_UgwfEmxuFMnlVn8D5EN4AaABAg.9liXkZqA95A9liZoO70BLw","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},{"id":"ytr_UgwfEmxuFMnlVn8D5EN4AaABAg.9liXkZqA95A9limEFN-M8I","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
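A response like the one above is just a JSON array of coding records, one per comment ID. A minimal sketch of turning it into a per-comment lookup (the IDs here are shortened, illustrative stand-ins; only the field names come from the actual response, and `unclear` is used as the fallback because that is the value the response itself uses for unclassifiable comments):

```python
import json

# Illustrative raw LLM response: a JSON array of coding records.
# The IDs are shortened stand-ins for the real ytr_/ytc_ comment IDs.
raw = (
    '[{"id":"ytr_example1","responsibility":"company","reasoning":"deontological",'
    '"policy":"liability","emotion":"outrage"},'
    '{"id":"ytr_example2","responsibility":"none","reasoning":"unclear",'
    '"policy":"unclear","emotion":"indifference"}]'
)

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(response_text: str) -> dict:
    """Map each comment ID to its four coded dimensions."""
    records = json.loads(response_text)
    return {r["id"]: {d: r.get(d, "unclear") for d in DIMENSIONS} for r in records}

codes = index_codes(raw)
print(codes["ytr_example1"]["policy"])  # -> liability
```

Looking up any coded comment is then a plain dictionary access on its ID.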