Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Honestly, AI Image Detector should be used by everyone creating or sharing visua…" (ytc_UgwYivLsC…)
- "Ai can be used both for good and bad. If there is not clear rules set in place t…" (ytc_Ugw1mvAum…)
- "This is real people but they actualy are dead but they make it a robot😰😨😨😰😰😱…" (ytc_Ugx5iN49x…)
- "Yes, it is clear that innovation itself (like production efficiency eliminating …" (rdc_jifraqb)
- "When all this AI stuff started, I remembered that in college (back in the '70) I…" (ytc_UgyGDbUVF…)
- "My perspective is that rights are a natural and necessary consequence of machine…" (ytc_Ugh5Sf2tv…)
- "I might be dying alone but at least I have a dignity to not play pretend with ai…" (ytc_Ugxv6JaC7…)
- "That’s why you do Marshall arts the robot can talk and piss you off but when yo…" (ytc_UgxcEKqx4…)
Comment
A few things (I want to start a fight): copyright protects the owner from theft of their artwork and from works produced to an uncanny likeness (assuming a gain of profit or an otherwise loss of revenue). Copyright doesn't protect against uses where there is no profit gain, and using art for AI training, producing a free-to-use trained model that then gets used for profit by a third party, doesn't violate the law. That said, training a dataset on an artist's style will not give you the same image unless you spin it through img2img. Also, a recent court ruling made MidJourney-produced art uncopyrightable, and that may eventually extend to all latent-diffusion-produced art.
To create this art, a trained dataset and a finely worded prompt are necessary. So long as the user isn't trying to replicate an exact art piece (i.e., img2img), the AI will spit out something that contains a likeness of style to an art piece, but with no similarity to an existing piece. An art style is not copyrightable. Copyright also does not protect against the download of images, or the use of images to train any neural network. Thus, no rights are violated.
Using data scavenged from online sources is not illegal, but it has been a concern for privacy advocates for decades. Artists are late to the party, as it has only now affected them in a way they consider uncouth. By definition, online resources that the public can access are not private, and LAION-5B is not in violation of any law.
Lensa is a business venture by Prisma Inc; Stability AI's resources are research and free to download; latent diffusion itself is a new research method. Every day, new technological research is monetized by capitalistic backers with the financial capability to do so. mRNA is another example of freely published scientific research that for-profit companies then turned into a product, i.e., vaccines. This is a normal thing that happens with research, the only difference being that Stability AI open-sourced their utilities and training data, much like OpenAI did with GPT-2 and Jukebox.
The way AI works, you can't just feed in art by Leonardo da Vinci and get the Mona Lisa when you ask for it. The model takes weights based on features in the art; when given the name Leonardo da Vinci, it will produce art in his style, based on those weights. Given the phrase Mona Lisa, it will produce a portrait in a similar style; to truly replicate it, img2img is needed.
So my issue with these arguments, worded thus: AI training doesn't violate the law, and it isn't a violation of ethics unless the user themselves violates ethics. This isn't an AI problem but a people problem. AI also doesn't prevent creativity.
For photorealistic AI, or even art that resembles an existing art style to a high degree, it still takes significant training and constant prompt revision, turning the AI user into a prompt-smith to get anything of use.
For music AI, it may get trained on copyright-free music, but it can just as easily base its generation on a copyrighted song. Not to mention that anyone with sufficient computing resources can train these AIs.
The solution to your issue would be to disable saving, as OnlyFans does. It technically doesn't prevent download if you have HTML knowledge, but it's a start for the average person. Then specify which use cases are allowed (on the website hosting these works), i.e., you'd need to make the art private to some extent. A paywall is the best way to do this legally, but most artists don't want all their work behind one, so universally disabling right click would help.
Source: YouTube, "Viral AI Reaction", 2023-02-27T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxOu3JTpjebsdriaoF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwDs6KShyhUefK7LF54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyjInT1LDF0ZOuZQgx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw1nBxyK1jOfFSbQf94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyLnWbkEfxYt3Yz9_94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyY-iCAJd5seVIsujd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyubf3LNfQ8vq4Yn9t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz1UCPwzA1zaFCHvdx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyNysLMjqTk9IvQEz94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzoW53rBoDfNnImZtR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
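The raw response above is a JSON array of per-comment codes. A minimal sketch of parsing and validating such a response, indexed by comment ID so individual codes can be looked up; note that `ALLOWED` is a hypothetical vocabulary inferred only from the values visible above, and the real codebook may include other values:

```python
import json

# Allowed values per coding dimension, inferred from the responses
# shown above (assumption -- the actual codebook may differ).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "indifference", "resignation", "fear", "outrage"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response and index it by comment ID,
    raising ValueError on any value outside the allowed vocabulary."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {rec[dim]!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the raw response above.
raw = (
    '[{"id":"ytc_UgyNysLMjqTk9IvQEz94AaABAg","responsibility":"company",'
    '"reasoning":"deontological","policy":"liability","emotion":"indifference"}]'
)
codes = parse_codes(raw)
print(codes["ytc_UgyNysLMjqTk9IvQEz94AaABAg"]["policy"])  # liability
```

Validating at parse time surfaces malformed or off-vocabulary model output immediately, rather than letting bad codes propagate into the coded-result tables.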