Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
are you for real ??? an algorithm to "predict" crime ??? HAHAHAHAHA !!! do…
ytc_UgxPyCpxK…
most of what you said vis-a-vis the job market applies to software development a…
ytc_UgzCI4tA-…
I’ve had 7 different chats of ai trying to rape me, MF I JUST WANTED TO PLAY TRU…
ytc_UgzCYJzch…
> They have the additional advantages of leaving no evidence that they are be…
rdc_cq6o7cd
I instantly knew they were people as soon as i saw the last guy😂 that guy defini…
ytc_UgyFua1O6…
Thanks for the tip, @TheWesleystylos! I'll remember to bring my trusty lightning…
ytr_UgxJ8pzCR…
This is how shit will go as far for America. People will get laid off, the rich …
ytc_Ugy_eZKh0…
I hate to break it to you, but it is well known now that AI failed at the beginn…
ytc_Ugz9aklKi…
Comment
I think the problem is more how AI is used and less how and with what data AI is trained or even the whole concept of AI.
Copyright law and ethics tell us we can't just use other people's art in our own art without permission. But what does "using" actually mean? In music, it could mean I use a sample from some band and build a new song around it, perhaps even a sample everyone would immediately recognize. What using other people's art does not mean is going to a museum, an Instagram page, or wherever you consume other people's art, then going home and making your own art inspired by all the art you've seen before. That's something no artist would have a problem with, and it's how all artists in the history of art have done it, except for the very first artists.
What happens with AI is much more like the latter than the former. The training data is NOT part of the model; it's only used to create the model. It's the model, not the training data, that is used to create AI art, just as other people's art is used to build a new artist's skills by learning from it. AI learns; it doesn't just copy.
Just ask yourself the following question: Would you tell new artists that they are not allowed to get inspired by or learn from your art or do you think other artists should be able to do that?
The best example for the difference between copying and learning is when I gave Dall•E the prompt to give me a man with a mohawk in a leather jacket on a Harley Davidson in the style of Vincent van Gogh. And that was exactly what I got. There aren't any originals from van Gogh where the AI could have copied the individual elements from. The AI actually had to learn how to do this.
I also heard about cases where people were able to reproduce virtually exact copies of original artworks with AI. However, in the study I heard about (for transparency, I think it was done or financed by Google), they only managed to do this (a) in really rare cases, (b) only with knowledge of the training data, and (c) by purposely pushing the AI to produce a copy. That is NOT a realistic scenario.
Leaving that study aside, it is plausible that the creators and users of AI aren't even interested in that happening. They actually want to create something new, so they will actively try to avoid something which is already unlikely to happen in the first place.
However, there's one thing where the problem already CAN (doesn't have to) start with the training of the AI. Artworks of a certain artist can be part of training data, but training a model with the sole purpose of copying that artist's style is something only the artists themselves should be allowed to do, at least as long as the results get published. But when artists do that, AI could change from being a threat to being an artistic tool.
I think it's already a little lazy, though not necessarily problematic, when you copy a certain artist's style by hand. But that at least is good practice. With AI, you don't get the same benefit at all.
I also think there should be transparency about AI art. Just as with so much else, there's competition in art, even when it's not about money but "just" about the limited attention of your Instagram followers. Recognition is a totally legitimate currency for artists. You might win their attention because they respect that you actually created this artwork, but they probably wouldn't give you the same respect if they knew you used AI to make it. So not being transparent about it brings a bias into that competition.
However, you should also not overshoot with your honesty and lose the respect you might actually deserve, because not all AI art is equal. You can type a prompt into a web-based AI, generate 50 versions, and just take the best one, without having more influence than the prompt and the selection. Or you can use a local install of Stable Diffusion with plug-ins like ControlNet that give you back some of the creative control: you load some of your own sketches which the result will be based on, perhaps you also create a completely original artwork to be used as a style reference, and instead of just selecting the final result out of 50 versions, you refine it over and over by slightly changing the prompt and doing inpainting with new prompts and perhaps new sketches for ControlNet, ...
Those are two completely different things which deserve a different level of respect. So yes, be transparent and mark your AI art as AI art, but in your own interest, just like at school in math, show your work. That way, you don't create any unfair bias to other "manual" artists, but you also regain the respect you actually deserve.
youtube
Viral AI Reaction
2023-12-05T00:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugwibk0KWS2WkgOHxHV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzIpmh8QVZmbE0HdIt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyJGRktD_RTEA2UyvZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwX38zQd6i5UKFfd0p4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyuhLw730QWvOnFA7d4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxlSx8Oh20RYf8IlP54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwyCPUeYtOHIgUUBod4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzI44Tekfb4azNu3Xl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwiKSYykZP3r2B8r154AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwekfu-r1eaMDxNvEp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"}
]
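The raw response above is a JSON array with one object per comment ID, each carrying the four coding dimensions. A minimal sketch of how such output might be parsed and sanity-checked before storing, assuming the value sets visible on this page are the allowed categories (the real codebook may define more; `validate_codings` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed values per coding dimension, inferred from the labels visible in
# this output; the actual codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "user", "distributed", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist",
                  "contractualist", "virtue"},
    "policy": {"none", "liability", "regulate", "ban", "industry_self"},
    "emotion": {"indifference", "mixed", "resignation", "outrage",
                "approval", "fear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values fit the schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Skip malformed rows: not an object, or missing the comment ID.
        if not isinstance(row, dict) or "id" not in row:
            continue
        # Keep the row only if every dimension holds an allowed value.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_UgwX38zQd6i5UKFfd0p4AaABAg","responsibility":"distributed",'
       '"reasoning":"contractualist","policy":"regulate","emotion":"mixed"}]')
print(len(validate_codings(raw)))  # → 1
```

Dropping off-schema rows rather than raising keeps a single hallucinated label from discarding an entire batch response.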