Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up by comment ID.
Random samples
- @FaadielgFULL SELF DRIVING means it should drive itself, after adding “supervise… (ytr_UgzVXGsTj…)
- 3:37 What about how they used pics of people with all white backgrounds? Why isn… (ytr_UgyFu9e1O…)
- Now AI, then AGI, then probably Bio AI then boooofff. Then....Nothing to talk ab… (ytc_Ugwbi2yh5…)
- Ahhh 😀 there is nothing better than a morning with a doom and gloom end of the w… (ytc_UgyLaQanF…)
- Lmao how dumb can you be calling yourself "AI artist". You didn't make the AI an… (ytr_UgyBU-n_T…)
- ....to what end? I swear some of you are so obsessed with conspiracy theories. … (ytr_UgxfCX0E4…)
- The ELEPHANT in the room...is not just the simple fact that nobody has a freakin… (ytc_UgyGl-EV0…)
- Could be smarter than us? AI is way smarter than humans, but it's smart enough… (ytc_UgxgsfUz-…)
Comment
The problem with telling people who think like this that their AI is 'making the art for them' is that they'll argue they did more work than a photographer, who just pushes a button and his machine creates an image for him, whereas they had to write and refine prompts and go through picture after picture to get it just right, and sometimes it takes hours and blah blah blah. They'll argue that them making the prompts and going through to find the picture that actually looks like what they want is them 'using the tool', just like a digital artist going through their brushes and selecting 'water colour' is them 'using the tool', and they'll argue that digital art is easier than traditional but it gets accepted, so why not AI just because it's easier still? They'll point out that there's already code effecting the digital art to create the fancy brushes and do the line smoothing. They've got it in their heads that these things are comparable, and when someone is in that mindset you saying 'that's not the case' isn't going to convince them.
But something they *can't* argue is that any AI art generator worth using *has* to have been trained on *mostly* stolen art...art that the artist did not say the program's creators could use. Assets scrapped from all over the internet where images are free to *look* at but not to print out or paste into an etsy shop or claim as your own. And the creators of the AI may not be *distributing* the images they stole, but they are *using* them to train the program. Therefore the art one prompts the computer into generating could not have been made in the way it's made without that theft, and if you want to claim that *that's* allowed, you don't get to complain when someone else uses *your* images to aid in the generation of their own art. If you want to claim that AI art should be protectable and have any kind of leg to stand on, you also need to ensure that the AI you use is only trained on images that the artists specifically and intentionally agreed to being used. Delusional people will still try to twist this argument, but it's much less arguable. Would the AI made with only donated art work as well? Would the image you generated look like that if it hadn't trained on stolen materials? If you said yes you're delusional, and if you said no you're admitting that 'your' art requires the use of images that it did not have permission to use. Are you claiming that materials 'made public' by posting them in places where they can be seen are up for grabs? Then you justify the same thing happening to your own images when you post them. Simple as that.
youtube · Viral AI Reaction · 2024-10-03T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwVnijU9DyJnQHlotx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwjFVVe3pIt12hDGBt4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxkZhuWnVyQXZOZub94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyxiwsJRhf2CU1k-5t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZboY-CMfVW1n6NFR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyUL4fCZqBWLbUCZrR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxcLWuXrXEtB5bfz0x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx3k44qAG4LBAmp6QZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugym7ctcESfTtwzDWXx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyPMb0Ok971SPLJmDd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
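The raw response above is a JSON array of coded records, one per comment, keyed by comment ID. A minimal sketch of how such a response could be parsed and looked up by ID follows; the names `raw_response` and `index_by_id` are hypothetical, and the sample records are copied from the response above (the tool's actual parsing code is not shown here).

```python
import json

# Raw LLM response: a JSON array of coded comments, in the shape shown above.
# Two records are copied here from the response as sample data.
raw_response = """
[
  {"id": "ytc_UgwVnijU9DyJnQHlotx4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugym7ctcESfTtwzDWXx4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

coded = index_by_id(raw_response)
print(coded["ytc_Ugym7ctcESfTtwzDWXx4AaABAg"]["responsibility"])  # -> developer
```

Indexing by ID makes the "look up by comment ID" step a constant-time dictionary access rather than a scan of the array.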