Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugwwv-Yhf…`: My character ai was weird I started a group chat with 3 other characters and the…
- `ytc_UgwUmJRBN…`: I like that the male robot tells the TRUTH as he sees it. Sophia's been reprogra…
- `ytr_UgwwZ897u…`: @renhaiyoutube For sure, I think it will pretty soon get to the point where thos…
- `ytc_Ugx9cb65B…`: Anyone ever met a developer that can produce flawless code consistently? AI in a…
- `ytc_Ugw3_B5st…`: Ask the student to complete a paper using AI. That becomes their baseline. How…
- `ytc_UgzOwdD3J…`: Trains, Planes? Self driving? All manufacturing and movement of goods? Is this…
- `ytc_UgyE5fv-w…`: What a debate that will be it’s scary it’s extremely scary , there would have …
- `ytc_UgzgLanp-…`: The fun part is that most managers of all levels are easier to replace with AI t…
Comment
@TF2Invisibleman „Point 1. The images are not copies if they are mostly imperfect. That is contradictory.”
The article I found disagrees with your claim.
„Me taking a picture, studying the techniques, and then painting my own rendition of it would not be considered theft.”
And this has nothing to do with anything here. These programs do not study anything. They just generate things to fit within patterns. That is it.
Also, making a copy is considered plagiarism. That is a fact, and if the work is copyrighted, it is illegal.
„You do not need the consent of the artist under fair use.”
You don't understand what fair use is, do you? Fair use is not something you can decide for yourself, and it has no specific rules for how it is done. Fair use is a case-by-case exception to copyright granted by the courts.
„Point 6. Ask the Luddites who stood with them, you'll find their kinsmen who would be affected by these changes. Not artists”
You clearly know nothing about the art critiquing automation in that time period, do you?
„Point 7. People like what they like. Don't hate the player, hate the game”
That claim is just an excuse for your moral laziness.
„1. Lets say I'm the creator of an art AI, and you are the artist with a unique style. I ask for your permission and you say that you don't want me to use your art. Okay, fair enough. I'm not legally allowed to. Jack on the other hand, is an average person using my AI and he decides to screenshot your artwork and uses it to train it; he then posts that resulting image. Who is at fault here? Me? Or Jack?”
In this case? Jack.
„2. Okay, that was a terrible outcome. Lets try again and say that nobody can train their own images except for the owner of the AI. Jack then decides to take up learning your specific style and nails it. I ask him if he would be willing to use his own artwork to train my AI. Now you, the artist, are out of luck, and a very similar art style is now being generated. You can't claim your art was stolen because it was never used in the first place.”
Well, yeah, in this case there was no breach of consent, so it should be okay.
„3. How do we even compensate the artist? A lot of it is a sort of black box, much like the human mind is. There are many inputs that generate an output, but it is never 1:1. How do I compensate an artist when I'm not even sure what percentage is actually used? What if there is a large training set and your art ends up being 1 piece out of five billion artworks? How do I give you 0.0000000002 of any form of profit?”
Go talk to the artist, and make a deal.
Source: youtube · 2023-02-04T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytr_UgwfEmxuFMnlVn8D5EN4AaABAg.9liXkZqA95A9litwZAz-bX","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwE6bhxwSa9a9wz4KZ4AaABAg.9liB-lw2LEx9liE897-Fu0","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgwE6bhxwSa9a9wz4KZ4AaABAg.9liB-lw2LEx9libftwx0C-","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugy_B3n7wW95ll3phOB4AaABAg.9li8Vc-zDmV9liEVjomeSP","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugy_B3n7wW95ll3phOB4AaABAg.9li8Vc-zDmV9liNsvsqiYs","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwj1sBpq7pmqHByQZZ4AaABAg.9lhYzAMwU7T9liF4xOd_Jh","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugzge3ZoA_DHHmf1a0t4AaABAg.9lhRUFmMQex9llaJVXgMZv","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgyoNlOkGtDfdFXDfpV4AaABAg.9lhLR1sDADl9liFWaoB4CM","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgyoNlOkGtDfdFXDfpV4AaABAg.9lhLR1sDADl9licRAvzLLV","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgyoNlOkGtDfdFXDfpV4AaABAg.9lhLR1sDADlAJhNDEk8EUM","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}]
```
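The comment-ID lookup described above can be sketched in Python: parse the raw LLM response (a JSON array of coded comments) and index it by `id` so any single record can be inspected directly. The record structure and field values come from the sample response; the variable names are illustrative, and only two of the records are embedded here.

```python
import json

# A raw LLM response is a JSON array with one object per coded comment,
# carrying the dimensions shown in the Coding Result table
# (structure taken from the sample response above; only two records shown).
raw_response = '''[
  {"id": "ytr_UgwfEmxuFMnlVn8D5EN4AaABAg.9liXkZqA95A9litwZAz-bX",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwE6bhxwSa9a9wz4KZ4AaABAg.9liB-lw2LEx9liE897-Fu0",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"}
]'''

codes = json.loads(raw_response)

# Build a lookup table keyed by comment ID so a coded comment can be
# retrieved without scanning the whole array.
by_id = {record["id"]: record for record in codes}

record = by_id["ytr_UgwfEmxuFMnlVn8D5EN4AaABAg.9liXkZqA95A9litwZAz-bX"]
print(record["responsibility"])  # → ai_itself
```

A dict comprehension keeps the lookup O(1) per comment ID, which matters when a coding run covers thousands of sampled comments.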