Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Regarding the "in the style of..." (let's say Van Gogh) argument, one thing to consider is that the model doesn't actually need Van Gogh's art to learn his style. You could achieve the same result by training it purely on human-made art that isn't by Van Gogh but was made to mimic his style. And this is already happening: how many "X in the style of Van Gogh" artworks have been made, uploaded, and then scraped to train the AI?
Regarding watermarks: imagine you took a human at birth and locked them in a black box whose only outside sensory input was images you slipped inside. If every image you slipped in had a signature in the bottom right, or a watermark in a specific spot (often the same position), and you asked them at 20 years old to draw you a portrait, wouldn't you foresee them thinking "Well, these are usually supposed to have a weird squiggle on the bottom right"? If they then draw a crude squiggle that matches none they have ever seen before, is that theft? If I were asked to draw a stock photo and I poorly sketched the Getty Images logo onto it, is that ethically theft?
Regarding Lensa and legality/morality: the Stable Diffusion devs had the option to license SD in a way that disallows commercial use or requires a paid license, but chose to open-source it and allow commercial use. Their plan for revenue is to give the tech away for free and monetize "premium" models, the "brains" that Stable Diffusion can use. Once they have a model trained on millions of very specific images at higher resolutions (currently it's trained only on 512x512-pixel images), they will sell those models while the worse ones remain free. Think of it like DALL-E Mini vs. DALL-E 2: the Stable Diffusion models we have now are the DALL-E Mini, so expect a similarly sized jump from Mini to 2 when they start selling models. In short, what Lensa is doing is 100% intentionally permitted by SD; no loopholes or backstabbing involved. I don't know anything else about Lensa, so I can't say whether they're scummy for other reasons.
"It's indubitably mass producing art, turning art, a very personable thing into a mass produced commodity." This is similar to what painters said about film and cameras killing the art of painting portraits, to what the Luddites said about textile machinery, and to what traditional-media artists said about digital artists. I disagree: it's not commodifying art, it's making the act of creating the art and imagery you want far more accessible to the average person, and (currently) almost exclusively for not-for-profit entertainment rather than as a commodity. Sure, services like Lensa are making a profit off it, but the tech itself is free and open source; as long as you have capable hardware, you can use it for free. Just as cameras allowed anyone who could afford one to effortlessly capture the world around them in ways previously possible only with years of training, AI image generation is allowing people with little to no artistic skill to summon up the images in their heads.

Yes, eventually it will be used by big business to replace human workers, but we have no time to spend worrying about artists specifically, because this is going to happen to all of us. Literally all of us, in the end. You *cannot* fight it; the only move is to fight for social-welfare programs to protect those displaced by automation, something like a universal basic income. Even if we managed to make AI image generation illegal, or heavily regulated so it couldn't use materials without permission, that's just one field saved; AI/automation is still going to replace literally every other job.
YouTube
2022-12-10T23:0…
♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwfBjcDqbEgWtoJd_l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxzwLXuAf5o8hvdXA54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyGSLdXWAohUuAOlFd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxFv9wDzJe2WUDghRB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwP8UewHbnLRJhu7TV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwh2QMUpdQGhnnJRYR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxdh238l2MA7l_oa9p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx9HWVAh7UrvP1oq-14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyLm8CLbM-lEdwjmJF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyCL30gmNhJAeghClx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
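Each raw LLM response is a JSON array of per-comment coding records with the four dimensions shown in the table above. As a minimal sketch of looking one record up by comment ID (the field names are taken from the response above; the helper name `lookup_coding` and the use of Python's standard `json` module are assumptions, not part of the actual tool):

```python
import json
from typing import Optional

# Sample record copied from the raw response above (the record that
# produced the "consequentialist" / "indifference" coding in the table).
RAW_RESPONSE = """[
  {"id": "ytc_Ugxdh238l2MA7l_oa9p4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

def lookup_coding(raw: str, comment_id: str) -> Optional[dict]:
    """Return the coding record for one comment ID, or None if absent."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup_coding(RAW_RESPONSE, "ytc_Ugxdh238l2MA7l_oa9p4AaABAg")
print(record["reasoning"])  # consequentialist
```

Because the model returns one record per sampled comment, the same lookup works for any of the ten IDs in the batch.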