Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
AI is just way too expensive to train. The ethical way would be a company like D…
ytc_UgxCc0ZAz…
The scariest part isn’t that AI will take jobs, it’s that it will take away the …
ytc_Ugy197zGx…
I'm black Ai is not racist by choice they only make decisions based off there pr…
ytc_Ugz73WT4b…
What I don't understand is: what is the deal for the people defending "AI art"? …
ytc_UgwyNr9mT…
AGI well never come out of existence from LLM! Check Yann Lecun comments on this…
ytc_Ugz8n3QZF…
What about plumbers. Everyone will demand and yell for plumbers the minute your …
ytc_Ugysoh9w3…
These morons racing to make that sweet coin w AI and thusly, endangering humanit…
ytc_UgyY3aNQ7…
Asimov. the robots should not have rights, so the human destroy and abuse on rob…
ytc_Ugw5Rjfn8…
Comment
The model is open source, that categorically means it is okay and normal for lensa to profit from it. They could have made it closed source or had a non-profit only license on it if they wanted, there is no ethical issue. Big companies, especially tech giants, use a ton of modified open source software on the regular, without paying anyone a dime.
I don't know how you weren't able to look up stable diffusion and check the license, I get that you were reacting to being called out by your audience last video but this is straight up sloppy.
youtube
2022-12-11T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
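Each coded dimension takes a value from a small closed set. A minimal validation sketch, assuming only the value sets observed on this page (the real codebook may allow more):

```python
# Value sets observed on this page; the full codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"none", "regulate", "industry_self", "liability", "unclear"},
    "emotion": {"approval", "indifference", "outrage", "fear", "mixed"},
}

def validate(coding: dict) -> list[str]:
    """Return the dimensions whose value falls outside the observed sets."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]

# The coding shown in the table above passes cleanly.
print(validate({"responsibility": "none", "reasoning": "deontological",
                "policy": "none", "emotion": "approval"}))  # -> []
```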
Raw LLM Response
```json
[
  {"id":"ytc_UgzJ8vD-AnLDnN3vfox4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxXMb_Bt7LAxdROxHV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwmza-5lC0D7iEDUsZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxlewP6lpCdaD1x5nx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxQoGBhEWm80Vc7LNF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgygJFelvf8SQaYDceZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyrqHJhLlWq25wBjzx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxuT11ILHZk_Q1OY-V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwT7lfMox0ZBASNGEd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzXoHPrqQz2KiNW0Ap4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```
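A raw LLM response like the one above is a plain JSON array of per-comment codings, so looking up the coding for a single comment is a one-step parse-and-index. A minimal sketch (the `ytc_example…` IDs below are placeholders, not real comment IDs):

```python
import json

# A raw response: one object per comment, keyed by the comment's "id".
raw_response = """[
  {"id": "ytc_example1", "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "approval"},
  {"id": "ytc_example2", "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "mixed"}
]"""

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index the codings by comment id."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytc_example2"]["policy"])  # -> liability
```

Indexing by `id` is what makes the "Look up by comment ID" view cheap: one parse per response, then constant-time lookups.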