Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Using AI to 'create' art is the equivalent of commissioning an artist. Even if you make changes, you didn't do it, it's as if you asked the artist to make edits. And it's beyond absurd that AI enthusiasts try to crush this conversation and act like it's done and over with. They have no idea on the legality and seriously just expected to go through without a fluke?
Imagine if we just let people train their own AI models for self-driving cars and unleashed them onto the streets - or if we just let people sleep while in the drivers seat. How come we can have a discussion about the ethics and applications on that type of AI system but not an AI system that clones art?
Platform: youtube · Video: Viral AI Reaction · Posted: 2022-12-24T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugxn3NQ2wk5aqGm5lBZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy-99Oa8L6fMDku7wd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugy-UmtzQtqbS2XFwG94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy14t5X8lqEK_GI9pl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzBMm5Gv-EBc9lANTd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx8yC5ad9Gs9QKOYhd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugz-NMDVZBQ2Ks4MXVd4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugylva6yeR7uf3fMoe54AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgypKIvPlhNlM7sR22d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxQHVGyyQCTzAgf1l54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
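A response like the one above can be parsed and sanity-checked before its codes are stored. Below is a minimal validation sketch; the allowed category sets are inferred only from the values visible in this page (the full codebook may permit more), so treat them as assumptions:

```python
import json

# Allowed values per dimension -- inferred from the coding table and the
# raw response above; the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"user", "company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "indifference", "outrage"},  # observed subset only
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment IDs in this dataset start with the "ytc_" prefix.
        if not row.get("id", "").startswith("ytc_"):
            continue
        # Every dimension must carry a value from its allowed set.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Rows with an unknown code are dropped rather than silently stored, so a drifting model output surfaces as a shrinking valid-row count instead of corrupt data.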