Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect)

- “If anything, this whole AI situation has made artists come together as a communi…” (ytc_Ugw-0X8YJ…)
- “Ai has made me hate the majority of the technology industry. It's just another e…” (ytc_Ugz84iuhn…)
- “Not me having fights with ChatGPT / No once I started arguing with it (guys I’m t…” (ytc_Ugxg8qIEa…)
- “This old bloke here doesn't know everything. When AI was asked how it sees itsel…” (ytc_Ugycx8era…)
- “Why are we acting like AI just appeared out of thin air, what's next, taking kni…” (ytc_Ugw9QhRHI…)
- “Oh dear... FINALLY someone talked about this person😭 I used to follow her cause …” (ytc_Ugx17zRTw…)
- “If you look at the AI like a child who learned everything from his parents, you …” (ytc_Ugz7v7H31…)
- “Disabled person here! I've never once felt like using AI art, and am honestly di…” (ytc_UgzECF6OP…)
Comment
They're just asking for a scenario similar to The Electric State movie.
You can't ask a human to do the work for 100,000,000 or more humans, but an AI does that every few days, and it uses data that it doesn't have any right to. They need to hire a slew of different kinds of artists contracted to create art for them and only them that the artist has no claim on after leaving that company/technocracy. Whatever one of them does _after_ they leave that employment is theirs and theirs alone and no company has a right to it without the artist selling it to them directly.
But none of these AI companies want to have to pay for that.
Source: youtube
Posted: 2025-08-22T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwlzou_6MMfX8WIu2J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxTPLcU9wW0mplpHZJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz7HCjtNF7lstseo3p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugzoh9lPu2qjgq9WSap4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxOYk1lhL9hRYFQdzx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgztUqXoGSdl1D779LF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwtJdR5o_yabt7my5d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx4Db_MiWnLHkZAaVh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxWgMYJCEaImJBm4HR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzZRR3_igzsXMH7Onl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
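The raw response is a JSON array with one coding per comment, keyed by comment ID across four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such output could be parsed and indexed for lookup by comment ID; the field names mirror the sample above, while the function name and the shortened IDs in the demo data are illustrative:

```python
import json

# Demo model output shaped like the raw response above
# (IDs shortened here purely for illustration).
raw_response = """
[
  {"id": "ytc_Ugwlzou", "responsibility": "none", "reasoning": "mixed",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwtJdR5", "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(text):
    """Parse the model output and index each coding by comment ID.

    Raises ValueError if an entry lacks one of the four dimensions,
    so malformed model output fails loudly instead of silently.
    """
    codings = {}
    for entry in json.loads(text):
        missing = [d for d in DIMENSIONS if d not in entry]
        if missing:
            raise ValueError(f"{entry.get('id')}: missing {missing}")
        codings[entry["id"]] = {d: entry[d] for d in DIMENSIONS}
    return codings

by_id = index_codings(raw_response)
print(by_id["ytc_UgwtJdR5"]["policy"])  # → regulate
```

Validating every entry up front means a truncated or malformed model response is caught at parse time, before any coding is displayed or stored.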