Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Let's sink this for a second...Universal Basic Income for not working or get wa…" (ytc_Ugywz4hFE…)
- "I have seen the argument "use AI to generate a reference image for an artist" an…" (ytc_Ugw6la5_b…)
- "I went to the doctor with a family member. Fortunately he knew what was going on…" (ytc_UgxJDY5MZ…)
- "Noticed? your fears are related to Humanlike AI. And besides, "control" is one o…" (ytc_Ugw-arl-l…)
- "Wake up humanity, they're going to be replacing all of humanity very soon. They'…" (ytc_Ugxbc9N8A…)
- "Lack of Early Critical Thinking Development: Experts increasingly agree that to …" (ytc_UgxOStpzj…)
- "If AI's future development is based solely on human behavior, we have a big prob…" (ytc_Ugymv6MtT…)
- "For every job that is replaced by ai , ai needs to be taxed the same amount the …" (ytc_UgwL-MN55…)
Comment
there are 2 separate questions: 1) is using AI image generator morally justifiable. 2) will AI get good enough to destroy artist jobs.
answer to 1 is unequivocally NO, and people defending it are idiots. but I am afraid the answer to 2 is probably this, and Asmongold will end up being right if we don't do something about it. sure, AI will "never understand art", but that's irrelevant to (2), because most people who pay artists to use their art, don't understand art either (think people who are looking for stock images, etc.). and they won't think twice before using an AI generated image.
Source: youtube · Video: Viral AI Reaction · Posted: 2025-08-13T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzOimJcJ2Hrqt9es3R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwgbmdRpV19XSJSSKx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxB-rIFm3aH4mHUsa14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzJwuBDB0gAYcHEBMN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"approval"},
{"id":"ytc_Ugx6VNlWsvymw_eNpHx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugw7WV5kEsbrGO6a3Xd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz-A0qgJbE5_Ff6pJJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyH5A0ywEUrDiUFtVV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwe2AGJCIN7517wd3V4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz2Vq1ZVGpTyaDmhNV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
```
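A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the codes visible on this page and may not be the tool's full codebook.

```python
import json

# Allowed values per dimension, inferred from the codes shown on this page.
# ASSUMPTION: these enumerations may be incomplete relative to the real codebook.
ALLOWED = {
    "responsibility": {"developer", "user", "company", "distributed", "none", "ai_itself"},
    "reasoning": {"virtue", "consequentialist", "deontological", "mixed"},
    "policy": {"liability", "none", "ban", "industry_self", "regulate"},
    "emotion": {"outrage", "indifference", "resignation", "approval", "mixed", "fear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and keep
    only rows whose value for every dimension is in the allowed set."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Example: one row from the response above passes validation unchanged.
raw = ('[{"id":"ytc_UgyH5A0ywEUrDiUFtVV4AaABAg","responsibility":"distributed",'
       '"reasoning":"mixed","policy":"regulate","emotion":"fear"}]')
print(len(validate_codes(raw)))  # prints 1
```

Rows with an unknown label (for example, a hallucinated category) are silently dropped here; a production version would more likely log them for re-coding rather than discard them.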