Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "When is far more dangerous why he makes robots with AI ? Christian brothers dont…" (ytc_UgwmEQLWg…)
- "People never listen, you do not need AI for that and I am kinda dumb struck the …" (ytc_Ugy9vjf5q…)
- "How am I supposed to know if this entire post and all comments weren't created b…" (rdc_mlm72w2)
- "This guy has no clue what he is talking about. Somehow in 2 years we are going …" (ytc_UgzO6Bj3k…)
- "Look, do you know what pen & paper roleplaying is? That community has always dre…" (ytc_UgzPY-sX4…)
- "The recent layoffs are not from AI. They are for making bonus profits for CEOs a…" (ytc_UgxXjJUJZ…)
- "@ They don't "take art" off of other people, that's not how training data works,…" (ytr_UgztAEH7u…)
- "Just went to a conference by someone from stack overflow. It was/kinda still is…" (rdc_nk75iuy)
Comment
I’ve heard so many TED Talks about AI art from artists, and honestly the arguments feel weak and biased. They fail to identify the real issue. One major problem right now is AI companies training their datasets without permission. Is it true that training on someone’s artwork requires licensing? Then let’s resolve it properly. How? Through the right forums—courts, open debates, proper institutions.
Because in reality, artists also learn from other artists. The difference is just that we’re less advanced and learn in smaller volumes than AI does. I think humans shouldn’t obsess over any artwork or output that exists in the digital realm. Humans naturally belong in the physical world—performing, interacting, looking each other in the eye.
Platform: youtube
Video: Viral AI Reaction
Timestamp: 2025-11-19T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzTCqveFEB229smLs94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwXclyAv53DhP2YUCR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzRuyXY5MtVGBd-brt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxtVDFXIipoqflYjQ94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxjSnZOoUdmavv31BZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzusAWlNTnAPmvLV6x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzLm8NiQHBSONq22V94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxMa0UfW3RJ5B31MrF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz2Zt2Ljrn4l2BnzB54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwJEShoEL-3U3DIKHB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
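The raw response is a JSON array of per-comment codes across the four dimensions shown in the Coding Result table. A minimal sketch of how such a batch might be parsed and validated before use; the allowed value sets below are inferred only from the codes that appear in this batch, not from the full codebook:

```python
import json

# Coding dimensions and the values observed in this batch (assumed closed
# vocabularies; the real codebook may permit additional values).
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "unclear", "regulate", "ban"},
    "emotion": {"approval", "outrage", "mixed", "fear", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index rows by comment ID,
    dropping rows with missing fields or out-of-vocabulary values."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"mixed"}]')
print(parse_coding_response(raw)["ytc_example"]["policy"])  # regulate
```

Indexing by comment ID matches the "look up by comment ID" workflow of this panel, and discarding malformed rows keeps a single bad LLM output from contaminating the coded dataset.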