Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Professional software developer here. This was all too predictable. Technical de…" (ytc_UgyKceelV…)
- "Us artist actually feel good about what we drew/painted and I know some people t…" (ytc_UgyNVYIbe…)
- "i have never practiced creating any art except for during classes and drawing li…" (ytc_UgwmIVK-y…)
- "those types of people are forgeting the definition of "art". they should pick a …" (ytc_UgxWl5mes…)
- "There are definitely data centers using water in the most wasteful ways. Some ha…" (ytr_UgzxvJhfc…)
- "I’m an artist who is neurodivergent and struggles with both an anxiety disorder,…" (ytc_UgzMRtc9X…)
- "Nothing was “stolen” in the training of AI. If you understood even the basics of…" (ytc_UgxZ8dyco…)
- "to be fair to the ai art bros ALL their art is ALWAYS half naked chicks and wHat…" (ytc_UgxdYNcUB…)
Comment
As a data scientists (the people who work on AI for research and statistic.) I can say that AI is to early to be released to the public. While it is actually very useful in terms of doing searches for you and doing research on things like geo-guessing and finding trends. The best way to collapse AI is to feed it it's own trash and/or if all artists only do art poison for about 1 year straight. While we see signs of uniqueness with AI it is not yet to the point of being able to make it into art yet, the best example of this is hotpot AI. While I know art poisoning does very little, it may be AI itself which will collapse itself rather than art poison, just as long as more AI spit out trash. Though AIs that stop collecting data would still be unaffected by this.
Source: youtube · Viral AI Reaction · 2025-03-31T16:1… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzSqgG2Jrtder4IhEN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwjX6BJLRuSVNYHj454AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyzgJT0fcIjv5Oy2OR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzEAt1Dggeu2DnBj_B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgymcB2ty9iGxlY1_i54AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzdLKxmKQdu3NuLvm14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzyOl7INi6H1NDHdlx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzsjzv2Q26rMKJ9l3Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxHKI-9Vbf2Y6cQxKJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxdCKkgxov7dDMZzxN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
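The raw response above is a JSON array with one coding object per comment in the batch, so recovering a single comment's coding is a parse-and-index step. A minimal sketch (the field names match the response above; the two-row `raw_response` string here is an abbreviated stand-in for the full batch):

```python
import json

# Abbreviated stand-in for the model's batch output: a JSON array of
# coding objects, one per comment, keyed by comment "id".
raw_response = '''[
  {"id": "ytc_UgzdLKxmKQdu3NuLvm14AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzSqgG2Jrtder4IhEN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

# Index the codings by comment ID so any single comment's coding can be
# looked up without rescanning the whole batch.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgzdLKxmKQdu3NuLvm14AaABAg"]
print(coding["emotion"])  # fear
```

This is the lookup the "inspect the exact model output" view performs conceptually: the stored raw response is parsed once, then the coded dimensions for the displayed comment are read out of the indexed dictionary.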