Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
Video games? Where studios that make incredibly successful games get laid off af…
ytc_UgywWexdu…
I don't think the music standard is a "double" standard so much as it is a diffe…
ytc_UgyPr3cuR…
This is why I use free programs which don't use any AI, such as LibreOffice and …
ytc_Ugy_WpF22…
Cool story. Look AI will and has made changes in the market. No different than h…
ytc_UgzwpY0CQ…
I use an AI generator, but I don't post or sell the art I get from AI. It's my f…
ytc_UgzB4CsIY…
what do you gain from being annoying to artists? AI art is already a source of s…
ytr_Ugx8E1fLV…
Doesn't matter how good gen AI becomes, if we as a society reject it, it's not g…
ytc_UgyRVII98…
I'm against AI, this is so devastating. We love making art from scratch, why do…
ytc_UgxbUCljt…
Comment
LOL, "it doesn't work". If they have to filter out any possibly poisoned image from a huge training dataset, that is an extra layer of complexity for their *very* expensive machines to process. I guess they'll just train an AI on another very expensive setup to filter out the poisoned data to avoid poisoning the dataset for the other AI?
I mean, with the sheer volume of images they use for training AI in big companies, this is a huge hurdle of resources. Projects have been sunsetted for much less lol. Also, if it doesn't work, why are these AI stans so bothered by it?
youtube
Viral AI Reaction
2025-04-02T00:2…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy6l_8ZfvzY4b7hiJR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxRi6vxcaC9wf6-hbh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwLUQXQBtbtsx3inVt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx_5bzyr0H5BKZjprt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzZO6vQCrhajx0R9Pp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzZYppGh9iLqKj_7X14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxP7BSGpYI6Or0Bdkx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5lpkNTnxFopMTc0x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxCGBu311IDJUStOYR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyRb62kTX0APjz6p7B4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"}
]
```
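The raw response is a JSON array, one object per coded comment, with the four coding dimensions (responsibility, reasoning, policy, emotion) alongside the comment ID. A minimal sketch of how such a batch response could be parsed and indexed for lookup by comment ID — the function name `index_codes` and the skip-malformed-rows behavior are illustrative assumptions, not the tool's actual implementation:

```python
import json

# Hypothetical sample mirroring the response format above: a JSON array
# of per-comment codes across four dimensions.
raw_response = """[
 {"id":"ytc_Ugy5lpkNTnxFopMTc0x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
 {"id":"ytc_Ugy6l_8ZfvzY4b7hiJR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codes by comment ID."""
    coded = {}
    for row in json.loads(raw):
        # Skip malformed rows that lack an id; keep only expected dimensions.
        if "id" not in row:
            continue
        coded[row["id"]] = {dim: row.get(dim) for dim in DIMENSIONS}
    return coded

codes = index_codes(raw_response)
print(codes["ytc_Ugy5lpkNTnxFopMTc0x4AaABAg"]["policy"])  # regulate
```

Indexing by ID is what makes the "Look up by comment ID" view above cheap: one parse of the raw response, then constant-time retrieval of any comment's codes.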