Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples

- I've had lifelong depression and my vitamin d levels have been fine. I took supp… (`rdc_ebycv8x`)
- If you want to use AI, replace the CEO. There! Improves your company's image and… (`ytc_UgwnkJHdR…`)
- @Teuwufel i was thinking lately... The whole ai debate is just climate change … (`ytr_UgwSLkpXX…`)
- AI taking jobs seems daunting, but OSVue automates tasks without replacing my te… (`ytc_UgxUOnlWI…`)
- Because we are in an era where “anything goes “ it seems like we are going to ne… (`ytc_UgwIYdVir…`)
- AI's changing the hiring game, but ShortlistIQ helps us bypass bias, which is mo… (`ytc_UgzZ7Uyjp…`)
- Once AI was introduced to these companies. You know all the call centers, opera… (`ytc_UgxWHatCt…`)
- Maybe U should study Islam properly and there's a reason he said "only the name … (`ytr_Ugz7QeqA3…`)
Comment
There's nothing fundamentally wrong with AI models, but companies do need to be held accountable when they use copyrighted images as training data without an artist's permission. Especially in cases where the AI model is generating images with commercial use in mind. Nightshade is good, in this regard--it only punishes bad actors.
The unfortunate thing is that this whole culture war could've gone very differently. AI models have a lot of legitimate uses, and could be applied to many parts of the artistic process--better stabilization algorithms, improving animation workloads, special effects. But at this point, it seems like the well has been poisoned enough that artists will avoid anything with an AI label on it. And I can't blame them, even though I'll never agree with broadly rejecting an entire category of technologies.
youtube · Viral AI Reaction · 2025-04-16T17:2… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugx4rpcj9rK1NuF2ztt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwowDoc8N_XPYsK9jh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9wCKzAPoQuBnRphB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxH3DzvJMig9Hzvn254AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxWWZjXD1_mabutCEt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwSJdS2CtwTHOrzEyt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwiIMie57bPXb4Ppk54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySPVvxaWiPI5Q9Z4R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgxR1dWwm9aJ0wiBdHJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgxBy_9KN_Ks54p5h_V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
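A batch response like the one above can be parsed and indexed to support the per-comment lookup this page offers. A minimal sketch in Python: the `index_codes` and `lookup` helpers are hypothetical (not part of the tool), and the `unclear` fallback is an assumption based on the Coding Result table, which reports "unclear" on every dimension when the inspected comment's ID does not appear in the batch response.

```python
import json

# Dimensions coded for each comment, matching the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a raw batch response (a JSON array of per-comment codes)
    and index it by comment ID for fast lookup."""
    return {row["id"]: {d: row.get(d, "unclear") for d in DIMENSIONS}
            for row in json.loads(raw)}

def lookup(codes: dict, comment_id: str) -> dict:
    # Assumption: a comment whose ID is absent from the batch response
    # is reported as "unclear" on every dimension, as in the table above.
    return codes.get(comment_id, {d: "unclear" for d in DIMENSIONS})
```

For example, `lookup(codes, "ytc_Ugx4rpcj9rK1NuF2ztt4AaABAg")["policy"]` would return `"liability"` for the response shown, while an ID missing from the batch comes back as all-`"unclear"`.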