Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by its comment ID, or browse the random samples below.
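The lookup can be sketched as follows: a minimal Python example, assuming each raw LLM response is stored as a JSON array of coded records (the format shown under "Raw LLM Response" below). The variable names are illustrative, not part of the tool.

```python
import json

# One raw LLM response: a JSON array of coded records (format taken
# from the example in this document; only one record shown here).
raw_response = """[
  {"id": "ytc_UgxSuitqL6Mg5J-4FqB4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]"""

# Index every record by its comment ID for O(1) lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

record = by_id["ytc_UgxSuitqL6Mg5J-4FqB4AaABAg"]
print(record["emotion"])  # -> outrage
```

The same index can be built across many responses by extending the dict comprehension over all stored arrays.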
Random samples (truncated previews):

- `ytc_Ugwb_rFYL…` — "Is Elon Musk worried that a scenario may arise whereby humans will be abused /…"
- `ytr_Ugy4cmq4v…` — "@johndanes2294 Oh, did I? I must've lost track somewhere between video #1,240,9…"
- `ytc_UgzvpEMLj…` — "Why use this though? or any AI. It kind of defeats the purpose of learning and b…"
- `ytc_Ugz7uPB4H…` — "Im not an artist. Im shit at drawing. But. I keep practing and doing it anyway. …"
- `ytc_Ugwg_TyFY…` — "On Healthcare though, the quality is so inconsistent and not trustworthy, I for…"
- `ytc_UgyXstq_h…` — "AI might streamline some tasks, but without Codoki, I wouldn't trust the code I …"
- `ytc_UgwQcsONs…` — "Medical malpractice is the third leading cause of death in the US. AI has a lot …"
- `ytc_UgznkmZmh…` — "ohh god, the point about AI not writing reusable code is so true, i literally ha…"
Comment
The worst part is that every moralizing argument in defense of AI "art" is equally in support of Nightshade. Theres no way to criticize poisoning AI engines while simultaneously defending those engines as anything but theft and still be morally consistent. Its a *purely* self-centered reaction - they dont care about anyone elses rights or needs, they just want product now, for their own profit, without their own effort, and without paying for the IP.
Its a fundamentally antisocial stance. Either nightshade is just as validly art and moral as the AI engine, or the AI engine is a theft machine worshipped by ingrates and Nightshade is neutral.
Theres no way to classiy Nightshade as immoral in the first place -- the most you can call it is Dadaist.
Source: youtube · Viral AI Reaction · 2025-04-08T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyhNZTY8JjC2c9wqat4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxSuitqL6Mg5J-4FqB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzjHYePS607ITlH4X14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwNUmY2lhhR4r9NWcl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzWbhD03pHRElAjnRV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugy2fcxB48aIDfy219d4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw0FxwMon-Tc4VyJid4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNDUhVszvDeZb7m2l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxOcJNfrKkDlxLs3iV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwj4RPekXBJXuX0sUF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
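Before indexing a raw response like the one above, it is worth validating it, since model output is not guaranteed to be well-formed. A minimal sketch, assuming the allowed values for each dimension are those seen in this example (the real codebook may define more categories):

```python
import json

# Allowed values per dimension, inferred from the sample response
# above. This is an assumption, not the tool's actual codebook.
SCHEMA = {
    "responsibility": {"none", "company", "user"},
    "reasoning": {"mixed", "deontological", "consequentialist", "unclear", "virtue"},
    "policy": {"none", "regulate", "liability", "industry_self", "ban"},
    "emotion": {"mixed", "outrage", "indifference", "resignation", "approval"},
}

def validate_batch(raw: str) -> list[str]:
    """Return a list of problems found in one raw LLM response."""
    problems = []
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"not valid JSON: {e}"]
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append(f"record {i}: missing id")
            continue
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(f"{rec['id']}: bad {dim}={value!r}")
    return problems
```

An empty return value means every record parsed and carried a known value for all four dimensions; otherwise each string names the offending record and field.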