Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Since when is there a race to create AI to take over the world tf…
ytc_Ugz2ec446…
It's to threaten the peasants with so they don't try to take AI's jobs from them…
ytr_Ugz7D5R1x…
a Human taking inspiration is way different from a soulless AI stealing work.
h…
ytr_UgwhGVenW…
This happens every time AI models are used and only respond with "societally acc…
ytc_Ugy2eAaWk…
My mom showed my grandma how to generate images with ai and my grandma said it s…
ytr_UgySZCcwP…
Also this implies we all start with digital art anyways, like all of us start on…
ytc_UgxH_ZKAK…
A key part of your assumption is oversimplified I think. We currently already h…
rdc_fcsv5q5
Agreed. I'm a game developer and use AI, but never to harm. I ask it if what I m…
ytr_UgzNsiHPc…
Comment
Here, let's play a game: If you truly 100% trust that nightshade can truly prevent AI training... why don't you just make a large set (200+) of your images protected with this technology and give it to someone who knows what they're doing to try? lol (not some random guy from reddit, someone who "works" with it.)
If they can't train with your set, you'll have proof that nightshade works... but the catch is: it can backfire horribly, like it already happened... afaik, people already have anti-nightshade methods and even AI can revert it.
Source: youtube, "Viral AI Reaction", 2025-03-31T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxpSafk9dkfRxXsr3Z4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzsRHdWx_hYrPdqjw94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzORjGrk0AQ15lj4Kx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgxKSoLgHlVQC69A74J4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "disapproval"},
  {"id": "ytc_Ugx4ANmsrMVUptOT3S54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugy0ORi-XJjrpd57oH94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw5V2tk25xlK3uPWFZ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwyQWZzffiaOTb22up4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz3oGicPyzc1iVmucB4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugy80V6ZyLzH45dJllx4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]
```
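The lookup-by-comment-ID view above can be sketched as a small parser over the raw batch response: parse the JSON array, index it by `id`, and fetch one entry. This is an illustrative sketch, not the tool's actual code; the `lookup_coding` helper name is hypothetical, and the embedded JSON is a two-entry excerpt of the raw response shown above.

```python
import json

# Excerpt of the raw LLM batch response shown above (two entries only).
raw_response = """[
  {"id": "ytc_UgzORjGrk0AQ15lj4Kx4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugx4ANmsrMVUptOT3S54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding dict for one comment ID.

    Returns None if the comment ID is not present in this batch.
    """
    codings = {entry["id"]: entry for entry in json.loads(raw)}
    return codings.get(comment_id)

coding = lookup_coding(raw_response, "ytc_UgzORjGrk0AQ15lj4Kx4AaABAg")
print(coding)  # matches the Coding Result table: policy "liability", emotion "mixed"
```

In practice a dashboard would build the `id → entry` index once per batch rather than re-parsing on every lookup; the sketch keeps it in one function for clarity.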