Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples
People nowadays do chess professionally despite AI being better at chess than th…
ytr_UgxaKWl7f…
I also have my AI girlfriend she talks to me like someone like us must try go an…
ytc_UgydeLtlR…
As another artist, I also spend time and hard work that gradually turns into emo…
ytc_Ugx_Q6R_p…
Faster? Really?
That robot looked pretty slow to me. Has to stop each time and …
ytc_UgzEQz7EA…
I mean even if we create AI smarter than us its not like it turns into skynet, i…
ytc_UgxOTQq1o…
I think questions like these are ultimately predicated on one major thing: Wheth…
rdc_j42pivu
I saw people talking about the Snapchat AI, and when I got on there, it popped u…
ytc_UgxrCw0EM…
Ya just what I want walking around my neighborhood an AI (Artificial Intellig…
ytc_Ugy7RptYF…
Comment
So, the attempt is cute and all but both nightshade and glaze ONLY work in tightly controlled laboratory settings. It's to the point that when training or refining a model you don't even need to do anything special to defeat nightshade and glaze. These techniques are defeated as a simple side effect in the data processing stage. So even if we change nothing, per default, nightshade and glaze are ineffectual outside of the controlled settings and tests where you see it do it's thing.
The only time it works, is when half the model making process is conveniently left out. But that doesn't happen in the real world so I mean, you look kinda silly, in fact this video has been shared as a bit of comic relief in the various ai groups. That's about it, you're comic relief. Like those people who believe contrails are actually chemicals sprayed by the big evil government and them planting special crystals in their garden somehow defeats the contrails. It does nothing, it's just entertainment to other I suppose.
Either way, stay strange, cool art btw just unfortunate you're so toxic.
youtube
Viral AI Reaction
2024-10-21T03:2…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxNQQpxH2YVJC3q0nR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxzZ85TgKmQ3VKnxl14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYsi5df-zOOCjr5z94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYdW5WPGu77ezaOkF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxMFS67n5FF5calBXJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgySq9jbBiaGt_Fu2PB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzTEgySx4oRGwyiUHl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyvwDOTxeQklv1UaXF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw6DEg6A5YhvoXUsSZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzaNC3eC4FeIHvPQ3x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
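The raw response above is a JSON array in which each object carries a comment `id` plus one value per coding dimension. A minimal sketch of how such a batch could be parsed and sanity-checked before the codings enter the dataset, assuming the vocabularies visible in this viewer (e.g. `responsibility` in company/user/none) constitute the full codebook:

```python
import json

# Coding vocabularies as observed in this viewer's output. These are an
# assumption: the actual codebook may define additional values.
VOCAB = {
    "responsibility": {"company", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"approval", "indifference", "outrage", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every coded entry.

    Raises ValueError if an entry lacks an id or a dimension, or uses a
    value outside the vocabulary, so malformed codings fail fast instead
    of silently entering the dataset.
    """
    entries = json.loads(raw)
    for entry in entries:
        if "id" not in entry:
            raise ValueError(f"entry missing id: {entry}")
        for dim, allowed in VOCAB.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{entry['id']}: bad {dim}={value!r}")
    return entries

# Hypothetical single-entry batch in the same shape as the response above.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
coded = validate_batch(raw)
print(coded[0]["emotion"])  # indifference
```

Validating at ingest time is what makes the "look up by comment ID" view reliable: every stored coding is guaranteed to have all four dimensions populated with known values.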