Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I would say both you and the "redditors" are halfway to being right about Nightshade, but neither of you is fully correct. You're correct that it's an issue for the AI companies, but you're incorrect about why. It's not an existential threat, and even if Nightshade is named after something very dangerous, its effect on AI is closer to a hallucinogenic mushroom than a nightshade plant. The effects only last as long as the model is influenced by the corrupted training images. You're right that they haven't found a way to mitigate the effects of Nightshade, but they can search out corrupted photos in their training sets and undo the damage later. So it's more of an inconvenience that costs them money than an existential crisis for AI companies. The art traps *are* doing something, but they're more like a drug that creates a temporary and potentially expensive inconvenience than a poison that causes irreparable harm and eventually makes the AI models cease functioning. In other words, the redditors *are* correct that *your* corrupted images aren't doing anything, because the AI companies probably keep their models from training on photos that are publicly known to be corrupted. It's the artists who keep quiet about it who are actually harming the AI companies and costing them a lot of money. That's why they're trying to make this a big conversation. If you out yourself, your efforts go to waste!
Source: YouTube — "Viral AI Reaction" — 2025-04-29T17:1… — ♥ 1
Coding Result
Dimension       Value
Responsibility  company
Reasoning       mixed
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgxoOhid_XOW233nXml4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyVMrcTYpbHcBumyL54AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxI1vKOcfDe9kkZCQh4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyCr4f9aLnIjlkpKLN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx72IHr6s4oAKfSHtF4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "disapproval"},
  {"id": "ytc_Ugwhsl4ba9fu7pQkk4R4AaABAg", "responsibility": "ai_itself", "reasoning": "contractualist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgwjoLTt-Vkw2smhd0d4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyezI_WbL0Dv46ies14AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxu2CkK9FEqpEVKz4Z4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwQ0mX4OFsiV64gxu14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"}
]
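The raw response above is a JSON array of per-comment codes that must be validated against the coding scheme before the dimensions can be displayed for a single comment. A minimal sketch of that step is shown below; the allowed category values are taken from the response itself, and the function name `parse_codes` is illustrative, not part of any real pipeline.

```python
import json

# Coding-scheme vocabulary, inferred from the values seen in the raw
# response above (assumed, not an authoritative codebook).
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"outrage", "indifference", "mixed", "approval",
                "disapproval", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    rejecting any value outside the coding scheme."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec[dim]!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the raw response above, used as a self-contained example.
raw = ('[{"id":"ytc_UgxI1vKOcfDe9kkZCQh4AaABAg",'
       '"responsibility":"company","reasoning":"mixed",'
       '"policy":"none","emotion":"mixed"}]')
codes = parse_codes(raw)
print(codes["ytc_UgxI1vKOcfDe9kkZCQh4AaABAg"])
```

Looking up one comment id in the parsed result yields exactly the Dimension/Value rows shown in the coding table above.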