Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Believe me ai will most definitely take over therapy ! Do you have any idea how …" (ytr_UgwKOVs9O…)
- "The AI must have found out that there are more biological differences other than…" (ytc_UgylJu5co…)
- "The question that haunts me isn't whether AI will become superintelligent. It's …" (ytc_UgwYjJ4tg…)
- "Though it's been years, I spent two years in college for 3D animation and video …" (ytc_Ugy0rEnUg…)
- "I'm sorry but how would AI replace as physical jobs as forestry, fishing or farm…" (ytc_UgwPaSSfl…)
- "AI is SO overrated, and is just repackaged google on steroids and I wouldn’t tru…" (ytc_UgyDEtToi…)
- "😂 I’m begging for ai to replace everyone. The more jobs they replace the less pe…" (ytc_UgzaSZnBC…)
- "Ai “artists” just type in a prompt and take credit for doing nothing. Real artis…" (ytr_UgwyYe3BP…)
Comment
@fawndeu When technology advances - it damages or destroys certain existing industries. It usually creates new ones along the way, but that's not a guarantee.
I also run a video game studio, and there are artists I work with. As a small studio, art is an expense, so what's bad for an artist is actually good for me: I get to afford more art for my product. You might think I'm just a stingy bigot, and that's fine, but it's a fact.
There are winners and there are losers; what's good for one might be bad for another. I hope to see artists embrace AI and become more productive, able to put their cool ideas "on paper", so to speak, much faster and with less effort. Maybe I'm an idealist.
The problem with tools like Nightshade is that it's a grift, and a bad one at that. The researchers are actively looking for funding, and the reason they did this project is just that: to drum up some buzz. They admit in their own paper that unless an ungodly amount of money is poured into Nightshade on a continuous basis, it becomes irrelevant. Why is it "bad"? Because it just delays the problem, and the result of that delay is more electricity being used to rerun training, and more silicon being required. It makes the process more expensive, not in a way that would prevent AI from working, but in a way that just creates waste.
Here's an example of good poisoning: encryption. We make breaking an encrypted message so hard that it's pointless to try; the attempt is not worth the effort. What if your data could be cracked, but it cost $10 to do it? Do you think it would deter anyone? Some, sure. With Nightshade, we're talking about increasing costs to the AI industry by maybe 0.1% or something like that. Worse yet, you promote an industry that compiles training datasets: an AI company sees poison and says "nah man, it's too much hassle", so another company springs up that sifts through this data and sells "safe" results. It's all around bad, for artists too; it gives them a false sense of security and wastes their time. This lady here seems like a cool artist; she created these images in hopes of harming AI, right? If not today, months from now, her "poisoned" art will serve the exact opposite purpose.
Source: youtube · Video: Viral AI Reaction · Posted: 2024-10-23T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgyuQYter_QsdJHvyFN4AaABAg.A9womGCEQfyA9wuptwbRmd","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyuQYter_QsdJHvyFN4AaABAg.A9womGCEQfyA9x9TmztVJb","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgyuQYter_QsdJHvyFN4AaABAg.A9womGCEQfyA9xTJ6TeudX","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgyuQYter_QsdJHvyFN4AaABAg.A9womGCEQfyA9xU_o03GFW","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_Ugx51yb2sK0umgDMxAJ4AaABAg.A9wohSKyC8qA9x9juFVJXH","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"resignation"},
{"id":"ytr_Ugytn8bDaqHr-oa8sI14AaABAg.A9wmKlFlrt_A9wtxtylYaC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugy6agB8XDU4u9az2rF4AaABAg.A9wiTxu281AA9yyJM2Q2Jp","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugy6agB8XDU4u9az2rF4AaABAg.A9wiTxu281AAA1wEvJj0W0","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgxbD1EPmnvQBVwzfvB4AaABAg.A9wiRMmavTPA9ws59pviTe","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgwDP9LInx5w6kE2WPt4AaABAg.A9wgCpZrLctA9wiEu5Rda0","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
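A raw response like the one above can be turned back into per-comment codings by parsing the JSON array and indexing on `id`. The sketch below is a minimal, hypothetical example: the allowed values are inferred only from the rows shown on this page (they are assumptions, not a documented schema), and the comment IDs in the sample data are made-up placeholders, not real IDs from this dataset.

```python
import json

# Allowed values per dimension, inferred from this page's examples.
# This is an assumption, not an official schema.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "resignation", "fear", "mixed"},
}

# Placeholder IDs for illustration only.
raw = """[
  {"id":"ytr_example1","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_example2","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

def parse_codings(raw_json: str) -> dict:
    """Parse a raw LLM coding response and index the rows by comment ID,
    rejecting any value outside the expected label set."""
    codings = {}
    for row in json.loads(raw_json):
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim} value {row[dim]!r}")
        codings[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return codings

by_id = parse_codings(raw)
print(by_id["ytr_example2"]["policy"])  # regulate
```

Validating against a fixed label set catches the common failure mode where the model invents an off-schema label, which would otherwise silently corrupt downstream counts.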