Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I disagree. Acting out of fear is certainly understandable, but I think it's a mistake. Consider... In 200 years, when somebody asks the omnipresent computer, "Please show me a picture in LavenderTowne's style".., if you have done your poisoning work effectively, the response will be, "That name is not in my files." Poof. You don't exist. All your work is gone. Your contribution to the human artistic record never happened. Personally, that seems tragic. For my part, I've put a ton of effort into my own work, developed a unique style which I like to think is a positive contribution to the art. I can see and express things in a way nobody else can, and I'd be sad to see that lost to the winds of time. I WANT to contribute to the artistic record. I WANT my style and works added to these new models. I'd feel relieved if I could type my name into an LLM and see my stuff come up for other people to benefit from. I try to put positive, encouraging energy into all my work, and I always try to make the world a better, healthier, stronger place. It's crazy to me to not work toward that. Beyond that.., I think it's actually selfish to restrict access to your work by historians and remember-bots. Sure, you need to make a living in the Now. But an artist's life is just a momentary thing. In history's long run, an artist should be working for something greater. Bearing in mind, of course, that I'm coming at the question from the far end of a long career now. For young artists, I'd offer the following: If you stay tuned into your soul, no matter the tools, you'll find a way to connect. An LLM can only copy and average existing viewpoints. You occupy time and space where you alone exist and see and express from. Nobody can overlap your head with theirs. But they CAN inject your head with averages. Toss your smartphone in the closet.
Source: youtube · Viral AI Reaction · 2024-10-25T13:4…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_UgzZQJ-sWw46fMBNsf54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgxfgHTNyyvSMNdhrTF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},{"id":"ytc_UgzGpECJE4qpuGAcKP14AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},{"id":"ytc_Ugz9bb_sq5XqgVsJvSp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},{"id":"ytc_UgzQ6nOEb6rTj7uPqi14AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"approval"},{"id":"ytc_UgyiioO4ZJodjUrT7fV4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},{"id":"ytc_UgxxdLd0Eyhu9kFHppJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgwECumnX8w8ECVE-yh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytc_UgwQqV-yDmqMweeOOjV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},{"id":"ytc_Ugx478aQQgaAHGoNZpt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
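When inspecting raw responses like the one above, it helps to parse the JSON array and check that every record uses only values the codebook allows. The sketch below is a minimal, hypothetical validator: the function name `parse_coding_response` and the `ALLOWED` sets are assumptions inferred from the values visible in this page's sample output, not the tool's actual schema.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred from the
# sample response shown above and may not be the full codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"resignation", "outrage", "approval", "mixed", "indifference"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record's dimensions."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Example with a single well-formed record (hypothetical comment id).
raw = ('[{"id":"ytc_example","responsibility":"user","reasoning":"mixed",'
       '"policy":"none","emotion":"outrage"}]')
records = parse_coding_response(raw)
print(len(records))  # 1
```

A record with an out-of-codebook value (say, `"emotion": "joy"`) would raise a `ValueError` naming the offending id and dimension, which makes it easy to spot coding drift across a batch of responses.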