Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm not an artist but I'm totally against AI, not just against the unethical practices that you've mentioned, but AI as a whole. Because, I don't see a good future for art if AI keeps going forward. Let's look at it from another perspective: let's imagine there are AI services built upon an ethical dataset with the proper licenses and all. To make things more interesting let's say that artists which works were included in the dataset get paid royalties for every generated image. Sounds like a great deal, isn't it? But... Can you see the problem here? There are many, many artists out there. It's imposible that companies running AI services would be able to afford paying those licenses and/or royalties to everybody. So what happens next? Only a handful of high tier artists would be able to secure a spot in this "AI business". What does lead us to? Those artists that are left out won't be able to compete with AI that spits out several images per second, flooding Twitter timelines and every other platform... So, eventually no more artists will be born. Therefore, art itself will stagnate, because AI can't create new styles, it can only recreate patterns it already has, and without new artists there will be no more new patterns for the AI to "learn". In this dystopian future that we are headed to, art will be only a shell of its former self, lacking any creativity... I don't want that future.
YouTube · Viral AI Reaction · 2022-12-29T21:0… · ♥ 4
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          ban
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
 {"id":"ytc_Ugy96MpPPfzoKLDNMX14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzcmK3FPf-uWNWWicd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgxuNJZbrLPR9nA0xGJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgxGgTm9rIgqgQ6qxrp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgzUBWhkatrC-1BilP94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugw5QyArKC0hQPTqtQF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwEFTJolYhpZ6Fg86B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwS2ZKdmrBt60sirPt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgxZ7Pe7cbVFvQY8RWt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgyxQkpXS6EHBllu3It4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
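A raw response like the one above can be parsed and validated before the codes are written back to the comment records. The sketch below is a minimal, hedged example: it assumes the model always returns a JSON array of objects with exactly the four coding dimensions plus an `id`, and the allowed category sets are inferred only from the values visible in this response (the full codebook may define more categories).

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# records shown above; the actual codebook may be larger (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "user"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval", "outrage", "mixed", "resignation"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and validate every coded record.

    Raises ValueError if a record is missing an id or uses a value
    outside the known category sets.
    """
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError(f"record without id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

raw = ('[{"id":"ytc_UgzcmK3FPf-uWNWWicd4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"ban","emotion":"fear"}]')
records = parse_llm_response(raw)
print(records[0]["emotion"])  # -> fear
```

Validating against fixed category sets catches the common failure mode where the model invents a label outside the codebook; such records can then be flagged for re-coding instead of silently stored.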