Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There _is_ some nuance. Training an AI is a lot like learning: you let it do something, and compare the output to the real deal. If the model is not overfitted, then there is no physical way for it to store the entire image, no matter how compressed; what it _really_ stores is the style and motifs _learned_ from that image. And while some artists just _looove_ to think that they "own" a style, motif or idea, that's just not the case. "Copy" in the digital age is also meaningless. Your computer stores a copy of every image you've looked at in the past 30 days in your browser's cache, and non-AI tools already process every image you save to make thumbnails. Technically speaking, AI companies don't do anything your PC doesn't do already. Of course the intent is very different, but technically AI training uses even less of an image than the thumbnail Windows generates. What this really boils down to is that we are ok with people looking at art to copy their themes, styles, motifs and ideas, but not machines. AI is only stealing stuff as much as you "steal" a movie by watching it, or "steal" a picture by seeing it. Artists simply don't want a machine to be able to recreate their art, no matter how the process works. And that's not wrong; I just think we should be honest about the intentions, and not try to hide behind technicalities. Also, AI companies like Meta do absolutely pirate a lot of content, it's just that they don't _have_ to. They could actually buy the content they are feeding to their AI, and the result would be the same either way. (As a sidenote, LLMs _are_ overfitted and they _do_ store a large amount of the training data in them. That's what prevents them from hallucinating facts, at least most of the time.)
youtube · Viral AI Reaction · 2025-12-21T03:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgxkohfjhkLTL1DnD014AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzclSieHZ8WQgXg00t4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgwN4FCMdklzdpEpFO94AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",      "emotion": "mixed"},
  {"id": "ytc_UgwkozT4PDafBcuq4114AaABAg", "responsibility": "ai_itself",   "reasoning": "virtue",           "policy": "ban",       "emotion": "outrage"},
  {"id": "ytc_UgzNlOPSkIx_LgK4sg54AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgyEko5hwnDiqZeBgiF4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgyFo7gY_mmJvUKO3md4AaABAg", "responsibility": "none",        "reasoning": "virtue",           "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgwrK3V31SAxKiaJxT94AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgwtmwYV60hLjSaSZHF4AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugz0Ww65QpJFcl9eweh4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none",      "emotion": "outrage"}
]
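The raw response is a JSON array with one object per coded comment, keyed by comment id with one field per coding dimension. A minimal sketch of loading such a response and looking up a single comment's codes (the ids shown here are taken from the response above; the structure is assumed to hold for all batches):

```python
import json

# A truncated sample of the raw model output: a JSON array of per-comment codes.
raw = """[
  {"id": "ytc_UgxkohfjhkLTL1DnD014AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzclSieHZ8WQgXg00t4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]"""

# Index the array by comment id for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment id."""
    return codes[comment_id]

print(lookup("ytc_UgzclSieHZ8WQgXg00t4AaABAg")["emotion"])  # → indifference
```

Indexing by id makes it straightforward to join the model's codes back onto the original comment records when inspecting any single output, as this page does.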