Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
In some ways, trying to explain a generative learning algorithm is like trying to link natural selection and evolution. You can point to known examples of natural selection and people will agree "yes, that makes sense," but try to extrapolate it over thousands or millions of iterations (often with little in the way of verifiable evidence, btw) and people will tend to go "wait, what?" and shout "OBJECTION!"

I've watched a few lightweight explainer videos myself on how these algorithms generally work, but I think what makes the generative model look so compelling (at least from the company perspective) is the idea that you can train the algorithm using _nothing but actual examples._ Compare how you learn your first language from _nothing but actual examples_ of other people speaking it. Or, to use an actual modern example, a large-scale agricultural farm can employ AI-software-powered drones to deliver targeted doses of pesticides to _just_ the plants that actually need it (known as "spot" treatment), avoiding the need to apply it across the entire field and reducing pesticide usage by _nearly 90%,_ which is not just a huge money savings but also a huge, genuine benefit to the surrounding environment at large.

The core problem when the "ai techbros" claim that generative models learn "just like humans do" is that we CAN'T actually define _how_ information is physically encoded in biological neurons. Generative AI is trying to imitate something that even we _only understand on a "black box" level._

"Publicly available information" is SUCH a euphemism. dA, for example, runs their own AI model (a Stable Diffusion implementation), and one of their FAQs is "does it train on art published here?" The answer is . . . "well, we use Publicly Available Datasets," when a better answer would be "Not _specifically_ or intentionally, BUT we don't control the dataset we draw from, so we can't guarantee it doesn't either. For example, if an artist publishes the same piece across multiple sites, it could get added to a dataset from any of those points."

So on the final question of "is it 'stealing'?" I think my answer is "not in the way you're insinuating, but I don't know what to properly call it either." From a strictly copy-rights (note the emphasis) perspective, the "stealing" relates to having obtained a copy of the asset _at all,_ and not whatever else happens to it _afterward_ (stylistic imitation of specific named artists, etc.). But when people say "stealing" in an artistic context, they're more often referring to its _usage,_ not its _acquisition,_ and that's something I feel I need to point out every time. Yes, the two overlap, but they are not the same.

There's an older Tom Scott video, a piece of speculative fiction about an AI that was "only" instructed to enforce copyright law on registered works, with as little disruption to society as possible. But when it gets access to cellular-scale nanomachines and learns to decode how information is physically encoded in human brains, it can literally "read minds" just like accessing files on a hard drive, and, just like files on a hard drive, it starts _literally erasing_ knowledge that it deems to be "unauthorized copies" of protected works. The entire population becomes "infected," creating a dystopia where artistic knowledge is forever limited to pre-copyright works (e.g. prior to 1900), because literally _everything_ more modern is under the AI's exclusive protection:

- Watched a movie in theaters? You literally forget every specific detail about it overnight and can only remember whether or not you liked it generally, because you don't have a license to reproduce ANY specific details in ANY form.
- A musician tries to cover/adapt (or even parody) a song, but for the life of them can't remember how it goes. No matter how many times they re-listen to the original, they literally just can't, because doing so would be an "unauthorized copy or performance" of the original.
- A kid wants to draw fanart of a character from an anime or videogame, but is unable to recall what the character actually looked like, because they're not licensed to make copies of the registered work.
- Even if somebody discovers the AI has gone rogue and needs to be shut off, they're de-motivated from doing anything to stop it, because that would be a potential disruption; better not to let the humans do it. It's not like the AI is doing any "real" harm for the good it also does, next topic please....
youtube 2024-07-15T16:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugxa9JmiEt2orUn8FYl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwK9oDgbSUDXamjswB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzE7egpKRT2a9kexp14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxIVJZPQHS6eP0QDLZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyoFjKWmuo8PiU9Rip4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgydzWuQ5BolQJ_AYKB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw2HR5dXdI2JVQ3-_l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgykhxVAYKBLmP0KRyp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwQobBPItqKVQRsbJV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzZom4xQpYM9r3l2vR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]