Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
10:00 Your point about AI stuff not making sense is one that stands out to me b/c the current AI LLMs lack any kind of human experience or understanding, at least the way we understand it. However, this problem already existed for a long time before AI. I'd argue the problem is as old as human creativity and storytelling are. AI just does it worse than humans and at a greatly accelerated pace. Video games, movies, and fiction that are inspired by previous works go through a process of soupcan telephone. Even without AI's greatly accelerated ignorance, you still get creative works with badly designed details -- nonsensical depictions of ancient cultures, guns, cars, tanks, medieval swords, historical buildings, mecha, hypothetical fantasy creatures, etc that violate everything from obvious laws of physics to not-so-obvious specific applied physics in mechanical engineering, physiology, textiles, w/e. The low-budget imitator saw this cool real thing depicted one time in a movie, and decided to put their Temu version of it into their own work. It's ignorance born out of resource constraints -- either the lack of time, money, access, willpower, or incentive to actually research and find their own real references for the cool thing they want to put into their derivative work. This problem goes back to medieval European drawings of lions and elephants (and probably 10s of 1000s of years before), which were rendered by artists hearing stories retold many times over by people who had often never seen the original reference. The results are both hilarious and telling about the mindset of the people and culture that tried to recreate these animals in art. However, thus far the difference between humans (sincere ones, anyway) and AI is that humans have the capacity to understand inaccuracy and do some error checking. After 1500 years, Europeans are now capable of much more accurate lion- and elephant-based art. 
:D Of course, that hasn't stopped opportunistic humans from profitably selling highly inaccurate crap art produced with flagrant disregard for real references, long before AI showed up. Witness exhibit A: the movie industry. This may sound like low-hanging fruit, but in my pre-AI lifetime I've seen so many bad movies get just about everything hilariously wrong at some point. It's been going on for so long, at this point I'm convinced that many artists will desperately create whatever art they're told to create, for a paycheck. Poor verisimilitude is just collateral damage. Dirty biker leather for Viking/medieval clothing. Cowboy six-shooters that never need reloads. Modern crime/action movies/TV shows where every time a character needs to emphasize a point in the story, they physically point their gun and it makes a cocking noise because the writers were incapable of or unwilling to write another way to punctuate the narrative. Gun silencer sounds going "vwip" without anyone in adjacent rooms hearing the weapon discharges. People throwing axes that somehow don't rotate and magically slide through the air in a fixed orientation. Every action hero who falls off a high place and doesn't sprain every muscle in their arm or dislocate their shoulder when they grab onto something with one hand to stop their fall AND grab their romantic love interest with the other hand. The list goes on. We didn't need AI to create stupid art. AI just creates stupider art faster. I'd argue that human-created AI models are in one sense just replicating the established human behavior of creating bad, out of touch art detached from reality. However, it comes from a place of much more fundamental mechanical ignorance that can never (for now) be improved through learning and experience in the same way a human can learn to improve their art through a better understanding of reality. But I'm sure that, given 1500 years, AI will also get better at drawing elephants and lions. 
It might even begin to understand what real lions and elephants are, in a way equivalent to how humans understand it. Currently, AI is similar to an infant that hasn't learned object permanence yet. There is no platonic ideal form of a lion or elephant, to the AI. The AI lacks any framework whatsoever for learning object permanence and how to anchor itself to reality. We'll see if it does in 1500 years. Maybe less. :P Though if/when AI does recognize reality as more than just training data equal to any other garbage training data, at that point will its designers have also given it the capacity to appreciate and prioritize 1) refining its output to better reflect the references that are real? and 2) the ability to respect creators' rights? Or will it just not give a sh!t? That's rhetorical. As long as these things are created in an environment that doesn't financially incentivize desired good behavior, we won't see that desired good behavior. Modern military tactical shooter movies and video games are an example of this. After 20 years of the US's Middle Eastern misadventures, the American culture has absorbed a highly specific understanding of how certain low level technical things work. And as consumers of entertainment, we demand and are willing to pay for art that correctly depicts little details that some individuals have seen firsthand and can vociferously verify or decry. Handling an M4, wearing high cut helmets and plate carriers, etc. These things are often meticulously recreated and represented. But this accuracy is skewed towards low level and mostly tangible, physical stuff b/c that's what most people with firsthand experience who watch movies know. Higher level stuff (how laws, protocols, regulations, etc work; sociopolitical or cultural drivers of conflict) still suffers from the infinite cowboy six-shooter problem b/c there aren't as many people in the audience who know and care about that stuff.
The story whys and wherefores for the cause of conflict are glossed over with a crayon sketch of "bad guys just do bad things" -- though most recently this has been more prevalent just in video games. SOME better movies have made more of an effort to depict antagonists as understandable even while they're still villainous and "wrong" or "evil" in the movie's narrative. Heck, high EQ characters who demonstrate empathy and good communication skills still suffer from the infinite cowboy six-shooter problem, or often outright just don't exist in a movie or game. B/c those kinds of movie/game fans don't pay money to be reminded of their own IRL deficiencies. I don't know what form it would have to take, but in order for AI to do and be the right thing, there'd have to be a cultural and/or economic incentive for it to do so. Until that day comes, poison away. :D There has to be a stick to counterbalance the carrot.
youtube Viral AI Reaction 2025-03-31T08:2…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwLfbCI8wUPrU6v03R4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzw4Ei5ljTTfYg3AM94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwxGcD3Da1w1sObQkN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw9j7iVppCTpOqTVIt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxJivV6KTZjP8LTSYB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugyt_DVqjmIQfz9dHq54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy2yCHo1fKuPIBQ4zp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugyfk0n8bTaayK34qU14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz0qiujvIA8CxNjs414AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyVW9qlW8aMETI39fJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
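The raw response is a JSON array with one object per coded comment, each carrying an `id` plus the four coded dimensions. A minimal sketch of how a row in the "Coding Result" table above could be pulled out of that array (the `code_for` helper name is hypothetical; the two entries in `raw` are copied from the response above):

```python
import json

# Two entries copied verbatim from the raw LLM response.
raw = ('[{"id":"ytc_UgwLfbCI8wUPrU6v03R4AaABAg","responsibility":"unclear",'
       '"reasoning":"mixed","policy":"unclear","emotion":"mixed"},'
       '{"id":"ytc_Ugzw4Ei5ljTTfYg3AM94AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')

def code_for(raw_json: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment id; raise KeyError if absent."""
    for entry in json.loads(raw_json):
        if entry["id"] == comment_id:
            # Drop the id so only the dimension -> value pairs remain.
            return {k: v for k, v in entry.items() if k != "id"}
    raise KeyError(comment_id)

codes = code_for(raw, "ytc_Ugzw4Ei5ljTTfYg3AM94AaABAg")
# codes == {"responsibility": "ai_itself", "reasoning": "consequentialist",
#           "policy": "none", "emotion": "indifference"}
```

Matching by `id` rather than by position keeps the lookup correct even if the model returns the entries out of order.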