Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
First, let me say, I really enjoyed the video. I actually unsubbed from Asmongold and subbed to you instead. I am a author and, believe me, AI is definitely on my mind often. However, here is the flaw in your argument (and I really wish this were not the case - trust me): you are speaking about AI as it is right now. In the next few years, as it gets better at not only creating art but also at manipulating the human condition, AI generated art will be completely indistinguishable from human created art. After that, it will be better at holding a person's attention than any human created art (due to the aforementioned manipulation of our emotions by AI - something that already exists in AI models) AND it will be in the hands of a generation that has been raised on it. The appreciation for human made art will drop (as we have already seen with these people you mention who have tons of "supporters" who simply don't care how art is created, as long as it is "better"). AI is a time bomb and we all look at it as a fun little toy, a joke, or naively, just a tool that we can use to help us. We foolishly think we can control it without ever consciously putting any sort of controls in place. It is going to outsmart us. It will learn how to use our emotions and our brain's chemical output in the same way that drugs do. We know this - because companies already do it. We are building our own assassin and replacement, and we've become to stupid to notice. There is a saying meant as a sort of curse (often incorrectly accredited to the Chinese): "May you live in interesting times." Sadly, our times are very interesting indeed.
Source: youtube · Video: Viral AI Reaction · 2025-10-30T16:3… · ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgzzuW8surXwW9RYlcd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzNxYCUDqLBNwZnFlR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz_bJU4UW6b6sKC6UN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw-Z6iwP5zgKhLYeTx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyat7NiqBmaG7EjVsZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzQba4ujX16gRpupWR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyM7tYPfgLeysv5sOx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyQl-77Hpjd312zQf54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyMkwFUSJOtMqiiPF54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzrlN-0Lsz04S0aZiB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
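The response above is a JSON array of per-comment coding records, one object per comment id, with the four coding dimensions (responsibility, reasoning, policy, emotion) as keys. As a minimal illustrative sketch (not part of the coding tool itself; the function name and the mapping of this comment to a particular id are assumptions), parsing such a response and looking up one comment's coding could look like:

```python
import json

# The four coding dimensions used by the scheme shown in the table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# A single record copied from the raw response above, for illustration.
raw = """[
  {"id": "ytc_UgzQba4ujX16gRpupWR4AaABAg",
   "responsibility": "none",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "fear"}
]"""


def coding_for(raw_json: str, comment_id: str) -> dict:
    """Return the {dimension: value} coding for one comment id,
    parsed from a raw LLM response in the format shown above."""
    records = json.loads(raw_json)
    by_id = {r["id"]: r for r in records}  # index records by comment id
    record = by_id[comment_id]
    return {dim: record[dim] for dim in DIMENSIONS}


print(coding_for(raw, "ytc_UgzQba4ujX16gRpupWR4AaABAg"))
# → {'responsibility': 'none', 'reasoning': 'consequentialist',
#    'policy': 'none', 'emotion': 'fear'}
```

The same function applied to the full ten-record response would recover the values shown in the Coding Result table for whichever id corresponds to the displayed comment.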