Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is going to be an unpopular opinion, but the process of training an LLM does not infringe upon copyright, nor were the companies required to license or ask permission to use an artist's work. As an artist who almost certainly studied or analyzed another artist's work, did you have to get permission to do so? Probably not. Almost certainly not. LLMs are pattern recognition algorithms. They work on probability to figure out what comes next. These algorithms were built by analyzing the structure of sentences, and using statistical models to predict the output of a LLM response based on the prompt input. The only real difference between what ordinary humans do when analyzing another creative work versus what an LLM does, is that an LLM can do in a fraction of the time what it potentially takes a human a lifetime to do, but of course, do it poorly. If you try to make the argument that it was unethical for these LLM companies to train their models on copyrighted material without permission, just imagine how unethical you as an artist have been your entire creative life for doing the exact same thing. Shame on you. You never asked the copyright owners for permission. This is nothing more than the pot calling the kettle black. Hypocrites, the lot of you.
youtube 2026-04-10T10:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxplO5Dn2KidY4vkll4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyCSZaAl3Yyoymm2wZ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "sadness"},
  {"id": "ytc_Ugy949M7wrtULItcT5l4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyluuMGcXiUx1tlu6N4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgylXXHwkCQUkDc8wk94AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzW-9bE4Tlrqujebup4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugy9iz1jchTFs9hiRQZ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx54fw0pTB8kHF0u1t4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugxd67MavUDplSm9y794AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxW0X77637XLOEdgwd4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
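The raw response above is a JSON array with one object per coded comment, keyed by comment id, with the four coding dimensions (responsibility, reasoning, policy, emotion) as fields. A minimal sketch of how such a batch response could be parsed and validated in Python is below; the function name `index_codings` and the two-record excerpt are illustrative, not part of the tool's actual pipeline.

```python
import json

# Illustrative excerpt of a raw batch response in the format shown above.
raw = """[
  {"id": "ytc_UgxplO5Dn2KidY4vkll4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyCSZaAl3Yyoymm2wZ4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "sadness"}
]"""

# The four coding dimensions every record is expected to carry.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse a raw batch response and index the codings by comment id,
    rejecting records that are missing any coding dimension."""
    records = json.loads(raw_json)
    out = {}
    for rec in records:
        missing = DIMENSIONS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id')}: missing dimensions {missing}")
        out[rec["id"]] = {k: rec[k] for k in DIMENSIONS}
    return out

codings = index_codings(raw)
print(codings["ytc_UgxplO5Dn2KidY4vkll4AaABAg"]["emotion"])  # indifference
```

Indexing by id makes it easy to look up the coding for the comment shown above and compare it against the values in the coding-result table.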