Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don't think AI slop is that much worse than human made slop content, but that's not the issue. human made slop requires time, money, editors. the weird youtube kids stuff still needed to hire voice actors and editors so it took time to make. sure it was as optimised as possible, resulting in usually 1 post a day, sometimes 3. the more you would want to post a day, the more money you'd have to spend on editors and voice actors, and then you risk the content not being cohesive as all of the editors will do things slightly differently and all the voice actors sound different.

AI doesn't have any of those problems. I foresee an internet in which you have companies that mass produce content like a factory outputting hundreds of videos daily. maybe this is just me being one of those doomsday guys in movies shouting at crowds holding a sign saying "THE END IS NIGH" but I think it could result in the end of the internet.

even if you think AI is good and allows people on small budgets to make stuff, like the analog horror where the guy has a story he wrote and wants to tell, but doesn't have the budget or skill to make the art. he's a writer, not a visual artist, which is fine. that means he still has to spend time writing those scripts himself, where an AI factory account can produce hundreds of scripts in the time he'd make one, and those accounts don't care about quality or anything like that, they care about money and views. we'll have content factories basically diluting any actual art someone makes.

ok I'm being a bit dramatic. I reckon what will happen is that ocean of AI generated slop for a brief period, people stop using the internet because of it, and social media companies all implement something that detects AI content and blocks it
youtube Viral AI Reaction 2025-09-09T04:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugx1BKJcnAaktgO9gFl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxMDf7DYDKNDeCsEAx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyHrj4taIDB1M1NC6l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzNFABH6PL77os3WFh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw2MTzqlU4Up1JB7GF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxZ3dVXV4zqxbgx73V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxYlG_WSqQv-zCQ72Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw_GTO_vGJPsoppKKp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzDrVnwDfk6SOFZD194AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzQARDamyg8YwwYEx94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
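The raw response is a JSON array with one coding record per comment, keyed by comment id. A minimal sketch of turning such a batch into a per-comment lookup (the `index_codes` helper is hypothetical, not part of the coding tool; the two records are copied from the response above, the full batch has ten):

```python
import json

# Two records copied from the raw LLM response above (full batch has ten).
raw = """[
  {"id":"ytc_UgxYlG_WSqQv-zCQ72Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyHrj4taIDB1M1NC6l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

def index_codes(raw_json: str) -> dict:
    """Parse a batch coding response into {comment_id: {dimension: value}}."""
    records = json.loads(raw_json)
    # Drop the "id" field from each record; it becomes the lookup key.
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

codes = index_codes(raw)
print(codes["ytc_UgxYlG_WSqQv-zCQ72Z4AaABAg"]["emotion"])  # resignation
```

Indexing by id is what lets a viewer like this page join the batch response back to an individual comment, as in the Coding Result table shown for this comment.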