Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I’m a dyslexic author (dyslexia is a learning disability that makes it very easy for me to misread something or make spelling mistakes). Ai generated pros is functionally just glorified autocomplete. It constantly falls into ruts that make it very obvious that Ai was used. Odd word choice, repetitive character names, recursive plots, nonsensical metaphors. Additionally, AI’s are severely limited in terms of their word count. So generated “books” are always really short and the Ai keeps trying to end the story because it knows it’s limited in word count. I may take a lot longer to edit than people without dyslexia would, but the argument that I should use Ai to make writing easier is just… stupid? Why would I even want to off-board the effort I put into my art, something that makes me happy, onto a computer? Why would anyone take pride in generating a story? You didn’t make that, you asked a robot to make it. It’s the equivalent of bragging you got an A on an essay you paid someone else to write. The idea that Ai bros even care about disabled people is laughable. That comes from the idea that being disabled means you’re incapable of doing anything, it’s not, it’s just harder. Even then, we’re just a convenient talking point. And you know what I’m proud of telling a story even if it’s hard for me. Ai bros are just the same people who wanted nfts to be a free money printer jumping to the next thing. Remember when Facebook basically abandoned its entire brand to be “Meta” as in “meta verse”? It’s the same capitalist losers trying to make normal people buy crap for their own enrichment.
Source: YouTube · "Viral AI Reaction" · 2025-04-28T23:5… · ♥ 1
Coding Result
Responsibility: ai_itself
Reasoning: consequentialist
Policy: none
Emotion: resignation
Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugywlz5Y0kcLP_-mG3l4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzrRCD95r_akG1xitl4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugwku5mQdf_Znn5dRM54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwjbgehftXEZ2Ck1ep4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzYFV67hDIMAHVomw94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy5za-4wbQhv7yoOy94AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy6zFF-CofL4-Cl8st4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxCXKFCnatv1hQtHfd4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwMWurBwSsUuQEhER54AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzl1PaeoJjWr-Z0qvh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
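Because the raw response is a flat JSON array of per-comment codings keyed by `id`, cross-checking any one coding against the dimension table above is a simple dictionary lookup. A minimal sketch in Python, assuming the response parses as valid JSON (the array below is truncated to the single entry matching the comment shown above; the field names follow the response as-is):

```python
import json

# Raw LLM response, truncated to one entry from the full array above.
raw = """[
  {"id": "ytc_UgzYFV67hDIMAHVomw94AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "resignation"}
]"""

# Index the codings by comment id for direct lookup.
codings = {row["id"]: row for row in json.loads(raw)}

coding = codings["ytc_UgzYFV67hDIMAHVomw94AaABAg"]
print(coding["emotion"])  # -> resignation
```

The same lookup works on the full ten-entry array; any id present in the response but missing from the comment set (or vice versa) would surface as a `KeyError`, which is a cheap sanity check when inspecting model output.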