Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You got the Luddites bit wrong, and it's very important that people like you stop getting it wrong, because it allows these assholes to keep bandying the term about like it means anti-technology. The contention wasn't that a machine was doing the work, but rather that the wealthy factory owner was threatening the stability of the community for personal gain at the expense of the workers - i.e. exactly what these AI companies are doing. The Luddites were fighting the overnight destruction of their lives and livelihoods (and important parts of their culture) at the hands of a wealthy industrialist. And they were right to, as those machines went on to maim and maul their children and their children's children. Their assessment of the brutality inherent in the action of replacing them was entirely accurate, and society was not, in fact, better off for a bunch of peoples' lives getting juiced onto the alter of profit in exchange for cheaper textiles. Notoriously, things got so bad that the entire country kinda fell apart at the seams thanks to a cascade of similar actions by similar industrialists, and we call that the Great Depression. And now we're repeating that same process again, but with even fewer places for the people impacted to retreat to. Suggesting that the Luddites did what they did purely out of a phobia of technology is making the same argument Shad is making when he suggests that these tools are "democratizing" art. It was not a worthwhile trade to let a rich man gut a town on a whim, and it could never have been.
youtube Viral AI Reaction 2025-08-12T02:0… ♥ 2
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       contractualist
Policy          none
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwQwV2j05Ua9Ovpnfp4AaABAg", "responsibility": "user",      "reasoning": "mixed",            "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgzkCbsDEZnnoTFoSMB4AaABAg", "responsibility": "none",      "reasoning": "deontological",    "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugz4StxIJr8FphCKAul4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgzAQ-IkuFcnf7z8RXh4AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgwDSe_nqJA4bViD0RZ4AaABAg", "responsibility": "company",   "reasoning": "contractualist",   "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgzxaZO8kQWq_LpB7Ot4AaABAg", "responsibility": "none",      "reasoning": "deontological",    "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgxrCWd0bHox7qBfFRx4AaABAg", "responsibility": "user",      "reasoning": "virtue",           "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgzDMLGKIaig3Eyuk-14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgwfXj3axvLgc_VIazN4AaABAg", "responsibility": "none",      "reasoning": "deontological",    "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugw_1Bb03lgXQeIxj5t4AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "none",      "emotion": "mixed"}
]
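To cross-check a coded row against the raw model output, the response can be parsed as a JSON array and indexed by comment id. A minimal sketch in Python, using the one entry from the response above that matches this comment's coding result (the variable names are illustrative, not part of the tool):

```python
import json

# A single entry copied from the raw LLM response above; the full
# response is a JSON array of ten such objects.
raw_response = """
[
  {"id": "ytc_UgwDSe_nqJA4bViD0RZ4AaABAg",
   "responsibility": "company",
   "reasoning": "contractualist",
   "policy": "none",
   "emotion": "outrage"}
]
"""

# Index the parsed array by comment id for quick lookup.
codes_by_id = {entry["id"]: entry for entry in json.loads(raw_response)}

# Retrieve the coding for one comment and compare it to the table above.
entry = codes_by_id["ytc_UgwDSe_nqJA4bViD0RZ4AaABAg"]
print(entry["responsibility"], entry["reasoning"], entry["emotion"])
```

This is how the displayed Coding Result can be verified line by line against the exact model output.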