Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
13:07 I think it's worth noting that the real luddites weren't just anti-technology for no reason. They were against the use of new textile machines because they feared it would lead to job losses, lower wages, and give the factory owners FAR more power over their time and trade. AND THEY WERE RIGHT!!!! During the industrial revolution, despite all the advances made in society there was still far more exploitation, pollution, and general disregard for the lower classes. That's not to say that the times before the industrial revolution were better, people's quality of life generally improved, but power became far more centralized than it was before industrialization which gave the elites even more control over society. The luddites weren't scared of new technology, I'm sure they would have willingly embraced it on their own terms, they were afraid of giving more power to people who wanted to destroy their industry to create mass-produced products and centralize their control over society. It was as much a social movement as it was a technological one. To bring it back to ai, I doubt people would have a problem with it if it used ethically collected databases, had more built in limits on the type of stuff you can create (it's FAR too easy to create stuff like revenge porn), and the use of "watermarks" (I forgot the actual term) in the code of any content created by a generative AI so that it can be filtered out and easily recognized. That last one especially is something that would be super easy to do, basically free, and would allow websites to easily filter out AI spam or label things as AI, making it more difficult to use it for misinformation.
Source: YouTube · "Viral AI Reaction" · 2025-03-30T23:2… · ♥ 1
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugzwj-RZkth2cb0tC8l4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwP87I2eWzxIAmJuvR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxFWh3_IkCucql1HIx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwa5rIi-Hz8d5ov9Ut4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzbR7ZKJF9TcBuYuwt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"approval"},
  {"id":"ytc_UgypnZSr-T8VT43FTG14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwzmRQYk1UnGTJ1JK94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwETpu580rKG5I2B6t4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwVY6UuILQXA_AFg9d4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx2l2Ad6k2uQnhcmBp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
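A minimal sketch of how a coding result like the one above could be looked up in the raw batch response. This assumes the response is a JSON array of per-comment codes and that the id `ytc_UgwP87I2eWzxIAmJuvR4AaABAg` belongs to the comment shown (its codes match the Coding Result table); the function name `code_for` is hypothetical, not part of any real pipeline.

```python
import json

# Truncated stand-in for the raw LLM response above (assumption: same schema).
RAW_RESPONSE = """[
  {"id": "ytc_UgwP87I2eWzxIAmJuvR4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def code_for(comment_id, raw):
    """Return the {dimension: value} codes for one comment id, or None."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            return {d: entry[d] for d in DIMENSIONS}
    return None

print(code_for("ytc_UgwP87I2eWzxIAmJuvR4AaABAg", RAW_RESPONSE))
```

Keying the lookup on the comment id, rather than on array position, makes the check robust if the model returns the batch in a different order than it was sent.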