Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Honestly, I understand the AI revolution in huge, but, I can’t believe the economy will be something exclusive to “rich capitalists” while AI makes everything. Simply because there’s no innovation in that world. Many people are going to say I’m wrong “ohh, but all the human innovation is based on past reference”. All the human innovation is based on past reference, with the inclusion of something that I don’t believe machines will ever have (ok, maybe one day, but not near future): feelings. Basically everything related to innovation based on human interaction, human feelings, and the innovation that comes from it, are going to be the jobs that are not dying.

Going to the universal income, that’s basically what will happen, that doesn’t mean people won’t have an income. It’s simple: you have an abundance of goods that are now so cheap to make, that governments will tax way more capital gains. But that has been happening since the beginning of the Industrial Revolution. The more automation we added, the more productive we got. Okay, people lost they jobs, but the wealth generated was basically transferred to people. Things that require people interaction. Also, the amount of welfare provided by governments nowadays is only possible because of exactly that: productivity is so high that governments are able to tax a good share of that.

Last but not least: people’s behavior. I am, personally, stop using services of companies that don’t have a human customer service available soon. I see a huge push of companies to make everything that you can’t get to a human. I’m stopping buying from these companies. First: they’re not very cheaper, and if I’m having a problem with them, I want to talk to the company for real, that has thoughts, not a trained robot.
youtube AI Harm Incident 2024-08-03T17:5…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgxSsAKk8_vnhJUURfR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwGKnbCmz_PgilFnCx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxZSbJXk9nPz9gpqJ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwyqoKmf0HPJMJP3kd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzLJevKqHp5CxQ3j8F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugy0R8MtSwEYPAY6dTN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwzMusgdegQusReSDB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxyWGfeVJkHvBYuFh14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzG9XwxMCGEaKJNeaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyi21vlj3DjCqRVnD14AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
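The raw response is a JSON array of per-comment codes keyed by `id`. A minimal sketch (assuming Python; the field names `responsibility`, `reasoning`, `policy`, and `emotion` come from the JSON above, everything else is an assumption) of how such output might be parsed and looked up per comment:

```python
import json

# Truncated excerpt of the raw LLM response above, used here as sample input.
raw = '''[
  {"id": "ytc_UgxSsAKk8_vnhJUURfR4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwGKnbCmz_PgilFnCx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]'''

# Index the coded dimensions by comment id for O(1) lookup.
codes = {item["id"]: item for item in json.loads(raw)}

print(codes["ytc_UgxSsAKk8_vnhJUURfR4AaABAg"]["emotion"])  # approval
print(codes["ytc_UgwGKnbCmz_PgilFnCx4AaABAg"]["responsibility"])  # ai_itself
```

In a real pipeline the lookup dict would also be the place to validate that every dimension value falls in the coding scheme's allowed set before it is stored.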