Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Huh, it's almost like trying to replace people with machines that are likely to make more mistakes is a bad idea. It looks like the investment in AI is actually made things worse than better. The end result of what corporations are attempting for is to replace people, to replace people's jobs, people's opinions, people's lives to save money. They don't care about people they are trying to replace them, they don't care if people have food, water, shelter, because they only see what can benefit them and what benefits them is whatever makes them more money. With people you have to pay them, which is a "problem" for a company, so the solution is getting rid of people, you don't have to pay a machine but you do have to pay for it to be operational which is where the end goal of these investors comes in, a cheaper more affordable option than paying someone "affordable living wage" which brings up the question, what is the point of the government trying to make a living wage, pay people more then prices will go up because investors think it's a problem for them when people get paid more, it's means money becomes more valuable for these investors, it's a repeating recycling problem that has no answer because nobody can control these investors, it's why America is economically bricked due to these corporations and investors trying to replace people with Artificial Intelligence. Investors are not people, they are financial predators and their prey is the economy.
youtube 2026-04-06T08:5…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       virtue
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgwEgs5yBOIBpKTNlkp4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw5g2K56a3V8-MaxRd4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwD3rVMI6aDatodWUR4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwHpMAQmfv0ayRNLrV4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwABlW1bS17_kauqsB4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugzqp85-DZlGIP1_r8N4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx8nm6ZR20KqEcq-u94AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzXbQlkDfoyFIkTIkp4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgylO7QfNid5rqlXNid4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz9rPzFdx9jOOLCPkx4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"}
]
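A raw response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is one possible validator, not the tool's actual code; the allowed label sets are inferred only from the values visible in this dump, not from an official codebook.

```python
import json

# Label sets inferred from the values seen in this dump (an assumption,
# not the project's authoritative codebook).
ALLOWED = {
    "responsibility": {"company", "government", "developer", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "resignation",
                "indifference", "approval"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse the raw LLM JSON array, keeping only records whose value
    for every dimension is in the allowed label set."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example with the first record from the response above:
raw = ('[{"id":"ytc_UgwEgs5yBOIBpKTNlkp4AaABAg","responsibility":"company",'
       '"reasoning":"virtue","policy":"none","emotion":"outrage"}]')
codes = parse_codings(raw)
print(codes[0]["emotion"])  # outrage
```

Validating before storage means a malformed or off-codebook record is dropped rather than silently written into the coded dataset.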