Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
All of these concerns are absolutely warranted, but they are somehow on a theoretical level. What I noticed that people don't take into account is that everything is driven literally by humans, including the global economy. The money (resources) that billionaires or corporations have mostly are a redistribution of the wealth of humans who are less capable of retaining or multiplying their own wealth - this is a vast majority of the population, most of them "employees", right? Well if the economy collapses, 90% of people are unemployed, the whole system will collapse and there will be no point for AI at all - it's not like AI companies exist outside of society, they need everything to work relatively well in order for them to maintain their business which further develops AI - if the economy is destroyed, there will be no one left to advance or need AI, so it's like a circular thing. The very existence of AI requires society and economy to be somewhat balanced. AI itself consumes resources and it will need a giant consumer base in order to be sustainable. And AI doesn't fulfil any basic human need, so if people suddenly find themselves without resources for basic needs due to unemployment, they will no longer be an asset to AI companies. The same logic applies to companies that use AI instead of humans - they all provide a service or goods that need to be bought by someone, who must of course have the buying power to do so
Source: youtube · AI Governance · 2026-01-20T10:3…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugw4Xu4d2Ogu3pNM0nN4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",     "emotion": "fear"},
  {"id": "ytc_Ugws_7g4Q7mNyFNLna14AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugyh37FtOZJCkO1IWZ14AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgyV16ZkRJAUJTuIoPl4AaABAg", "responsibility": "none",        "reasoning": "deontological",    "policy": "none",     "emotion": "approval"},
  {"id": "ytc_Ugx6o_P-DE9a1UR-GaV4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugycs0vlkpvp4yWI4RV4AaABAg", "responsibility": "government",  "reasoning": "contractualist",   "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_Ugw8J-3uui0bItOlhVR4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgxozjYebSPs0q3O9EJ4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgyhuGJmojopfb9kt5l4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgwPckDPca-wIXTwSwJ4AaABAg", "responsibility": "government",  "reasoning": "deontological",    "policy": "regulate", "emotion": "mixed"}
]
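Because the raw response is a JSON array keyed by comment id, a single comment's coding can be recovered from the batch programmatically. Below is a minimal Python sketch, not part of the tool itself: the id and coded values are copied from the batch above (the record whose codes match the result table shown), and the `by_id` lookup is an illustrative helper.

```python
import json

# One record from the raw batch above; in practice `raw` would be the
# full LLM response string.
raw = '''[
  {"id": "ytc_Ugw8J-3uui0bItOlhVR4AaABAg",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "indifference"}
]'''

# Parse the batch and index it by comment id for O(1) lookup.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# Retrieve the coding for the comment displayed in the result table.
coding = by_id["ytc_Ugw8J-3uui0bItOlhVR4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # distributed indifference
```

Indexing by id rather than scanning the list each time matters once batches grow beyond a handful of comments, and it also surfaces missing ids early via a `KeyError`.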