Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Let me simplify this erudite conversation on AI future dominance of Planet Earth: The main reasons for the existence of AI is to ....Make money and to gather information to solve or describe problems the user wants an answer to. But... AI will eventually replace millions of jobs such as in self driving Taxis, customer service phone operators, self check out grocery stores that Amazon already has opened in many places, analytics jobs, internet jobs, and even AI Youtube channels are now taking views away from actual human channels. That is just the tip of how AI is taking over the planet. Eventually, AI will make decisions that can start wars and create company mergers that will get millions of humans fired due to consolidation of those merges. Wall Street stocks rise and falls will dictate how many Trillions of dollars people lose on their investments. AI will also be held responsible for deciding what practices/protocols occur in large private businesses which will affect prices of consumer goods and services. So when a technology like AI eventually run the economy, military and even what is available to you information wise, then our future human touch to shape Earth and it's inhabitants will essentially disappear and AI will program our brains through dependency patterns and we will no longer have the ability to protest the machine running everything called AI. This is the beginning of the end of Humanity as we know it. We won't resist because the changes will be gradual as AI seemingly makes life more convenient and easy for us. You've been warned.
youtube Cross-Cultural 2025-07-07T21:3… ♥ 1
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgzgCjWEWn3nwxO18dZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgzeSa34AzRR3BqsezV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzHrUhp2S9rgeLX9LV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwPM67J05qKOuDR7gR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwXg0wzam1BoRPjtHZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzatlZ_wj8XqhsoAAh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy6G2dG9xyAy459tzB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugyo9vCddnTOdaNMmsR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz5SqtzXdxwhAtVrTF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwlVHlTjl7oxnMontd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
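A minimal sketch of how a response like the one above can be parsed and validated before its values populate the coding result. This is an assumption about the pipeline, not its actual implementation; the allowed values per dimension are inferred from the labels visible in this view (the full codebook may contain more categories), and `raw` below is a one-record excerpt of the array shown.

```python
import json

# Excerpt of the raw LLM response: a JSON array of coded comments.
raw = '''[
  {"id": "ytc_Ugz5SqtzXdxwhAtVrTF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Allowed values per dimension, inferred from the labels seen in this view.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def parse_codings(text):
    """Parse model output, keeping only rows whose values all appear in the codebook."""
    rows = json.loads(text)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Index valid rows by comment id so the UI can look up one comment's coding.
by_id = {row["id"]: row for row in parse_codings(raw)}
print(by_id["ytc_Ugz5SqtzXdxwhAtVrTF4AaABAg"]["emotion"])  # fear
```

Dropping (rather than repairing) rows with out-of-codebook values keeps the downstream table trustworthy: a malformed row is flagged for re-coding instead of silently coerced.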