Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Humanity needs to really think about its future. Will AGI, robots, and automation create jobs? Yes. Will they create enough jobs for everyone to have one? No. We are going to see mass unemployment, something to the tune of 70 percent, possibly by 2030. Why? Because our society is based on capitalism, and capitalism is all about maximizing profits while minimizing costs. Humans are the single largest cost to a company, so AGI, robots, and automation will remove most or all jobs in entire sectors.

These industries will create some jobs: AI prompters, robot security, AGI alignment technicians who make sure the AGI does what humans want, people who clean and maintain the robots, people who feed the initial products, like food or materials, into the system. Think of it this way: a typical busy fast-food restaurant may have 100 employees, but with robots, AGI, and automation you can cut that to maybe 10. Those 10 will hopefully be paid well, but what do we do with the other 90? We are not going to make enough jobs for them.

We could pay everyone enough to live on a 2-3 day work week. We could tax for UBI and give it to everyone making under a certain amount. We could eliminate money entirely and, like Star Trek, provide basic needs to all; those who want to work in a field can, and they get perks, but ultimately they do it because they want to. No matter what, we have to do something soon, because the timeline is roughly this: 70 percent unemployment by 2030 to 2035, 80 percent by 2045, 90 percent by 2060, 95 percent by 2075, 100 percent by 2100. The last 5 percent is going to be about humans letting go.

Humanity can change for the better. We can focus on family, helping others, our passions, and creativity. But society would have to value that and want that. People are still going to want to be doctors and engineers and whatnot, so we make those jobs worth going into. They would get perks or something for being in those fields. We are having a population decline, so maybe that's a good thing: fewer people means fewer jobs needed to sustain them. Maybe we will need to upload someday soon and live in virtual reality. Maybe we become immortal via robotics or cybernetics or whatever. Then we could have fewer kids and take on long-term projects like curing cancer or building a Dyson sphere around our sun. If humanity had longer lives, we could think about large-scale projects, like turning Mars into a second Earth.

But we can't have immortal people who are homeless or barely scraping by. We need to make sure we can provide equally for all. Those who achieve more may get more, but no one is left behind. We can make our society so much better. AGI, robots, and automation could give us an amazing future today, but we are stuck in a society that values greed and capitalism.

We will honestly probably have a financial collapse next year, summer or fall maybe. Then, with AGI, robots, and automation causing mass job loss, we will do what humans normally do: go to war. Humans seem to like war, so we will most likely have a second civil war or World War 3, because humans tend to like destroying themselves and each other. Which is sad, because we could become a Type I civilization if we wanted. Humans, prove me wrong and don't destroy yourselves.
Source: youtube · AI Governance · 2025-12-05T01:0… · ♥ 1
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyDnzfGBGx4aas4gB54AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxErOsCWFzIiJRXNMl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzaOY0B-IIvOg1I-aR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyuaGeRWbkcodg8pk54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgziSBhyBQwFhQ52jf14AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwVXHdjpzrBNKBlcWl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxUQc9RmrZjxYPObiR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwO-W1nNBY4CHW_a8x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzy2Kvuhq2c-5LI2j14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxvQ7nZ15OGA4v5PLl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
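The raw response is a JSON array of coding objects keyed by comment id; the Coding Result shown above matches the values on the entry with id ytc_UgxUQc9RmrZjxYPObiR4AaABAg (company / consequentialist / unclear / fear). A minimal Python sketch of how such a response could be parsed into a per-comment lookup and validated: the parse_codings helper is hypothetical, and the allowed-value sets are assumptions inferred only from values visible on this page, not the tool's actual codebook.

```python
import json

# Allowed values per coding dimension (assumption: inferred from the values
# that appear on this page; the real codebook may define more categories).
DIMENSIONS = {
    "responsibility": {"company", "government", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"fear", "approval", "mixed", "indifference"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding objects) into a
    dict keyed by comment id, rejecting any unexpected dimension value."""
    codings = {}
    for rec in json.loads(raw):
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        codings[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return codings

# Example with the entry whose values match the Coding Result table above.
raw = ('[{"id":"ytc_UgxUQc9RmrZjxYPObiR4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
result = parse_codings(raw)
print(result["ytc_UgxUQc9RmrZjxYPObiR4AaABAg"]["emotion"])  # fear
```

Validating against a fixed value set at parse time catches the most common failure mode of LLM coders: a response that is syntactically valid JSON but uses a label outside the codebook.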