Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
26:08 i wouldn't say it's naive, but it's overly optimistic and puts too much faith in humanity lol. Yes, when the horses and buggy went away and industries relating to them went away, new ones popped up to support the auto industry. The difference with AGI is that the entire point of AGI is to make it so there's no new industry needed to replace the workers and industries AGI will displace. Because AGI + robots (if they actually get it to work) quit literally just means those people and those industries won't be needed AT ALL. In a scenario where AGI actually is good enough to self learn like a human would and self upgrade their own code and self repair their hardware and create their own new algorithms/models and build other new robots. That closes the loop and ends any support industry needed entirely. Which means no new jobs/industries for displaced people. There is no new job being put back into society. And that's the problem and why it then equates to 15, 20, 25, 30 or more percent unemployment and a breaking of our society (unless other things like Universal Basic Income are properly implemented). The people pushing to advance AI and AGI are explicitly doing so to cut a chunk of society out, not to better society for all of us which was the case for the industrialization and the auto industry and others that have come up in the past. This isn't a new industrialization age, it's the end game of late stage capitalism. AI/AGI companies are doing this to make society better for companies and the people who own them and the rest of us will be left to fend for ourselves, if they get their way.
youtube AI Moral Status 2025-07-24T23:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzzaG7sVLVgO8li5TJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz8umJa6g9sYZJ1OXl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzL6gGtCAJfUqOSZGB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzfqnVaCqQTUTi2nxh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzbu9D1hjnzpuL4udl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy_FTJu0cHJ46FbyaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwiGVYRIZyeM_Xn20Z4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzAFXQPnoHy7m1JQDR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz4JYQo6eXGuNxXqE54AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyFyOu2C6AT1XpeKPN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
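The raw response above is a JSON array of per-comment codes, one object per comment id. A minimal sketch of how such a response can be parsed and a single comment's codes looked up (the id and field names are taken from the response above; the truncated `raw` string here is illustrative, not the full payload):

```python
import json

# Illustrative excerpt of the raw LLM response: one record from the array above.
raw = '[{"id":"ytc_UgzL6gGtCAJfUqOSZGB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}]'

records = json.loads(raw)           # parse the JSON array of coding records
by_id = {r["id"]: r for r in records}  # index records by comment id

code = by_id["ytc_UgzL6gGtCAJfUqOSZGB4AaABAg"]
print(code["emotion"])  # fear
```

This lookup reproduces the Coded Result table for the comment shown above (emotion "fear", reasoning "consequentialist").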