Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Yeah we're maybe 2000 years away from what he's suggesting. But there has to be a balance; if not, there is no money in it. If 99% is unemployed and nothing is created by humans, then everything is useless to us and there is no ambition to create new things or financial means to run the economy. What he's suggesting is so far away in the future that if you want to see it you will need to live a thousand years. Robot dexterity for skilled labor is probably 100 years away. Plus we need full-on artificial intelligence to be able to do those trades. If you work in the trades you would understand. Plumbing, carpentry, electrical, welding, landscaping isn't perfect. There will be things every 5 min you will need a human for. For every robot you will need 2 humans to do a 1-person job. They will be for assembly-line work and general labor like shipping, sorting, simple welds, compartmentalized electronics, with 1 human putting it all together.
youtube AI Governance 2025-09-21T02:1… ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxonJ0o8-dbrtdkdsh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgyRsXtfpNqOoyr9oSR4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwpmRkDXCb0j1eP_mp4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwzWHGStW4wN0y5a2d4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzFULFn-tSVAjMqFLB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxWlredTPBv8X72vOx4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyHh_qq7azkqmENeUZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzFWy-oGH7XMpvXCAt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugx6EpyFd3iM5p3bj4V4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyJLodASMI7R06sFcx4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"}
]
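The raw response is a JSON array of one object per coded comment, keyed by comment id. A minimal sketch of how such a batch could be parsed into per-comment codes — the `ALLOWED` vocabularies below are assumptions inferred from the labels visible in this record, not a confirmed codebook, and `parse_codes` is a hypothetical helper, not part of the actual pipeline:

```python
import json

# Assumed code vocabularies per dimension, inferred from labels seen in this record.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"approval", "fear", "outrage", "resignation", "indifference"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) into a dict
    keyed by comment id, replacing out-of-vocabulary values with 'unclear'."""
    out = {}
    for rec in json.loads(raw):
        out[rec["id"]] = {
            dim: rec.get(dim) if rec.get(dim) in allowed else "unclear"
            for dim, allowed in ALLOWED.items()
        }
    return out

# The record matching the Coding Result table above:
raw = ('[{"id":"ytc_UgyHh_qq7azkqmENeUZ4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
codes = parse_codes(raw)
print(codes["ytc_UgyHh_qq7azkqmENeUZ4AaABAg"]["emotion"])  # resignation
```

Validating against a fixed vocabulary guards against the model emitting a label outside the coding scheme; mapping such values to "unclear" keeps the batch usable rather than failing on one bad record.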