Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If you beleive all jobs will be replaced by A.i because it is better then humans, then you should also believe that Race Car Drivers are going to get replaced because Self Driving Cars will be better at racing. Never going to happen. And that goes with many jobs. A.i. will replace call center jobs, I believe that. But A.i will never be giving authority to give judgement to humans such as law, medicine, or warmth from hospitality work. A.i will also never have human senses to experience the world. You will always need boots on the ground. And lmao at the idea of Robots replacing plumbers. It will never have the ability of human laziness to figure out ways to accomplish jobs easier, faster, and better. You think a A.i robot will be able to grab a screw and nut that dropped in a tight space down below? That it would know to reach in a feel with its fingers to recover it? No, it would have to cut a huge square in the wall to to reach and recover it. Robots will help with heavy lifting, just as much as a forklift, nothing new! But they will never take over. We will always be able to do stuff they wont. As far as A.i. killing Humans as we know it. Never going to happen. Humans are more adaptable creatures of War then we give ourselves credit for. We might get sent back to the stone age but Humans will be the ones left standing. We are truly biological giants. We just think of ourselves small and weak.
youtube AI Governance 2025-09-08T19:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwNZcOWd3YQ7cAS2Ep4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw9K5DybkKU4akq8iZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw8kNRnWSBnvbojN1B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyTOhzqSxGRjlfei5p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwhe90Ce0Or9iooAqV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyuJ6o9LMEaShnc-Ft4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzuL9fDcxdtLFQgEt14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwY_4CNadQ-VrYiF6B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwayJ6dNzSlk3rERyx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgylY8k3Kta1enfgb8l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
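A minimal sketch of how a "Coding Result" table can be recovered from a raw LLM response like the one above: parse the JSON array and index the entries by comment id. The `raw` string below is an excerpt (the first two entries) of the response shown; the variable names and the dict-indexing helper are illustrative, not part of the original tooling.

```python
import json

# Excerpt of the raw LLM response above (first two entries only).
raw = '''[
  {"id":"ytc_UgwNZcOWd3YQ7cAS2Ep4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw9K5DybkKU4akq8iZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

# Index the parsed entries by comment id for O(1) lookup.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Read off the four coded dimensions for one comment.
coding = codings["ytc_Ugw9K5DybkKU4akq8iZ4AaABAg"]
print(coding["responsibility"], coding["reasoning"], coding["policy"], coding["emotion"])
# → none consequentialist none approval
```

This is the lookup a display layer would do to fill in the Dimension/Value table for a single comment from the batch response.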