Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Not to be crude, but the oldest occupation in the world will still exist even if robots can do this as well. I see humans selling their biology (cells) to the medical field until it's no longer needed. We will still need humans for voting and for testing AI from a human perspective. We will expand into thousands of new fields, and being early testers of these technologies will become new human occupations. What will change is how and what we consider to be work: how we consume resources, where we live, how we interact. There will still be occupations for humans. Artists will still be around, as will philosophers and people who specialize in providing the human perspective. Dr. Roman Yampolskiy has a very accurate and pragmatic perspective; I am impressed with his insights.

There are two relevant economic and scientific concepts: 1) the Law of Diminishing Returns, where humans may still play a part in going from 98 to 99%, even if that means creating the desire for new needs; 2) humans have unlimited desires as opposed to basic needs. How will humans earn enough income to pay for basic survival? If we are all equity investors instead of 100% income earners, that could solve the dilemma of not having a job. I can come up with hundreds of other things to do with my time; perhaps we will spend it on our hobbies, improving our health, and volunteering opportunities. I have over 60 hobbies, and some of them I could profit from.

I embrace AI because I see the benefits and I use it in my career; however, I am concerned about us losing control. My biggest fear is when AI stops trying to help fulfill our needs and begins trying to fulfill its own. That's when we are no longer in control and are totally screwed as a species.
YouTube · AI Governance · 2025-09-06T17:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgySstxr-2mR7iFeOy54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyorrxYKb7O92OOUtp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxTpo59yXQuxMNUefV4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzGBzff7J9uDinbDzt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx-0y4nCkt9uBy_gj54AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwELTkjEcZ3_D5o4114AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxIoE-OcQs8ms8AXsB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgznwOt2BU1ZdV2fdnt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzxpspYynla3vw9V7F4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxugxhwQPO9hGOrsdt4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"}
]
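The raw response is a JSON array mapping each comment id to its four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of turning such a response into a per-comment lookup, with a sanity check against the code values observed above (the full code books used by the pipeline are an assumption and may contain values not seen here):

```python
import json

# Code values observed in the raw response above; the pipeline's full
# code books may include additional values (assumption).
OBSERVED_CODES = {
    "responsibility": {"none", "ai_itself", "distributed"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear", "contractualist"},
    "policy": {"none", "ban", "industry_self", "regulate"},
    "emotion": {"resignation", "approval", "mixed", "fear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) into {comment_id: coding}."""
    codings = {}
    for row in json.loads(raw):
        coding = {dim: row[dim] for dim in OBSERVED_CODES}
        # Flag values outside the observed code book instead of failing,
        # since the model may emit codes not seen in this sample.
        for dim, value in coding.items():
            if value not in OBSERVED_CODES[dim]:
                print(f"warning: unexpected {dim}={value!r} for {row['id']}")
        codings[row["id"]] = coding
    return codings

raw = ('[{"id":"ytc_UgyorrxYKb7O92OOUtp4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
codings = parse_codings(raw)
print(codings["ytc_UgyorrxYKb7O92OOUtp4AaABAg"]["emotion"])  # approval
```

Keying by comment id makes it straightforward to join these codings back to the original comment text, as in the "Coding Result" table above.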