Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@crow_feather
Just as likely as it is in other fields. why would ANY job be safe from AI if you understand what it is? There are jobs where the point is literally to figure out how to help AI help you do your job, it's smarter (in theory) than you and is being trained to do your job, once it's implemented in robots or has access to entire networks, why wouldn't it just do your job? It would be up to the people themselves to demand their presence be maintained. What should really concern you about job security is what a job does for you, rather the power it gives you. Being able to provide a function to the people with the most power and resources- the billionaires and hundred millionaires- and FACILITATE them having power means you retain some personal power- they NEED you/us or everything they have comes to a grinding halt and we can act together and take more power back ourselves. But with AI, technology and robotics, what do they need you to do anymore? Nothing. So you have no real personal power other than violence left because other than that, your life doesn't affect them anymore. Think they need you to buy stuff? Why? You have no power left. They have AI and technology that can make movies, art, entertainment, medicine, food, transport, clean water, etc. AND they control the resources they'd use to facilitate any of those. They don't need you AT ALL. Then what? If anything, you're a tax on the system at that point. The mistake is thinking "we're all in this together" but I promise you egomaniacs don't see it that way, and if you think your life has any intrinsic value to them you should look at history because there's a slew of assholes who directly and indirectly murdered MILLIONS of people and still thought only of themselves
Source: youtube · AI Moral Status · 2025-06-05T21:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgxiiwhQvGgB5wqwvc54AaABAg.AIzD_Q9ySsaAIzJUPwuI95", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgzD5rSJBT8syEf1--B4AaABAg.AIyMrQyFyx5AIz1FySaRYy", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_UgwLpb_3FqloMeJkFyl4AaABAg.AIy7UBAk3SpAJ-AtSi5rzx", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugyid_nHGeznA970v4N4AaABAg.AIxyG_9xDS_AIy5eeT8kom", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgzDavvISEiTXHVTMih4AaABAg.AIxy0X_tqe3AIxyD1a-dSV", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgynCkLUqEJgLb1Axlt4AaABAg.AIxqv-JiL36AJ-KGAD9Aek", "responsibility": "developer", "reasoning": "mixed", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytr_Ugxf2zuBxAwIG3M_bQp4AaABAg.AIxqMzpPMh_AJ-0k6ouMOe", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_Ugxy3MnGxvAkI4GoHpN4AaABAg.AIxn5M3tceMAIzxbh38Vvy", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytr_Ugxy3MnGxvAkI4GoHpN4AaABAg.AIxn5M3tceMAJ-12dMelq9", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugxy3MnGxvAkI4GoHpN4AaABAg.AIxn5M3tceMAJ-eeuV_tlJ", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
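A raw response like the one above can be turned into a per-comment lookup before it populates the coding table. The sketch below is a minimal, hypothetical parser (the function name and the set of allowed values per dimension are assumptions inferred from the responses shown; the actual codebook may define more values): it parses the JSON array, checks each record against the schema, and keys the result by comment id.

```python
import json

# Allowed values per coding dimension -- inferred from the responses
# above; the real codebook may include additional values (assumption).
SCHEMA = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM JSON array into {comment_id: codes}, silently
    dropping records whose values fall outside the schema."""
    records = {}
    for item in json.loads(raw):
        codes = {dim: item.get(dim) for dim in SCHEMA}
        if all(codes[dim] in SCHEMA[dim] for dim in SCHEMA):
            records[item["id"]] = codes
    return records

# Usage with a shortened, illustrative id:
raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"resignation"}]')
coded = parse_llm_response(raw)
print(coded["ytr_example"]["responsibility"])  # ai_itself
```

Dropping out-of-schema records rather than raising keeps a batch run alive when the model emits an unexpected label; a stricter pipeline might log or re-prompt instead.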