Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
That's where the issue lies, it's really not people being replaced en masse by AI, it's situations like yours where a company will hire 4 AI subject matter experts with the goal of *not* hiring 8 people that would typically be needed for the business to continue growing for example. It's the loss of *prospective* jobs that will be the true harm. I see it in my own job, we really need more people as the company expands and leadership wants it to be a 300mil company within the next 5 years, up from 100m current. They're patching over the lack of hiring by adding AI into the mix, and for now it's working with caveats. AI doing the busy work of formatting and putting data in certain spots on a page really does save a lot of time so you have say 10 people now able to do the same work as 15 with the same available time limits. Those 10 even with AI are still doing more work than ever before, so there's been losses due to burnout over this. Which furthers the problem even more as management goes "we just lost another person, their work will be split among you all, good luck!"
reddit · AI Governance · 1757755395.0 · ♥ -1
Coding Result
Dimension       Value
Responsibility  company
Reasoning       utilitarian
Policy          unclear
Emotion         fear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_ndyugqy", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_ndy7rcl", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "rdc_ndy5t1y", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_ndysan5", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "rdc_ndya6n4", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
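The raw response is a JSON array with one coding object per comment id. A minimal sketch of how such a response could be parsed and a single comment's coding looked up (the variable names are illustrative; this is not the pipeline's actual loader):

```python
import json

# Raw LLM response: a JSON array with one coding object per comment id
# (copied from the response above).
raw = """[
  {"id": "rdc_ndyugqy", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_ndy7rcl", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "rdc_ndy5t1y", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_ndysan5", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "rdc_ndya6n4", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""

# Index the codings by comment id so one comment's dimensions can be looked up.
codings = {entry["id"]: entry for entry in json.loads(raw)}

print(codings["rdc_ndysan5"]["responsibility"])  # company
print(codings["rdc_ndysan5"]["emotion"])         # fear
```

Note that the coded-result table above reports "utilitarian" for Reasoning while the raw response says "consequentialist"; the raw array is the model output to inspect when the two disagree.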