Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The problem simply comes down to who enjoys the benefit of automation. If the benefit goes to people who "own" the business, and that is a separate group from those who work in the business, then the point will be to eliminate jobs in order to spend less on salaries. If the workers own the business, automation is great: they get to decide whether layoffs happen, so it just means increased productivity and more time for leisure, and it serves them directly. The purpose of automation ought to be to allow people either to become richer through increased productivity or to accomplish the same things in fewer hours, and thus spend less time at work away from individual and family pursuits.

Imagine a situation where your boss introduces a computer that does half of your job for you. But instead of cutting your department in half, your boss says, "This will increase our productivity, so everyone gets a raise. Also, since we will be more productive with less effort, we'll put it to a vote whether people want a bigger raise and the same work hours, or a more modest raise and a shorter work week."
youtube AI Harm Incident 2025-01-04T14:3… ♥ 48
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugw4K5ExYsG2FgskL8x4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgyfIHamWDtxjd3hP4F4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugy7l86y33nhFwnXHw94AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgwTJdZZWnaQEE--H3F4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgyqkoRQvgco5yBhSHt4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_Ugz6ctgGnUgM4vAOD494AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_Ugz7jRDCbHcYOB33u_h4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgxDdsrYQ6EWRH85lud4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwbP-K8Ry99VOXNSbV4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgxCkU3So3xyWkU0l2Z4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"}
]
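The coded dimensions shown above are pulled from one entry of this batch response by comment id. A minimal sketch of that lookup, assuming the raw response parses as a JSON array of per-comment code objects (the id and field names below are taken from the record above; the variable names are illustrative):

```python
import json

# Abridged raw LLM response: a JSON array of per-comment codes.
# Only the entry matching the coded comment above is shown here.
raw = (
    '[{"id":"ytc_UgyfIHamWDtxjd3hP4F4AaABAg",'
    '"responsibility":"company","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"}]'
)

# Key the batch by comment id so each comment's codes can be looked up.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Retrieve the coded dimensions for the comment displayed above.
code = codes["ytc_UgyfIHamWDtxjd3hP4F4AaABAg"]
print(code["responsibility"], code["emotion"])  # → company indifference
```

In a real pipeline the `raw` string would be the full model output, and a missing or malformed id would need explicit error handling rather than a bare dictionary lookup.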