Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@JeremiahJames You're right that automation is inevitable, but my point was about timing and motivation. Higher wages at Amazon aren't a long-term commitment—they're a temporary incentive to fill labor gaps until automation catches up. If labor costs were lower, Amazon might have taken longer to push automation aggressively. But when costs rise, it accelerates investment in automation to protect margins. So yes, automation was coming, but offering high wages now is just a strategic stopgap, not a sustainable plan for human workers. That's the trap people aren't seeing when chasing the dollar.
youtube · AI Harm Incident · 2025-07-26T14:5… · ♥ 1
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_UgxYQfNFTA-NP7IL8F14AaABAg.ABumvQsoUobAEzwcl1pbPU","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_UgzJgkLU-q3TObOYC2F4AaABAg.AKkyhLASmFFAL2HCL_9Mnq","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzJgkLU-q3TObOYC2F4AaABAg.AKkyhLASmFFAL2JytAfIUp","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwcOa5f5TeUphHYDJh4AaABAg.A5P_p61EXFJA5g1lWFZEl7","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwvNIxFcybqCkBsWhR4AaABAg.9wPaqEm4lPpA0cAKTPrfJe","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwlbZ0piEdYHSTLIyZ4AaABAg.9wEJVSZ5Dxk9yHTDXfOBAX","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyY5Yb1PKvMIsXq0id4AaABAg.9wE8CwxBtYaA5STQqvgaBR","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyY5Yb1PKvMIsXq0id4AaABAg.9wE8CwxBtYaA6AKdBr8kC2","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxTBjkvuDa6oMcTfMd4AaABAg.9wE7mPtefaEAJqCu_lqXlQ","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_Ugx3SHRZpXdE6lPDOst4AaABAg.9wDvN5XzqiPACGNdo66iLL","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
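To inspect the model output for one coded comment, the JSON array above can be indexed by comment id. A minimal sketch, assuming the raw response parses as valid JSON; only two of the ten records are reproduced here for brevity:

```python
import json

# Excerpt of the raw LLM response above (two of the ten records).
raw_response = '''[
  {"id": "ytr_UgzJgkLU-q3TObOYC2F4AaABAg.AKkyhLASmFFAL2JytAfIUp",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxTBjkvuDa6oMcTfMd4AaABAg.9wE7mPtefaEAJqCu_lqXlQ",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "outrage"}
]'''

# Index the codings by comment id so one comment's labels can be looked up.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the record that matches the coding result table above.
row = codings["ytr_UgzJgkLU-q3TObOYC2F4AaABAg.AKkyhLASmFFAL2JytAfIUp"]
print(row["responsibility"], row["emotion"])  # company indifference
```

The id-keyed dictionary makes the lookup O(1) per comment, which matters when the same raw response is inspected for many comments at once.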