Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Gladwell's remote work hypocrisy perfectly encapsulates elite discourse on AI governance—rules apply to subordinates, never to rule-makers. The "return to office" crusade masked capital's need to reassert workplace surveillance and control after workers glimpsed autonomy during lockdown. Notice the pattern: executives demanding office presence while maintaining remote flexibility, pundits advocating austerity from financial comfort, tech leaders warning about AI risks while racing to deploy it. This class position shapes who gets to frame "responsible AI"—comfortable elites theorizing ethics while precarious workers face algorithmic management without recourse. The watsonx.governance pitch IBM sells presumes corporate goodwill determining what counts as "responsible." But governance designed by those who profit from AI systematically defines problems to preserve their interests. Real accountability would center affected workers and communities, not executives optimizing brand reputation. Gladwell championing office mandates while remote himself isn't just hypocrisy; it's the governance model IBM represents—power imposing frameworks on others it exempts itself from.
youtube AI Responsibility 2025-11-17T09:5…
Coding Result
Dimension      | Value
---------------|---------------------------
Responsibility | company
Reasoning      | contractualist
Policy         | regulate
Emotion        | outrage
Coded at       | 2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_Ugx9OLAA3Z4FOwfk20l4AaABAg.AS_vOhIRKvSASaINwoceoZ","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugy1BotzE-zR5CfQlnV4AaABAg.AC2LEyLGc8iAC2PXx2mX5X","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_UgjjhVmopdBPnngCoAEC.8BsAm4xHtuS8BtL1-4lmhu","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytr_UghK9JWzzYfksHgCoAEC.8BrrReLGHpH8Bs2iLAUyj7","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytr_UggrhJ60UdmN_3gCoAEC.8BrrEAPregI8BtAGWVJG1k","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgigfbsD3xVn6XgCoAEC.8BrcF8D9mNF8BsC33aqVbL","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgibUnXWq06xDXgCoAEC.8BrbvRz8MRl8Bru_8BqsTy","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_Ugwh3v-8GeoSdSmne4B4AaABAg.A3T1yHwit-FAPcJByxRXmy","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgzF-wZJ569On403PCR4AaABAg.A3QWkoAMNS1APcJO2RR4NT","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_Ugz1b6hJ0hdS_gdTq4F4AaABAg.AHFbF0naRvIAIoHqlCUyiT","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
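A raw response like the one above is typically parsed and sanity-checked before it is stored as a coding result. The sketch below is a minimal Python example of that step; the allowed-value sets are inferred from this sample output alone (a real codebook may define more categories), and the `parse_codings` helper name is hypothetical, not part of the pipeline shown here.

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip entries without an identifier
        # keep the record only if every dimension has an allowed value
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical one-record response for illustration.
sample = ('[{"id":"ytr_example","responsibility":"company",'
          '"reasoning":"contractualist","policy":"regulate","emotion":"outrage"}]')
print(parse_codings(sample))
```

Records that fail validation (missing `id`, unknown category) are dropped rather than repaired, which keeps the stored codings consistent with the schema even when the model drifts.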