Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If the software you write enables a product instead of being the actual product, then it only has to be good enough to make the actual product less expensive. Even if there are bugs, and even if the code is shit, if it works then management doesn't care. If your code actually is the product, that's a little different, but even then, when one developer uses it to become 10x as effective, the company doesn't need as many developers to do the same thing, so people get cut anyway. However, both of these perspectives will result in stagnant, enshittified products (management just hasn't figured that out yet because MBAs are parasitic morons). The one that actually works is keeping your original staff and allowing them to 10x using AI as a support tool, so the additional productivity can go into creating things you otherwise wouldn't have bandwidth for.
reddit AI Jobs 1742986286.0 ♥ 3
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id":"rdc_mk0udj6","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"rdc_mjt96sd","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"rdc_mjtxcog","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"rdc_mjtp6yf","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"rdc_mjtr4oz","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
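The raw response is a JSON array with one coded record per comment id, so looking up the coding for a given comment is a simple parse-and-filter step. A minimal sketch in Python, assuming only the record shape shown above (the `code_for` helper is illustrative, not part of any tool here):

```python
import json

# Two records copied from the raw LLM response above; the full batch has five.
raw = (
    '[{"id":"rdc_mk0udj6","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"rdc_mjt96sd","responsibility":"company","reasoning":"consequentialist",'
    '"policy":"none","emotion":"resignation"}]'
)

records = json.loads(raw)

def code_for(records, comment_id):
    """Return the coded dimensions for one comment id, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

coded = code_for(records, "rdc_mjt96sd")
print(coded["responsibility"], coded["emotion"])  # company resignation
```

Because the model returns codes keyed by id rather than by position, this lookup stays correct even if the batch order differs from the order the comments were submitted in.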