Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Corporations do this all the time. It usually takes the form of ensuring management has no responsibility, but their mid level managers do. Sometimes they hire contractors, so their "fix" for a major fuckup is to just fire them and that saves their own asses. Now, they just do it with robots. How exactly can you "punish" a robot tho? Fire it? Dismantle it? It has no feelings; no family to feed. You can't ensure it doesn't work again, just to absolve yourself of liability. You can't lay waste to it's life in order to retain your bonus package. Stupid.
reddit · AI Moral Status · 1524950398.0 · ♥ 10
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_dy4rwfz", "responsibility": "company",   "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_dy4epou", "responsibility": "unclear",   "reasoning": "consequentialist", "policy": "unclear",   "emotion": "indifference"},
  {"id": "rdc_dy4f5pg", "responsibility": "company",   "reasoning": "deontological",    "policy": "liability", "emotion": "resignation"},
  {"id": "rdc_dy4pyyx", "responsibility": "unclear",   "reasoning": "unclear",          "policy": "unclear",   "emotion": "mixed"},
  {"id": "rdc_dy4nl6o", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"}
]
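Because the model returns one JSON array covering a whole batch of comments, the coding for a single comment has to be pulled out by its id. A minimal sketch of that lookup, assuming only the JSON shape shown above (the variable and function names here are illustrative, not part of any particular pipeline):

```python
import json

# Raw LLM response, verbatim from the batch above.
raw_response = """[
  {"id": "rdc_dy4rwfz", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_dy4epou", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_dy4f5pg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "resignation"},
  {"id": "rdc_dy4pyyx", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "rdc_dy4nl6o", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

def coding_for(raw: str, comment_id: str) -> dict:
    """Parse a batch response and return the coding record for one comment id."""
    records = json.loads(raw)
    by_id = {rec["id"]: rec for rec in records}  # index once, look up by id
    return by_id[comment_id]

coding = coding_for(raw_response, "rdc_dy4rwfz")
print(coding["responsibility"], coding["emotion"])  # company outrage
```

The same parse also makes it easy to spot low-information codings: in this batch, two of the five records come back "unclear" on at least three dimensions.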