Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I always thought that medical and legal were going to be the first to be automated -- because, as tough as they are for people -- they primarily are procedural and based on diagnostics and memory of how rules might apply. The perfect task for an algorithm. When people saw that the first automation was landing on writers and artists -- they dismissed that, thinking; "well, that's usually low paid work." But doing good art is much more difficult than procedural technical -- especially for traditional programming. So Chat GPT doing well at medicine and legal work is absolutely no surprise and we have to start the discussion of "what happens next"? What happens after most all tasks people might do are done better than average by automation? It's only something we should worry about if there isn't an equal "de-valuation" of ownership. And since that will be the hardest nut to crack -- I think there is something to worry about.
Source: reddit | Topic: AI Responsibility | Unix timestamp: 1684466051.0
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_jkq55h9", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_jkti2bz", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "resignation"},
  {"id": "rdc_jkpnab5", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_jkpulei", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "rdc_jkqe0in", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]
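Since the raw model output is a JSON array of per-comment code records, it can be parsed and indexed by comment id. The following is a minimal sketch, assuming the response is valid JSON as shown above; the variable names (`raw_response`, `by_id`) are illustrative and not part of any pipeline described here.

```python
import json

# Raw model output as returned for this batch (copied verbatim from above).
raw_response = '''
[ {"id":"rdc_jkq55h9","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_jkti2bz","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
  {"id":"rdc_jkpnab5","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"rdc_jkpulei","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"rdc_jkqe0in","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"} ]
'''

# Parse the array and index records by comment id for direct lookup.
records = json.loads(raw_response)
by_id = {r["id"]: r for r in records}

# Retrieve the codes assigned to a single comment in the batch.
codes = by_id["rdc_jkq55h9"]
print(codes["responsibility"], codes["policy"], codes["emotion"])
# -> none none indifference
```

One comment id corresponds to one record; a mismatch between a record here and the rendered table (e.g. a reasoning value) would indicate the table was coded in a separate pass from this raw response.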