Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Multiple things can be true. It likely will impact entry level jobs that don't require critical thinking but many people use as a way to get their feet wet with actual thinking work. Apple's study showed it's basically worthless for anything more than that. It possesses next to no actual reasoning ability in its current form. Until that changes it won't pose a major risk to any job that requires critical thought because it has none. Essentially I don't doubt it can take over jobs without any need for reasoning. My fear of it taking over jobs that do has plummeted in the past few years as I've continued using it and seen studies testing its abilities. There may be a breakthrough at some point but I find it just as likely this current iteration of AI will never make that leap. It will likely require a completely new type of AI and I have no idea when that will occur. As time passes I view this more and more like the belief people had in the mid 2010s that cars would be reliably and fully autonomous by the early 2020s. How'd that one work out?
reddit AI Moral Status 1750965747.0 ♥ 8
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_mzy7ii3", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_mzzqgze", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_n00wnmq", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_mzxqwlu", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "rdc_mzxx2ne", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
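As a minimal sketch, a raw response of this shape can be parsed back into per-dimension codes with the Python standard library. The field names come directly from the JSON above; the indexing and tallying shown are just one plausible way to consume it, not part of the original tool.

```python
import json
from collections import Counter

# Raw LLM response in the shape shown above (one code record per comment id)
raw = '''[
  {"id": "rdc_mzy7ii3", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_mzzqgze", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_n00wnmq", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_mzxqwlu", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "rdc_mzxx2ne", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''

records = json.loads(raw)

# Index the codes by comment id so a single comment's coding can be looked up
by_id = {r["id"]: r for r in records}

# Tally one dimension (emotion) across the batch
emotion_counts = Counter(r["emotion"] for r in records)

print(by_id["rdc_mzxqwlu"]["emotion"])  # fear
print(dict(emotion_counts))             # {'indifference': 3, 'fear': 2}
```

Looking up by id is what matches a code record back to the comment it describes; the `Counter` is only there to show a quick sanity check over the batch.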