Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Failing at what? Generating a functional product? That was never their intent. The only purpose of "AI" companies is to take money from ignorant people. LLMs and "AIs" are not just different things, there is also no path for an LLM to magically become an actual AGI through refinement or additional processing power and memory. Everyone working in LLM SaaS businesses knows this, and understand their company is selling snake oil. I'm more concerned that MIT thinks 5% of them are succeeding. This article is the tech equivalent of "95% of Essential Oil Brands are failing to cure Cancer"
reddit AI Responsibility 1755604830.0 ♥ 8
Coding Result
Dimension       Value
Responsibility  company
Reasoning       virtue
Policy          none
Emotion         outrage
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_n9ij4s0", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_n9jvyjf", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_n9m5pif", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "rdc_n9mczs3", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_n9mfo4x", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
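A batch response in this shape can be parsed and screened before the values are stored against comments. The sketch below is a minimal, hypothetical validator: the allowed value sets per dimension are assumed from the labels visible in this response, not taken from the project's actual codebook, and the function name `parse_batch` is invented for illustration.

```python
import json

# Assumed allowed values per dimension, inferred from the sample
# response above; the real codebook may differ.
ALLOWED = {
    "responsibility": {"company", "government", "user", "none"},
    "reasoning": {"virtue", "consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "indifference", "fear", "hope"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only records whose
    value for every dimension appears in the assumed codebook."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

raw = (
    '[{"id":"rdc_n9ij4s0","responsibility":"company",'
    '"reasoning":"virtue","policy":"none","emotion":"outrage"},'
    '{"id":"rdc_bad","responsibility":"martians",'
    '"reasoning":"virtue","policy":"none","emotion":"outrage"}]'
)
valid = parse_batch(raw)
print(len(valid), valid[0]["id"])  # 1 rdc_n9ij4s0
```

Records with out-of-codebook values are dropped rather than corrected, so a stricter pipeline could instead log them for manual review.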