Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is what I worry about when people talk about using ChatGPT for therapy. It can give okay general mental health and relationship advice comparable to what you'd get if you Googled "anxiety reduction" or "friend breakups" and clicked the first result. It can help people clarify their thoughts in the way a non-professional friend who is a good listener might. But it doesn't have the perceptive abilities of a therapist and it doesn't remotely have the ability to help people in a specialized way with serious psychiatric problems in the long term. That problem is exacerbated by the fact that it's overconfident, so people treat it as though it has some special claim to knowledge, whereas if a friend suddenly started talking about soul names and spiritual elites, most people would tell that friend to seek help.
reddit AI Moral Status 1750120496.0 ♥ 39
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id":"rdc_my6hy6c","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"rdc_my6r4a6","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"rdc_my7dqnc","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"rdc_my87n6q","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"rdc_my91umw","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
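The raw response is a JSON array with one object per coded comment, keyed by `id`, with one field per coding dimension. A minimal sketch of how such a batch can be parsed and a single comment's coding looked up (the variable names and the id-based lookup are illustrative assumptions, not the pipeline's actual code; the excerpt uses the first two records from the response above):

```python
import json

# Raw LLM batch response: one object per coded comment (excerpt).
raw_response = '''[
  {"id":"rdc_my6hy6c","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"rdc_my6r4a6","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]'''

records = json.loads(raw_response)

# Index by id so one comment's coding can be retrieved directly.
by_id = {r["id"]: r for r in records}

# Print the coding for the comment shown above, dimension by dimension.
coding = by_id["rdc_my6hy6c"]
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension}: {coding[dimension]}")
```

Matching coded records back to comments by `id` like this is what lets a single batch response fill in the per-comment coding tables shown on pages such as this one.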