Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Spiritual people are more susceptible to delusions, because they already are deluded. If you tell the model you belive in spirits or whatever, it should tell you that there are insufficient data to support such a belief. But it can't tell you that because spirituality is a culturally acceptable form of mass delusion.  But since it's all nonsense, where is the line to be drowed? Which level of spiritual delusion is too much delusion? What you people ask for is impossible, the model cannot protect you from delusions, and not contraddict your spiritual belief at the same time.
Source: reddit · AI Moral Status · 1750141045.0 · ♥ 2
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         outrage
Coded at        2026-04-25T08:06:44.921194
Raw LLM Response
[
  {"id": "rdc_my67wo8", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "rdc_my7x015", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_mycrs0b", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_myl5hjc", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_myrkjaz", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]
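The raw response is a JSON array with one object per coded comment. A minimal sketch of pulling one record out of such a batch, assuming the response text is available as a string; the helper name `code_for` is hypothetical, and the id `rdc_my7x015` is inferred from the fact that its values match the coding result shown for this comment:

```python
import json

# Raw batch response copied from above: one JSON object per coded comment.
raw = '''
[ {"id":"rdc_my67wo8","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}, {"id":"rdc_my7x015","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}, {"id":"rdc_mycrs0b","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"}, {"id":"rdc_myl5hjc","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}, {"id":"rdc_myrkjaz","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"} ]
'''

def code_for(records, record_id):
    """Return the coded dimensions for one comment id, or None if absent."""
    return next((r for r in records if r["id"] == record_id), None)

records = json.loads(raw)
# "rdc_my7x015" is the record whose values match the Coding Result table.
rec = code_for(records, "rdc_my7x015")
print(rec["emotion"])  # outrage
```

Batching several comments per call and matching them back by id is what makes a display like this possible: the page shows one comment's codes, while the raw response retains the whole batch.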