Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
Kinda ironic that our sense of self is a hallucination. It makes us believe that we have a fixed self and a central "I", and may reinforce that belief linguistically (e.g. through an inner narrator). Similar to what AI does when it claims it has a self.
reddit · AI Moral Status · 1749754561.0 · ♥ 79
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           unclear
Emotion          indifference
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_mxgbh0a", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_mxfodsy", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_mxgmuii", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "rdc_mxfi5m2", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_mxfjcwf", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
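The raw response above is a JSON array of coding records, one per comment, each carrying an `id` plus the four dimensions shown in the table. A minimal sketch of how such a response could be parsed and sanity-checked — the validation rules here are assumptions based only on the fields visible in this output, not the pipeline's actual code:

```python
import json

# Two records copied from the raw response above (truncated for brevity).
raw = (
    '[ {"id":"rdc_mxgbh0a","responsibility":"none","reasoning":"unclear",'
    '"policy":"unclear","emotion":"indifference"},'
    ' {"id":"rdc_mxgmuii","responsibility":"none","reasoning":"unclear",'
    '"policy":"unclear","emotion":"approval"} ]'
)

# The four coding dimensions seen in this document's result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

records = json.loads(raw)
for rec in records:
    # Every record must carry an id plus all four dimensions.
    missing = [d for d in DIMENSIONS if d not in rec]
    if "id" not in rec or missing:
        raise ValueError(f"malformed record: {rec!r} (missing {missing})")
    print(rec["id"], {d: rec[d] for d in DIMENSIONS})
```

A check like this catches the common failure mode of LLM-as-coder pipelines: a response that parses as JSON but drops or renames a dimension for some records.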