Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Compassion is a pretty specific evolved state which likely involves conflicting motivators, where I'd guess that the empathy / theory of mind part of the brain used to guess what other beings are thinking, and the 'don't kill the other parts of your species or harm yourself' part of the brain which social species need to have to some degree, are perhaps overpowering the 'be violent with this threat' part of the brain. I very much doubt AI will have those specific things to lead to compassion, unless we're smart enough to train them as a social species with peers. Even then, human compassion barely extends to other humans outside of visual range (and often doesn't extend to people in visual range for some humans), and even less to other species even in visual range, so we'd need to do better than a billion years of evolution did with biological social species.
reddit · AI Responsibility · 1709801019.0 · ♥ 11
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         mixed
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id":"rdc_ktrzdim","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"rdc_kts7m1t","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"rdc_ktqjqew","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"rdc_ktqeybe","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"rdc_ktqp49c","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
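A minimal sketch of how such a raw response can be parsed back into per-comment codings, assuming the model output is valid JSON exactly as shown above (the id values and dimension names are taken from the raw response; the indexing-by-id step is an illustrative choice, not part of the original pipeline):

```python
import json

# Raw LLM response copied verbatim from above: a JSON array where each
# object codes one comment on four dimensions.
raw = '''[
  {"id":"rdc_ktrzdim","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"rdc_kts7m1t","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"rdc_ktqjqew","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"rdc_ktqeybe","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"rdc_ktqp49c","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]'''

codes = json.loads(raw)

# Index the codings by comment id so one comment's dimensions can be
# looked up directly.
by_id = {c["id"]: c for c in codes}

print(len(codes))                          # → 5
print(by_id["rdc_ktqp49c"]["emotion"])     # → mixed
```

In practice a loader like this would also validate that every object carries all four dimensions before writing the values into the coding table.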