Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I see your point about loneliness being a deeper societal issue, and I agree that shaming people doesn’t solve anything. But it’s also worth acknowledging that AI companionship can create its own set of risks. These systems aren’t neutral; they’re designed to keep you engaged, and if you start relying on them too heavily, they can subtly replace the push and pull of real human connection with a controlled, frictionless imitation. That might feel safe in the short term, but over time it can make real-life relationships seem even harder, deepen isolation, and leave you more dependent on a product that exists to monetize your loneliness.
Source: reddit · AI Moral Status · timestamp 1754771593.0 · ♥ 1
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id":"rdc_n7tvp3l","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"rdc_n7tvds2","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"rdc_n7tzdus","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"rdc_n7u9jfx","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"rdc_n7uerea","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
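To inspect a batch response like the one above, the JSON array can be parsed and indexed by comment id so that any single comment's codes are easy to look up. The sketch below is a minimal example, assuming the four dimension field names shown in the raw response (`responsibility`, `reasoning`, `policy`, `emotion`); the function name `index_codes` is hypothetical, not part of the pipeline.

```python
import json

# A two-record excerpt of the raw batch response logged above.
raw = '''[
  {"id":"rdc_n7tvp3l","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"rdc_n7tzdus","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# Dimension names taken from the raw response; the coding scheme's
# full value sets are not shown here, so only presence is checked.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_response: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment id."""
    records = json.loads(raw_response)
    indexed = {}
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id')} is missing {missing}")
        indexed[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return indexed

codes = index_codes(raw)
print(codes["rdc_n7tzdus"]["policy"])  # → regulate
```

Looking up `rdc_n7tzdus` reproduces the coding-result table for the comment above (responsibility = company, reasoning = consequentialist, policy = regulate, emotion = fear).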