Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There's *plenty* of alternatives. Heck, there's a veritable *buffet* of LLMs out there nowadays. Mistral's models are surprisingly solid. All of the models coming out of China are pretty insane as well (Kimi-2.5, Qwen-3.5, etc). OpenRouter/Chutes/etc have whacktons of models to try. And if you're really concerned about it, *just host your own*. Pretty much every consumer grade computer nowadays can easily run at least a 7B model. Granted, you're not going to get anywhere near SOTA cloud models without investing money, but all of that data stays entirely on your computer.
Source: reddit · Topic: AI Responsibility · Timestamp: 1772301117.0 · ♥ 6
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_oe51w3r", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_o7w54jy", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "rdc_o7vxq8t", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "rdc_o7wxd3d", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_o7w0wjm", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
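The raw response is a JSON array, one object per coded comment, so it can be checked programmatically. A minimal Python sketch of parsing it and looking up one record; note that mapping the id rdc_o7wxd3d to the comment shown above is an assumption, inferred from its values matching the Coding Result table:

```python
import json

# Raw LLM response as shown above: a JSON array of per-comment codes.
raw = (
    '[{"id":"rdc_oe51w3r","responsibility":"company","reasoning":"mixed",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"rdc_o7w54jy","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"rdc_o7vxq8t","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"rdc_o7wxd3d","responsibility":"user","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"rdc_o7w0wjm","responsibility":"company","reasoning":"consequentialist",'
    '"policy":"none","emotion":"outrage"}]'
)

# Index the batch by record id for direct lookup.
records = {rec["id"]: rec for rec in json.loads(raw)}

# Assumed id for the comment above, based on the Coding Result values.
code = records["rdc_o7wxd3d"]
print(code["responsibility"], code["reasoning"])  # user consequentialist
```

Indexing by id rather than list position keeps the lookup stable even if the model returns the records in a different order.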