Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Definitely love open source options! Op, set yourself up with a local or hosted chat app like open WebUI or t3 chat or typingmind. Then you can use something like openrouter for access pretty much any model you want. I even checked just now and all the OpenAI models you mentioned are still available in the api. You’ll probably save money doing it that way anyway.
reddit · AI Responsibility · 1754658351.0 · ♥ 13
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           none
Emotion          approval
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_n7lc74u", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_n7leyfl", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "rdc_n7mly7k", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "rdc_n7mo3ji", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "rdc_n7lavjv", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"}
]
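The raw response is a JSON array covering a batch of comments, keyed by comment id. A minimal sketch of extracting the coding for one comment (Python; the field names are taken verbatim from the response above, but the lookup helper is an illustration, not part of the coding pipeline):

```python
import json

# Raw LLM response, copied verbatim from the batch above.
raw = """[
  {"id": "rdc_n7lc74u", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_n7leyfl", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "rdc_n7mly7k", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "rdc_n7mo3ji", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "rdc_n7lavjv", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"}
]"""

# Index the batch by comment id for O(1) lookup of any coded comment.
codes = {row["id"]: row for row in json.loads(raw)}

# Pull out the coding for a single comment id.
coding = codes["rdc_n7leyfl"]
print(coding["emotion"])         # → approval
print(coding["responsibility"])  # → none
```

Indexing by `id` is what lets a per-comment view like the table above be reconstructed from one batched model response.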