Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If no possible way to install a local LLM, how would you go about this? I use it for work, synthesising, texts, emails, searching through certain documents, it honestly saves me plenty of time, but I always double check, I have ChatGPT and Perplexity opened side by side and I disable (data sharing and training) if that's true..
Source: reddit · AI Surveillance · 1768675681.0 · ♥ -6
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         approval
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_o05grm4", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "rdc_o0aggex", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "rdc_o074uwc", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "rdc_ohlrp7g", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_ohqi33w", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]
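The coding result shown above is one record extracted from this batched JSON response: the model codes several comments at once, and each comment's dimensions are looked up by its `id`. A minimal sketch of that lookup, using the response data verbatim (the `coding_for` helper is illustrative, not the project's actual pipeline code):

```python
import json

# Raw LLM response, copied verbatim from the record above.
RAW_RESPONSE = (
    '[{"id":"rdc_o05grm4","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"rdc_o0aggex","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"resignation"},'
    '{"id":"rdc_o074uwc","responsibility":"user","reasoning":"deontological",'
    '"policy":"none","emotion":"fear"},'
    '{"id":"rdc_ohlrp7g","responsibility":"company","reasoning":"deontological",'
    '"policy":"liability","emotion":"outrage"},'
    '{"id":"rdc_ohqi33w","responsibility":"user","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"mixed"}]'
)

def coding_for(raw_response: str, record_id: str) -> dict:
    """Return the coded dimensions for one comment id from a batch response."""
    records = json.loads(raw_response)
    by_id = {r["id"]: r for r in records}
    return by_id[record_id]  # raises KeyError if the model skipped this comment

# The table above corresponds to the first record in the batch:
print(coding_for(RAW_RESPONSE, "rdc_o05grm4"))
```

Indexing by `id` rather than by position makes the lookup robust if the model returns records out of order or drops one, in which case the `KeyError` flags the missing comment for re-coding.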