Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You’re never going to plug all the security holes in co-pilot because you’ll never know what all of them are. Other people will likely find some before you plug them. Additionally, every mitigation measure you add, you’ll probably create new problems or at least make it less efficient.
reddit · Cross-Cultural · 1768414564.0 · ♥ 13
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_nzpee8x", "responsibility": "none",    "reasoning": "unclear",          "policy": "unclear",  "emotion": "indifference"},
  {"id": "rdc_nzky5ul", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "rdc_nzluut3", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear",  "emotion": "mixed"},
  {"id": "rdc_nzkcc6r", "responsibility": "user",    "reasoning": "deontological",    "policy": "none",     "emotion": "outrage"},
  {"id": "rdc_nzkdck7", "responsibility": "user",    "reasoning": "consequentialist", "policy": "none",     "emotion": "resignation"}
]
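To check that a coding result shown on this page matches the raw model output, one can parse the JSON array and look up the record by its id. A minimal Python sketch, using the raw response above (the variable names here are illustrative, not part of the tool):

```python
import json

# Raw LLM response as returned by the model: a JSON array of coded records.
raw = """[
  {"id":"rdc_nzpee8x","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"rdc_nzky5ul","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"rdc_nzluut3","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"rdc_nzkcc6r","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"rdc_nzkdck7","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]"""

records = json.loads(raw)

# Index the batch by comment id so a single coded comment can be retrieved.
by_id = {r["id"]: r for r in records}

# The comment shown above was coded under id rdc_nzky5ul.
coded = by_id["rdc_nzky5ul"]
print(coded["responsibility"], coded["reasoning"], coded["policy"], coded["emotion"])
# → company consequentialist regulate fear
```

The printed dimensions match the coding-result table above, confirming the displayed values were taken from this record of the batch response.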