Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
“an attacker would simply have to have a user open a phishing link, which would then initiate a multi-stage prompt injected using a "q parameter." Once clicked, an attacker would be able to ask Copilot for information about the user and send it to their own servers.” Come on. Phishing?? Fckin click bait.
reddit · Cross-Cultural · 1768408670.0 · ♥ 35
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         outrage
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_nzpee8x", "responsibility": "none",    "reasoning": "unclear",         "policy": "unclear",  "emotion": "indifference"},
  {"id": "rdc_nzky5ul", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "rdc_nzluut3", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear",  "emotion": "mixed"},
  {"id": "rdc_nzkcc6r", "responsibility": "user",    "reasoning": "deontological",    "policy": "none",     "emotion": "outrage"},
  {"id": "rdc_nzkdck7", "responsibility": "user",    "reasoning": "consequentialist", "policy": "none",     "emotion": "resignation"}
]
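When inspecting raw responses like the one above, it can help to validate them programmatically before trusting the coded dimensions. Below is a minimal sketch, assuming the raw response is a JSON array of records with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys (as in the example above); the `parse_coding` helper and its error handling are illustrative, not part of the tool.

```python
import json

# Example raw LLM response: one record copied from the output above.
raw = ('[{"id":"rdc_nzkcc6r","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')

# The four coding dimensions each record is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding(raw_text):
    """Parse a raw LLM response and index the records by comment id.

    Raises ValueError if any record is missing a coding dimension,
    so malformed model output is caught rather than silently coded.
    """
    coded = {}
    for rec in json.loads(raw_text):
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

coded = parse_coding(raw)
print(coded["rdc_nzkcc6r"]["emotion"])  # outrage
```

A lookup by id then reproduces the Coding Result table for any single comment.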