Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There are no real images of Taylor Swift getting raped, and yet an image generator is able to generate thousands of images of Taylor Swift getting raped. This specific case aside, this kind of garbage is baked into the technology.

> People have been doing this kind of horrible shit with Photoshop for a lot longer than AI.

Technically, someone who builds a car by hand over a month is doing the same thing as a worker who is part of a factory team that assembles ten thousand cars in a month. But we don't pretend scale doesn't exist and that these things aren't entirely different beasts.

> Blame the man, not the tool.

Perfectly fine to blame both.
Source: reddit · AI Harm Incident · 1730126498.0 · ♥ 28
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_lubhlbn", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_lu6l1yp", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_lu6o6an", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_lu7rj96", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_lu5txto", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
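Since the raw response is a JSON array with one coding object per comment id, it can be parsed and indexed directly. A minimal sketch of that lookup (the variable names are illustrative, not part of the tool):

```python
import json

# The raw LLM response shown above, verbatim: a JSON array of coding
# objects, one per comment id.
raw_response = """
[
 {"id":"rdc_lubhlbn","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"rdc_lu6l1yp","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"rdc_lu6o6an","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"rdc_lu7rj96","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"rdc_lu5txto","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
"""

# Index the codings by comment id so any dimension can be looked up.
rows = {r["id"]: r for r in json.loads(raw_response)}

# Look up one coding by its id.
print(rows["rdc_lu6l1yp"]["emotion"])  # -> indifference
```

This kind of id-keyed lookup is how a per-comment coding table (Dimension/Value, as above) can be rendered from a single batched response.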