Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
.......oh, so the way it's called put, it was more of a case of "non consentual pornography". Because it started with real pictures of people, that were transformed. But I would think that argument could be stretched for anything with AI then. Because AI will have looked at 10,000 pictures of boobs, to know what boobs look like. So even though you might have it generate a "topless girl with boobs", it's still basing that off of all of the previous pictures it looked it .
reddit · AI Harm Incident · 1730127639.0 · ♥ 9
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       consequentialist
Policy          unclear
Emotion         mixed
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id":"rdc_lu6op2t","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"rdc_lu6h3f3","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"rdc_lu6j07b","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"rdc_lu6jr1y","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"rdc_lu70mqn","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
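The raw response is a JSON array of per-comment codings keyed by "id". A minimal sketch of how such a batch response could be parsed and the coding for one comment looked up (the function name `coding_for` and the truncated two-row sample are illustrative, not part of the tool):

```python
import json

# Abbreviated sample of a raw LLM response, in the same shape as above.
raw = '''[
  {"id": "rdc_lu6op2t", "responsibility": "unclear", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "mixed"},
  {"id": "rdc_lu6h3f3", "responsibility": "none", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "resignation"}
]'''

def coding_for(raw_json: str, comment_id: str) -> dict:
    """Parse a batch coding response and return the row for one comment id."""
    by_id = {row["id"]: row for row in json.loads(raw_json)}
    return by_id[comment_id]  # raises KeyError if the id was not coded

print(coding_for(raw, "rdc_lu6op2t")["emotion"])  # mixed
```

Indexing by "id" rather than list position keeps the lookup correct even if the model returns the codings in a different order than the comments were submitted.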