Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
So I actually did some testing with this. Using already nude photos, I selected areas for the AI to edit. All the AI does is apply a generic "nude" image to the area you select, and it places it in the right place. It has no awareness of whether or not the source image is nude. It doesn't use features from the source image. It takes the area you selected and puts in "boob". The resulting image has no correlation to the source's body. It's pretty much just cutting out the face and placing it on a "generic" nude image. I will admit, the blending and shading is done well. But if you even vaguely know what image was edited, you'll know the output looks like an AI garbled mess. You can probably do a better job with free Photoshop alternatives. You can put a picture of a dude in there, and it'll spit out a dude with the same boobs, same stomach, same everything as if you used a photo of a woman. Deep fakes haven't progressed much from those first big news videos (I think it was of Emma Watson?) that went up on Pornhub years ago. All that's changed is the ease of being able to do it. It's still a mess of uncanny valley looking Pixar characters.
reddit AI Harm Incident 1695592163.0 ♥ 4
Coding Result
Dimension        Value
Responsibility   user
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_k20brfw", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_k20ei2r", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_k21xsfz", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_k1zxiib", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "rdc_k20owoa", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]