Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is actually really interesting. Could it possibly be a solution to preventing people from using images and videos of real people? If they can satisfy their urges with fake shit, couldn’t that be helpful? EDIT: I realize that this specific article is talking about images of real children being doctored. I’m specifically curious about AI that generates images of completely fake people. If pedophiles were given some kind of outlet for their urges, could this have the potential to curb those urges so that they don’t act out on real people? It’s really easy to just demonize pedophiles (for obvious reasons) and to have zero empathy for their struggles because we’re only interested in proving how repulsed we are by CP, but child sexual abuse has been going on since time immemorial and it’s getting worse. We really need to figure out how to solve this issue, not by shame and punishment but through empathy, understanding and coming up with real solutions. Unfortunately, this involves talking about it objectively and seeing pedophiles as human beings who need help.
reddit · AI Harm Incident · 1695568866.0 · ♥ 117
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       utilitarian
Policy          none
Emotion         approval
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_k20ulhj", "responsibility": "unclear", "reasoning": "deontological",   "policy": "regulate",  "emotion": "indifference"},
  {"id": "rdc_k20brve", "responsibility": "unclear", "reasoning": "deontological",   "policy": "regulate",  "emotion": "indifference"},
  {"id": "rdc_k226z8l", "responsibility": "user",    "reasoning": "consequentialist","policy": "unclear",   "emotion": "resignation"},
  {"id": "rdc_k2214xq", "responsibility": "user",    "reasoning": "deontological",   "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_k203lxz", "responsibility": "unclear", "reasoning": "consequentialist","policy": "none",      "emotion": "approval"}
]
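A raw response in this shape can be parsed into per-comment coding records and looked up by comment id. The sketch below is a minimal illustration, assuming the raw string is valid JSON with the field names shown above (the variable names `raw` and `records` are our own, not part of the tool):

```python
import json

# Raw LLM response as returned by the coding model (copied from above).
raw = """[
  {"id":"rdc_k20ulhj","responsibility":"unclear","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"rdc_k20brve","responsibility":"unclear","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"rdc_k226z8l","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"rdc_k2214xq","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"rdc_k203lxz","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]"""

# Parse the batch response and index each coding record by its comment id.
records = {rec["id"]: rec for rec in json.loads(raw)}

# Retrieve the coding for the comment shown on this page.
coding = records["rdc_k203lxz"]
print(coding["reasoning"], coding["emotion"])  # consequentialist approval
```

Indexing by `id` makes it easy to join the model's coding back onto the original comments when a single prompt codes several comments at once.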