Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Within our life times it will be impossible to tell if CSAM is created by an AI or not. Currently we prosecute people for possession of CSAM even when we can't locate the child in the image, that simply won't be possible in a world where it is legal to possess AI generated CSAM. Even if the victim is real, you couldn't prove beyond a reasonable doubt that the person in possession of the images knew that, they could have thought they were looking at perfectly legal AI creations. There's a huge legal can of worms opened by allowing photorealistic depictions of CSAM.
reddit AI Harm Incident 1695589552.0 ♥ 3
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          ban
Emotion         fear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_k20eafg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "rdc_k20hk66", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_k1zwre5", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "rdc_k21qrce", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "rdc_k1zxpcu", "responsibility": "unclear", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"}
]
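The raw response is a batched JSON array covering several comments, keyed by `id`. A minimal sketch of how such a response could be parsed back into per-comment codes (the variable names here are illustrative, not the pipeline's actual code; the JSON is copied verbatim from the response above):

```python
import json

# Raw batched response, verbatim from the model output above.
raw = """[
  {"id":"rdc_k20eafg","responsibility":"unclear","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"rdc_k20hk66","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_k1zwre5","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"rdc_k21qrce","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"rdc_k1zxpcu","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]"""

# Index the batch by comment id so each coded comment can be looked up.
codes = {record["id"]: record for record in json.loads(raw)}

# The comment shown above corresponds to id rdc_k21qrce.
print(codes["rdc_k21qrce"]["policy"])   # ban
print(codes["rdc_k21qrce"]["emotion"])  # fear
```

Indexing by `id` also makes it easy to detect missing or duplicated codes when the batch size does not match the number of comments sent.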