Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You can easily go down to pixel compression and stretching. Once you alter an original image it will get distorted. You make an algorithm to find and compare images down to a pixel which is not such a hard task. Deepfakers take images available online and usually don't have high profile people posing for them to create new ones. Even if they did you can detect distortions in image with deepfake, so I believe detection systems will be ahead of deepfake for a while.
reddit AI Harm Incident 1651309292.0
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[{"id":"rdc_i6saw8r","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"rdc_i6rh5yy","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"rdc_i6rk4r2","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"rdc_i6rolhn","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"rdc_i6rrm9g","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"})
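Note that every coded dimension came back "unclear", which is consistent with the raw response above being unparseable: the model closed its JSON array with ")" instead of "]". A minimal sketch of how a coding pipeline might fall back to "unclear" on a parse failure — the helper name, fallback policy, and truncated sample payload are assumptions for illustration, not the actual pipeline:

```python
import json

# Abbreviated reproduction of the raw output above: a JSON array whose
# closing "]" the model replaced with ")", which makes json.loads fail.
RAW = ('[{"id":"rdc_i6saw8r","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"})')

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def code_comment(raw: str, comment_id: str) -> dict:
    """Extract one comment's coding; mark every dimension unclear on failure."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        # Malformed JSON: nothing recoverable, so code all dimensions unclear.
        return {dim: "unclear" for dim in DIMENSIONS}
    for rec in records:
        if rec.get("id") == comment_id:
            return {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    # Comment id absent from the response: also fall back to unclear.
    return {dim: "unclear" for dim in DIMENSIONS}


print(code_comment(RAW, "rdc_i6saw8r"))
# → {'responsibility': 'unclear', 'reasoning': 'unclear',
#    'policy': 'unclear', 'emotion': 'unclear'}
```

Under this assumed fallback, the stray ")" alone would produce exactly the all-"unclear" row shown in the Coding Result table.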