Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Something I predict coming is a change to several laws, including those related to underage porn and nude photos. Not just because it's easy to make fake ones now on purpose, and so many children *themselves* are making them, but because you could *accidentally* make them now when trying to make something else using certain models on stuff like stable diffusion. It also severely blurs the line of age. Look at some of the nudes on civitai and you'll see plenty of teen focused stuff, and then some where it's clear the line is getting blurred. But there is no real person behind the picture, so there is no birth certificate and no real age. So does the person who made the picture get in trouble for making it? Who decides which pictures qualify? It's a mess.
Source: reddit · Dataset: AI Harm Incident · Timestamp: 1706233996.0 · ♥ 3
Coding Result
Dimension       Value
Responsibility  government
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_kjlh21m", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "rdc_kjllb69", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_kjk6z3b", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "rdc_kjkab6r", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_kjkjs1x", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
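The raw response is a JSON array of per-comment records, so recovering the coding for one comment is a matter of matching its `id`. A minimal sketch of that lookup, assuming the field names shown in the response above; the `code_for` helper is hypothetical, not part of the coding pipeline:

```python
import json

# Two records copied from the raw batch response above.
raw = '''[
  {"id": "rdc_kjlh21m", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "rdc_kjkab6r", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]'''

def code_for(comment_id, raw_json):
    """Return the coded dimensions for one comment id, or None if absent."""
    records = json.loads(raw_json)
    return next((r for r in records if r["id"] == comment_id), None)

print(code_for("rdc_kjlh21m", raw)["emotion"])  # fear
```

Note that a single batch response codes several comments at once, so only the record whose `id` matches the displayed comment applies to it.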