Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is certainly a more than rocky situation. I really like the ideas of what AI can do, and I am an artist myself. But it cannot be understated how unethical this can all be. I do think this situation will eventually call fair use into strict scrutiny, and that's as exciting as it is scary. On a lesser note, I have a strange opinion on AI art that I've gotten a lot of flack for. I miss crappy AI art. Like the new stuff is exciting and scary with how close it is to human art, but there's a novelty in obviously computer-generated work that people don't seem to appreciate. "Why would you want a bad drawing? That's not the point of AI art." But it's funny. It's weird. Interesting. There's something so endearing about a blurry mess of an image that vaguely resembles the style of a courtroom drawing, with a just barely recognisable silhouette of an approximation of Goku's hair standing behind what could be a desk. That's a level of abstract art that is hard to find in humans, and harder to replicate. Humans suck at making things in a truly random fashion, but computers are great at it, despite being unable to actually make truly random results. That's cool, and much more fun to me than remaking the Mona Lisa in Walt Disney's artstyle pixel-by-pixel.
youtube 2023-02-17T04:2… ♥ 109
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        deontological
Policy           regulate
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzRybU8KH37uKtVGSV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzO3eyo3JWiPU70m8p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzorPpzFvHF-ul7gvN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxhZ3rfCxqLdZm_XRt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy3XYLgpxdJQ1E4BcN4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzrqs4gtq88Ixk9l7t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwCDJYJyEdTHS-N7ah4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyWwj5_pV_kX3rsvk54AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugw51SLM09Z3RD2EkLJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzaIuR6WlTzHc8DGd54AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"disapproval"}
]
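Since the raw response codes a whole batch of comments in one JSON array, recovering the coding for a single comment means parsing the array and looking up its id. The sketch below is one minimal way to do that, assuming the response is a well-formed JSON array of objects with the five fields shown above; the function name `index_codings` and the embedded two-record sample are illustrative, not part of the tool.

```python
import json

# A two-record subset of the raw LLM response shown above, for illustration.
raw_response = """[
  {"id": "ytc_Ugy3XYLgpxdJQ1E4BcN4AaABAg",
   "responsibility": "distributed", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzRybU8KH37uKtVGSV4AaABAg",
   "responsibility": "ai_itself", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"}
]"""

# Every record should carry the comment id plus the four coding dimensions.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a batch coding response and index records by comment id,
    rejecting any record that is missing a coding dimension."""
    records = json.loads(raw)
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} is missing {missing}")
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
# Look up the comment shown on this page by its id.
print(codings["ytc_Ugy3XYLgpxdJQ1E4BcN4AaABAg"]["emotion"])  # -> fear
```

The field check matters in practice: LLM batch output occasionally drops or renames a key, and failing loudly at parse time is easier to debug than a missing value surfacing later in analysis.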