Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The model doesn't know it's drawing watermarks. It just notices that lots of drawings have text-like shapes in the corners and it copies that general form in the same way it copies the general style by learning other general forms. All of this kind of moot, because soon you won't be able to distinguish AI images from human ones, and models will be floating around outside of any legal control. Unless you're going to bust down the door of anyone posting a cool looking image or video, point a gun at them, then demand they draw something and prove they did it, I don't see any way to stop it. Same for music, writing, and basically every job that exists now. Humans will be forced to adapt until there's nothing they can do better than an AI. Or they'll have to branch off into isolated modern Amish societies where everything after 2020 technology is considered the devil's technology. That's assuming whoever has the power of general AI will allow them to exist. Which is why Elon Musk is making brain implants, because it's either integrate or become a human paperweight. To end on a bright note, there is a small chance the first general AI will be good, loving, and perfect, and therefore will allow everyone to be free and do what they want. If we survive, the world will be Star Trek meets Marvel Comics. You'll be able to 3D print a spaceship and fly to Mars with your talking genetically printed pets. If you're into that sort of thing.
youtube 2023-01-12T06:0…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgzhQncG4Jw0IIBrsbV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwDRm6RvABIbQLmYuB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwJSwc48YRMZabirrt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxXsQhu9yi0tbfzLIJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy0ykTwrKBfpu-g0414AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxFFFAnYEbP6RZF3El4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugw10P_5wuExTDhf-J94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx9b2eIbbb0Jww-saV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwunNQP5m3BHv1oa7d4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzsnFI_jcR48erHmdh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
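As a sketch of how a coding result maps back to the raw response: the response is a plain JSON array, so standard tooling can index it by comment id and recover the coded dimensions for any record. The ids and field values below are copied from the response above; the lookup code itself is illustrative, not part of the tool.

```python
import json

# Two records copied verbatim from the raw LLM response above
# (the full array has ten entries).
raw = '''[
  {"id":"ytc_UgxFFFAnYEbP6RZF3El4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugy0ykTwrKBfpu-g0414AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

# Index the array by comment id for direct lookup.
records = {r["id"]: r for r in json.loads(raw)}

# This record's values match the Coding Result table shown above.
coded = records["ytc_Ugy0ykTwrKBfpu-g0414AaABAg"]
print(coded["responsibility"], coded["reasoning"], coded["policy"], coded["emotion"])
```

Running this prints `ai_itself consequentialist none fear`, i.e. the same four dimension values displayed in the Coding Result table.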