Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'll repeat my appeal here, artist are making a huge, gigantic mistake going after AI in a way that will actually hurt them. By fighting for this silly "opt out" battle, you are just digging your own grave. In case you'll succeed, AI companies will be able to claim that AI generated art is 100% legal, hence a "fair" competitor. At the same time, you won't protect *anything,* because AI software allows users to train their own datasets, at home, using whatever they want. So you'll have unfair competition that will crush you, and your work won't be protected anyway. Does this still sound as a "victory"? I don't think so. What we have to do, instead, is figthing to have every output generated by an AI to be *public domain* and *copyright free.* In this way you won't have an unfair competitor. This goes not only for AI generated art, but for AI generated everything. Since the source was public and collected for free, the output must be public and released for free. Simple logic. For example, if tomorrow you'll go to a doctor, and instead you'll find a technician who will input you blood analysis, your DNA and your radiographies in a giant database to fina a cure for you, what do you want that database to be? Public or private? Would it be fair to pay Bayer 1000 bucks for the result when the data was collected for free?!? Please, stop this madness of "Opting Out". It's not just silly and useless, it goes against your own interests.
Source: youtube | Viral AI Reaction | 2022-12-24T22:2… | ♥ 8
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgxQPDXNRTW-sBMOYnt4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxAMBrbuNMA3wOPOcR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwD4-1Fr1HyWUnTF_54AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwIj7GNKemnagMm7mF4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx6xagx_bOz37OSPit4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugxz8RajBfNuEa8CbdN4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgySsxb4qzl3neoJinV4AaABAg", "responsibility": "user", "reasoning": "contractualist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugy6ydcYJoE_tqaWfM14AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzkAuC5dgThxNyzpzd4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgypeLqRa2ydbW6m56N4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]
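The raw response is a JSON array with one record per comment, each carrying an `id` plus the four coded dimensions. A minimal sketch of how such a response could be parsed and matched back to a comment id (the `parse_codes` helper is illustrative, not part of the original pipeline):

```python
import json

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records,
    each with an 'id' field) into an id -> codes mapping."""
    records = json.loads(raw)
    return {record.pop("id"): record for record in records}

# Excerpt from the raw response above: the record for the comment shown.
raw = ('[{"id":"ytc_UgzkAuC5dgThxNyzpzd4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')

codes = parse_codes(raw)
print(codes["ytc_UgzkAuC5dgThxNyzpzd4AaABAg"]["policy"])  # regulate
```

Keying the mapping by `id` makes it straightforward to join the LLM output back onto the original comment table, and any id the model hallucinated or dropped shows up as a key mismatch.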