Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Hello. Thanks for an interesting perspective. I normally support AI generators in general. Of course, currently they are controlled by large corporations with the goal to displace individuals and keep profits for themselves; but we will not fix this problem by getting rid of whatever AI generation software you are thinking of. And here we are having a clash of social needs. If you go back a century or so and look why copyright was in existence you will find that it was there to stimulate production of new art (and sciences). Copyright was reasonable, it was 15 years after publication, and for some works reaching as high as 25 years. This way the artist would gain monopoly for more than a decade, but later other artists can start creating derivative works, generating more and more art. Today's system is insane. Most countries have adapted the bizarre "life + 75 year" rule. Can you answer just one question: How can you incentivise an artist to create more art after they are dead? This rule was created not for human beings, but for the corporations, which own these copyrights. This rule ensures that if you witness something being created, you will not witness it going into the public domain, because you need to 1) kill the artist 2) wait 75 years. I am a poet and a programmer. As a former i release all of my poetry into the public domain right away. As the latter i distribute my works under free licences. So when i come up with somebody who complains about evils of AI, i find them dishonest, unless they also comment on the need to revamp copyright rules and make the term into something sensible.
Source: youtube, "Viral AI Reaction", 2024-11-04T14:3…
Coding Result
Dimension        Value
---------------  --------------------------
Responsibility   company
Reasoning        contractualist
Policy           regulate
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgyMd8RbSEnUXJhChFh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxbEBMYtTR1dnzrJJd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgxLwMI2gLsQyF5FqjR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwkAOtFoAVOVNvzLd14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_Ugz4_mwpA8tmAylj-Ax4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx3NmgZO6I6xwAfJi14AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugyu-l4SGUvXOyQjZIN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgziofpHglQxBhnrOR14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw28z27FcR-uTmP2x94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxGcaMW7bB-1bDbhSx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
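The raw response is a JSON array with one coding record per comment. A minimal Python sketch of how such a batch response might be parsed and matched back to a comment id (the id and field names come from the response above; the variable names are illustrative):

```python
import json

# A one-record excerpt of the raw LLM response shown above.
raw = '''[
  {"id": "ytc_Ugx3NmgZO6I6xwAfJi14AaABAg",
   "responsibility": "company",
   "reasoning": "contractualist",
   "policy": "regulate",
   "emotion": "mixed"}
]'''

# Parse the batch and index the records by comment id for quick lookup.
records = json.loads(raw)
by_id = {record["id"]: record for record in records}

# Retrieve the coding for one comment, as shown in the Coding Result table.
coding = by_id["ytc_Ugx3NmgZO6I6xwAfJi14AaABAg"]
print(coding["responsibility"], coding["policy"])  # company regulate
```

Indexing by id matters because the model returns codings for the whole comment batch at once, so the display layer has to look up each comment's record rather than rely on response order.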