Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Important ethical talk from November 2023 around the time I got into generative AI, although outdated by now from a technical point of view. It's true however that copyright infringement definately will and already is a problem. With all the powerful AI tools and models in upcoming 2026, including my own ones, anyone can train their own complex models on anything you can find and download on the internet, such as vast libraries of text books, photos, images, videos, audio, videogames, anything. We are now entering the limitless and abundant Age of Artificial Intelligence. My personal prediction is that even copyright issues will be replaced by AI where nobody will own anything digital anymore, not even the corporations who provide AI platforms. Ownership will disappear on the internet. Some of us artists will be scared by that and against it ofcourse, including the music and Hollywood film industry, others may see it as an opportunity where anyone can share and use their content with each other to create something bigger. So yeah, it's a critical question if we as humanity want to go that far with all the AI at our disposal. I also guess each country will make up their own laws how they want to manage these important issues, including bias, racism and other negative things included in AI. For instance, India has recently introduced laws where any image uploaded has to be labeled whether it is real or AI-generated.
youtube AI Responsibility 2025-12-02T16:2…
Coding Result
Dimension      | Value
---------------|---------------------------
Responsibility | user
Reasoning      | consequentialist
Policy         | liability
Emotion        | mixed
Coded at       | 2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgzJ3gpi9meq_nlte-14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwA9TEYo4yu-TUicpl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzQvJuXGqdRiye3Wut4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugxrhi_DntOUnp8JMjZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw1nH1S-gANYx3cNRh4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzDj5vbTgFM2phbBB94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwwYOgWiGCDgn5nSEF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgyIR3o6J4Vi75BsYop4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxEI3fKyCOLXnd-3a14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzfEZPCdHd8ak9Ptrl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
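A minimal sketch of how a raw response like the one above could be turned into per-comment codes. This is illustrative, not the tool's actual code: it assumes the LLM returns a JSON array of objects keyed by comment `id`, and the allowed label sets below are only those observed in this response (the real codebook may include more values).

```python
import json

# Label sets observed in the raw response above (assumption: the full
# codebook may contain additional values for each dimension).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "government",
                       "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "resignation",
                "indifference", "mixed"},
}

def parse_codes(raw_json: str) -> dict:
    """Parse the LLM's JSON array into {comment_id: codes}, validating labels."""
    coded = {}
    for record in json.loads(raw_json):
        for dim, allowed in ALLOWED.items():
            if record.get(dim) not in allowed:
                raise ValueError(f"unexpected label {record.get(dim)!r} for {dim}")
        coded[record["id"]] = {dim: record[dim] for dim in ALLOWED}
    return coded

# Example: the record corresponding to the Coding Result table above.
raw = ('[{"id":"ytc_UgwwYOgWiGCDgn5nSEF4AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"mixed"}]')
codes = parse_codes(raw)
print(codes["ytc_UgwwYOgWiGCDgn5nSEF4AaABAg"]["policy"])  # liability
```

Validating labels at parse time catches the common failure mode where the model invents a value outside the codebook, which would otherwise silently corrupt downstream counts.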