Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@freerangesimp If you made and distributed a photocopier that came packaged with 60 pages of a Harry Potter book in its memory and able to be printed on demand, you're claiming that would not be copyright infringement?

"This is no different (in principle) from users of peer to peer networks sharing content. The network enables a broad activity" Yes it is different, as the AI companies specifically LOADED the content on this "network" (in your analogy). Your analogy would only work if the AI companies only created the training algorithm but users provided the data and did the training. Users did not. Anthropic (in this case) chose and specifically put that content into the AI model. Like storing long text blocks in the memory of a photocopier that was then offered for sale.

To be clear, I personally believe copyright law is entirely broken and pretty much all stuff more than 14 years old (including the Harry Potter books) should default to the public domain. So I don't personally give a crap that these models memorized large portions of their training set and can reproduce it on demand.

But under current copyright law, it seems these companies have trained engines that can spit out large portions of copyrighted works verbatim. That's not fair use under any reasonable definition.
youtube · AI Governance · 2026-03-22T00:5…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgzvEKu5YEtMQd_Jno14AaABAg.AUcStaH5xxfAUchO2gAgvF", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_UgzbGTgkQL2qq0nJsi54AaABAg.AUcQkT_dHamAUd1kpX2BHY", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgzbGTgkQL2qq0nJsi54AaABAg.AUcQkT_dHamAUd9fXkau6k", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgzbGTgkQL2qq0nJsi54AaABAg.AUcQkT_dHamAUdBDLkLTad", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgzbGTgkQL2qq0nJsi54AaABAg.AUcQkT_dHamAUdD1kJYM2n", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgxJZB46iYoPrqXI7QF4AaABAg.AUcP5nLkWvkAUkys9vdwEr", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgwIYkAfjFhaX_Jfta54AaABAg.AUcOdEI2sL-AUcPs4bA0jB", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgzSCHxS9STET2nWvn54AaABAg.AUcDZ1-N1m3AUdXMv58puH", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytr_Ugz9QYU3jUueXVkQVxN4AaABAg.AUcA_ATPElTAUcPcEeDo3p", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytr_Ugz9QYU3jUueXVkQVxN4AaABAg.AUcA_ATPElTAUclGsu73_O", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
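A raw response like the one above can be parsed and sanity-checked before it is stored as a coding result. The sketch below is a minimal, hypothetical validator: the label vocabularies are inferred only from the values visible in this dump (the real codebook may allow more), and the `parse_codings` name and the sample `raw` string are illustrative, not part of the actual pipeline.

```python
import json

# Allowed labels per dimension, inferred from values seen in this dump.
# Assumption: the actual codebook may define additional labels.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response into coding records, dropping any record
    that lacks an id or uses an out-of-vocabulary label."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not rec.get("id"):
            continue  # every coding must be tied to a comment id
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

# Two toy records: the second uses an unknown "vendor" label and is dropped.
raw = (
    '[{"id":"ytr_a","responsibility":"company","reasoning":"deontological",'
    '"policy":"liability","emotion":"outrage"},'
    '{"id":"ytr_b","responsibility":"vendor","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"}]'
)
print(len(parse_codings(raw)))  # → 1
```

Dropping rather than repairing out-of-vocabulary records keeps the validator simple; a production pipeline might instead flag such records for re-coding.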