Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
Here are a few things I want to talk about:

1. AI doesn't have to learn like a human.

2. There is nothing wrong with anthropomorphizing AI. John McCarthy in a 1979 paper wrote the following: “To ascribe certain beliefs, knowledge, free will, intentions, consciousness, abilities or wants to a machine or computer program is legitimate when such an ascription expresses the same information about the machine that it expresses about a person. It is useful when the ascription helps us understand the structure of the machine, its past or future behavior, or how to repair or improve it.” … “Ascription of mental qualities is most straightforward for machines of known structure, such as thermostats and computer operating systems, but is most useful when applied to entities whose structure is very incompletely known.”

3. Humans don't need as much data as AI systems for many reasons; the biggest one could be evolution. Evolution is also a type of learning, we already know the basic things from the beginning.

4. “I found it on the internet, so I can use it.” is not a good argument, because it's not an argument at all. It's just making a statement. You made all this video but didn't even argue why it is wrong to train on drawings. Imagine I downloaded a bunch of drawings from the internet and have a machine learning system that learns something basic from them, like the most popular color accents or something. Is it wrong to do that? I personally don't think so. Why can people learn from drawings, but AI shouldn't? That is the main question here. And I'm saying all this as someone who actually draws, and is a copyright abolitionist.
youtube 2024-07-15T20:0… ♥ 2
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[{"id":"ytc_UgxR7bKx7hPeTClh9Ox4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugy0083XmOvbI3fQWLd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgzZoY1pa-_2ayoG-R54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzUhz6LmOTU0aTVMvV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugz0mcfrhrBq9xXCSWF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwsF_CxY2y7BhElGKZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugy-eBxbqKd0EQ9F_Ad4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgxdZShYhyU18HD1Asl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxBhToXCxm7z_U5w-54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxnRf-8D2xc7Lx-WlV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"})
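Note that the raw response above is not valid JSON: the array opens with `[` but is closed with `)`, which would make a strict parser fail and could explain why the coding table records every dimension as unclear despite the response containing codes. A minimal sketch of a tolerant parser for this failure mode (the sample string and the `parse_coding` helper are hypothetical, not part of the actual pipeline):

```python
import json

def parse_coding(raw: str) -> list[dict]:
    """Parse the model's JSON array; repair a ')' used in place of the closing ']'."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        repaired = raw.rstrip()
        # Observed failure mode: array terminated with ")" instead of "]".
        if repaired.endswith(")"):
            repaired = repaired[:-1] + "]"
        return json.loads(repaired)

# Hypothetical truncated sample mimicking the malformed ending above.
sample = '[{"id":"ytc_example","responsibility":"none","emotion":"mixed"})'
rows = parse_coding(sample)
print(rows[0]["id"])  # → ytc_example
```

If the repair also fails, `json.loads` raises `JSONDecodeError` again, so the caller can fall back to marking the comment's dimensions as unclear, which matches what the table above shows.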