Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The issue with AI is that humanity did not agree to it. When a human takes inspiration, it's agreed to be OK. Making a program do that is not the same, because AIs are not people. So it becomes something that was never discussed, something that was never agreed upon. Even if it functioned the same, it wouldn't matter, because the agreement was between people using human abilities. It's like agreeing to share a potato chip with anyone who asks, and then having robots collect that chip on someone's behalf: you agreed to give people potato chips, but they went beyond that. They lost the personal connection that made you happy to give them the chip. Once again, it's all a people problem. Nothing is inherently wrong with AI, and one day it may become a valid tool for artists. However, it is currently not being used in that manner. The technology is taking advantage of pre-existing societal norms and contracts not meant for AI, and the end result is not something you can control properly. As for its actual artistic value: I have tried AI to see its value, and quite frankly, questions of theft and morality aside, it simply is not at the level of allowing humans to make art. It does not create what you want to create; it just makes something "close enough" that isn't. Only when you can create the image in your head with AI will it be a proper tool for art.
Source: youtube · Viral AI Reaction · 2025-03-31T03:5… · ♥ 1
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       contractualist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgzvZU0lVDYN1hM6YH54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxGsiIKJuCyZrIL2u94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxAaot-N0UgV9hHYZR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxd-pP5aMlAE-QJqsd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzerkMN6Vd9P05BcRh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyjQDFDnGze0nsNHHt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwOAb5twNIjEt0ezr94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyfhMSDGZ2Bl9C54yl4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxVvoe2WiMfUlXk7wJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxwvDvU2b6Dp2q2eUd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
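A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example, not the tool's actual pipeline; the allowed label sets are inferred only from the values visible in this response (the real codebook may include more labels), and the `parse_codings` helper name is hypothetical.

```python
import json

# Allowed labels per coding dimension — ASSUMPTION: inferred from the
# values seen in the raw response above; the full codebook may differ.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coding records) and
    keep only records whose values fall inside the known label sets."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

# Usage with a hypothetical two-record response, one valid and one
# carrying an out-of-schema policy label:
raw = (
    '[{"id":"ytc_a","responsibility":"distributed","reasoning":"contractualist",'
    '"policy":"regulate","emotion":"fear"},'
    '{"id":"ytc_b","responsibility":"company","reasoning":"deontological",'
    '"policy":"tax","emotion":"outrage"}]'
)
kept = parse_codings(raw)
print([rec["id"] for rec in kept])  # only the in-schema record survives
```

Filtering rather than raising keeps one malformed record from discarding the whole batch; rejected IDs could instead be queued for re-coding.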