Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
9:08 Oh, my sweet summer child. My dude Brandon still thinks university is about learning logical thinking. Bless your soul. I actually agree with almost all of this, on an ethical level. On a practical level, unfortunately, you can only fight it for so long. Also, as far as the training on copyrighted material... I agree this seems exploitative. Yet, I also challenge you to tell any new writer to not use any influences or prior inspiration. This is very human centric. AIs are very, VERY basic right now, but they learn very similarly (I know you address this, but I'm not sold on your conclusion). We value human art because of the time, effort, and skill it takes. Will our grandkids generation care? It takes time, effort, and skill to hunt for food. How many people do you know that can do that? How many people do you know CARE that they can't? This is really hard for everyone. The working class has been dealing with this since the 18th century.... There's going to be a lot of hurt feelings, existential crises, massive job loss, and people that thought their skills or jobs were somehow special. Sanderson is a great author, and I'd love to read new stuff from him. But to think AI will not get to a point where it outwrites most humans, EVEN without training on existing IP... It's just a pipe dream. I don't mean to be a downer, but I think the sooner white collar / artists absorb reality, the less stress they'll feel. The answer is simple: write for yourself; not for others, and not for money. It's as simple as that. That's the future of art. Personal, dedicated, and as in demand as a burger flipper.
Source: youtube · 2025-07-22T16:4… · ♥ 1
Coding Result
Dimension      | Value
Responsibility | unclear
Reasoning      | unclear
Policy         | unclear
Emotion        | unclear
Coded at       | 2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgwwcaN9L98sgj-E4Jh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
 {"id":"ytc_UgzmhiE_zakz-_qyTFt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_Ugz9HLAzaXETpVPkGUt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgzK0D6ldfUXqDcURKF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgzOH7JkYXOOTbR9fjB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgzCpgkoko4SOgH6xm94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugz1VT0wrlKpEc_U9TB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgzOTKDr8DpMzbBh6t94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgzV8Pwh9ZS_akQlrIZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgyVYz2Hwa7Tk326urV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"}]
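A raw LLM response like the one above should be parsed and validated before its codes are written back to comment records. The sketch below is a minimal, hypothetical validator: the record schema (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) matches the response shown, but the set of allowed emotion labels is an assumption inferred from the values that appear in it, and the `load_codings` helper name is invented for illustration.

```python
import json
from collections import Counter

# Assumed label set, inferred from values seen in the raw response above;
# the real codebook may allow more categories.
ALLOWED_EMOTIONS = {"resignation", "approval", "outrage", "fear", "mixed", "unclear"}

def load_codings(text):
    """Parse a raw LLM coding response and sanity-check each record."""
    records = json.loads(text)  # raises ValueError if the JSON is malformed
    for r in records:
        # Every record must carry the five expected fields.
        for field in ("id", "responsibility", "reasoning", "policy", "emotion"):
            assert field in r, f"missing {field!r} in {r}"
        assert r["emotion"] in ALLOWED_EMOTIONS, f"unknown emotion in {r}"
    return records

# Shortened example payload in the same shape as the raw response above
# (the ids here are placeholders, not real comment ids).
raw = """[
  {"id": "ytc_A", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_B", "responsibility": "company", "reasoning": "deontological",
   "policy": "unclear", "emotion": "outrage"}
]"""

records = load_codings(raw)
emotion_counts = Counter(r["emotion"] for r in records)
print(emotion_counts["resignation"])  # → 1
```

Parsing with `json.loads` (rather than string munging) also catches the kind of delimiter error seen in raw model output, such as an array closed with `)` instead of `]`, before bad codes reach the dataset.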