Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
In this semester in college I'm in a poetic analysis class. There are so many fascinating discussions to be had in a poem based on the traditions an author pulls from, the way their life experiences color their work, and much of close reading comes down to understanding intentional minute decisions an author made -- why did they use this meter, why repeat this line, why this word? Answering these questions is key to a deeper understanding of poetry. However, the answer for AI is obvious -- it saw this pattern somewhere in its training data and decided to copy it. A human might use "rose" instead of "poppy" because of the poetic tradition associated with the rose, a larger poetic project that reveals itself throughout the work or even in other works of theirs, etc. But a robot, they just use the word rose because it's more popular. It's like that joke about English teachers asking why the curtains were blue -- authors do have a reason for what they write, a reason to specify the curtain color, even if people at first don't think it's the case. But AI doesn't. Because it has no reason to make art. It just does it because someone asks it to. It's probably extremely similar for visual art, though since I am not an artist or an art critic, I can't give good examples. But, if I had to guess, questions like, why is this character given this hair color or in this pose or in this location? Why use this line weight, why use this facial expression, etc? All of those things probably have answers, the author has reasons, even if subconscious ones related to their personal experience -- but AI doesn't and never will.
Source: YouTube · Video: Viral AI Reaction · Posted: 2025-04-18T16:4… · Likes: 4
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugwbbzafz1TiJ7udnl54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwEoJbU2qdMD9qyXel4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzpqAwsh55ABQ73bK54AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwqZTXap7ps0XaRKql4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw3H44MCS-lrKPlV894AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyHfye1G0qGJREQoa14AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyduqeDhwBu8Tmd8mx4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx6js1rNJWvoHpPS5F4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyEHZyU6hj_Lizvs6R4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzIp_OkcEBBygm81bN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]
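The raw response is a plain JSON array, one object per coded comment, so spot-checking a coding against the model output takes only a few lines. A minimal sketch, assuming the response text is available as a string; the two records below are copied from the response above (truncated to two entries to keep the example short), and the filter values are just one example:

```python
import json

# Excerpt of the raw LLM response above: a JSON array in which each
# element codes one comment on four dimensions.
raw = """[
  {"id": "ytc_UgzpqAwsh55ABQ73bK54AaABAg",
   "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyHfye1G0qGJREQoa14AaABAg",
   "responsibility": "ai_itself", "reasoning": "virtue",
   "policy": "ban", "emotion": "outrage"}
]"""

records = json.loads(raw)

# Pull out the records matching a given set of codes, e.g. the
# deontological/approval coding shown in the table above.
matches = [r for r in records
           if r["reasoning"] == "deontological"
           and r["emotion"] == "approval"]

for r in matches:
    print(r["id"])  # prints ytc_UgzpqAwsh55ABQ73bK54AaABAg
```

The same filter generalizes to any dimension in the response (`responsibility`, `reasoning`, `policy`, `emotion`), since every record carries all four keys.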