Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
As a film major in college, it’s instinctual to be upset that someday my job won’t be necessary. What I imagine will someday exist is a subscription AI platform that will allow a consumer to enter text and output a tv show or film based on that input. I think the same will exist for other media, such as music and traditional art. The bad news is it will be increasingly hard to be profitable making art. However, I can’t say I would blame consumers for killing my job in favor of AI. If I weren’t a filmmaker, would I rather spend my days scrolling Netflix to see if someone else’s idea of a good show sounds remotely interesting to me? Or would I prefer to come up with my own ideas, characters, worlds, and storylines that can be produced specifically for me? Of course I’d probably choose the second option as the consumer of the media. Would I rather buy someone else’s painting, or create a unique painting based on my own personal input that looks just as good if not better at a presumably lower price? Of course I’d likely go with the AI option as well. There’s a good and a bad to AI in art, mostly bad for the creators but good for consumers. Imagine a world where I could tell an AI to make me an alternate ending to my favorite movie because I wanted it to turn out differently.
Source: YouTube — "Viral AI Reaction" (2022-12-28T16:1…)
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgxOmfiJu7Xjc5QNVPl4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxbY79kYFuF7ZMhkNV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgweMdG64E8HXmhoD9d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy1vAnWlc8IX2eBjTR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugyjta7Ig2U05YnRywZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw1NamUihBuZuiwfwN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzy0vjfc3yMhF0JTh54AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxyxOdFrQkF4OSXjB54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxEJekaTpJPhr44gcR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwZSqBNKPJaK4blryp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
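A raw response like the one above can be parsed and checked against the coding vocabulary before the codes are stored. The sketch below is a minimal example of that step; the allowed values are assumed from the labels visible in this dump, and the full codebook may contain more categories.

```python
import json

# Allowed values per coding dimension (assumption: inferred from the labels
# seen in this dump; the project's actual codebook may define more).
SCHEMA = {
    "responsibility": {"developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose values
    fall inside the allowed vocabulary for every dimension."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

# Example with the record that produced the coding result shown above.
raw = ('[{"id":"ytc_Ugw1NamUihBuZuiwfwN4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
codings = parse_codings(raw)
print(codings[0]["emotion"])  # resignation
```

Dropping out-of-vocabulary records (rather than raising) is one possible design choice; a stricter pipeline might instead flag them for manual review.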