Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Well, the buggest reason why I could see it be a bad thing is if humans rely too much on the AI. I think there's a good possibility of us using AI to make our lives, not easier, but less frustrating. Like, no one really wants an AI that can paint beautiful pictures with the level of flair and soul that a human could, cause humans can not just do that, but they LOVE doing it. Instead, give the AI to do stuff people DON'T like. Such as figuring out the perfect list of things to buy in the store, based on the house's needs, and the amount of money you're willing to part with, or providing you with a list of discounts to every single store article in the stores you visit the most. Or even in animation, to allow the AI to do the part of animating that the animator in question dislikes doing the most, so they can focus on the part they enjoy, and let the AI do the annoying part. (Like if the animator doesn't like lineart, they can give that job to the AI, while the animator does all the other fun-for-them things, for example) And even then, only with complete, proper regulation by the state. My biggest issue with AI now is that it's completely unregulated.
youtube 2024-07-12T14:5…
Coding Result
Dimension       Value
--------------  ---------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_Ugwv0lIgewJ-6FTaCE54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugz3FFF9BqRIkDfAn194AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugwhv-eXCjoApwKlgzx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgxwAh4vZT-IwIzRWjd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgyTSwzjkKeJTFe7dvB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgyBnhBtM4KALoWtuDZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugwo6HflziVqS-w5j3B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugw5YcQlYv3NDMRBVx94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgyuM7GVSBgol4nB2z94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwhsJDQidpH1VGO_eJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}]
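The raw response above is a JSON array with one object per coded comment. A minimal sketch of how such a batch could be parsed and indexed by comment id — the two sample entries are taken from the response above, but the parsing helper itself is illustrative, not the tool's actual code:

```python
import json

# Two entries copied from the raw LLM response above; each object carries
# the four coding dimensions: responsibility, reasoning, policy, emotion.
raw = """[
 {"id": "ytc_Ugwv0lIgewJ-6FTaCE54AaABAg", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_UgxwAh4vZT-IwIzRWjd4AaABAg", "responsibility": "distributed",
  "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]"""

# Index the batch by comment id so a single comment's coding can be looked up.
codes = {item["id"]: item for item in json.loads(raw)}

print(codes["ytc_Ugwv0lIgewJ-6FTaCE54AaABAg"]["reasoning"])  # consequentialist
print(codes["ytc_UgxwAh4vZT-IwIzRWjd4AaABAg"]["emotion"])    # mixed
```

Indexing by id is what lets the page above show the coding for exactly one comment out of a batch response.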