Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Let’s say we’re in the hypothetical future where Ai is perfected to the point where you could video generate anything to being lifelike as possible or follow whatever animation style you want. Let’s also say it can also generate lifelike voices that you wouldn’t question if it’s real or not. Finally let’s also say that Ai will generate the most perfect human like script possible. “This is paradise, I can make whatever I want with just the words on a page and it’ll come to realit-“ How long are you going to be happy with that? Congratulations, we’ve reached a point where art has stagnated. No more innovation because why not, it’s not profitable anymore to sit down for hours to create something when you can wait 20 minutes for your next animation. “Well, if someone wants to innovate and make something themselves, they can still do that, I’m not gonna stop the-“ So what, so you can feed your Ai with the next new data it hasn’t integrated yet. Ai will get better. It’ll evolve and to the average joe with no skills to contribute to it, it’ll seem like a god who can do anything. Until one day, it can’t. It’s peak, it’s done. It can’t make anything more because it’s innovated to the capacity that we’ve fed it. We starve for something it can no longer provide us. It has no more innovation. At that point it comes crashing down, the false god has fallen off its false throne, a throne build by the man who holds it as its god.
Source: YouTube — "Viral AI Reaction" — 2024-08-02T20:2…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgznyFaudLGQ_plOMAd4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxjJgW8HPbn1hL5VoZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzjHkpFlQuvpUszzXF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwTkeZAGVH6yAuVed94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzXjvU0_ECp84hE2Ll4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzRgwESkkb6ljG1VEF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgygcJNnN7Rrp75x8rx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwhitnghWRNhhznaMN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyUH6mIwggC8Hgdikd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugx6QQ3pjGWTasmAUN94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
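The raw response above is a JSON array, one object per coded comment, each keyed by the comment's `id` with the four coding dimensions as fields. As a minimal sketch of how such a payload could be parsed and looked up per comment (the variable names below are hypothetical, not part of the tool; only two records from the array above are included for brevity):

```python
import json

# Hypothetical variable holding the raw LLM response text
# (excerpt of the array shown above).
raw_response = '''[
  {"id": "ytc_UgznyFaudLGQ_plOMAd4AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyUH6mIwggC8Hgdikd4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "unclear", "emotion": "resignation"}
]'''

# Index the codes by comment id for direct lookup.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Retrieve the coding for one comment.
code = codes["ytc_UgyUH6mIwggC8Hgdikd4AaABAg"]
print(code["responsibility"], code["emotion"])  # user resignation
```

A dictionary keyed by `id` keeps the lookup O(1) per comment, which matters when joining these codes back onto a large comment table.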