Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Ok reading all the comments favouring ai... Damn...yall are just lazy as hell... And also have 0 patients, you just want 10000s of content spewed in 2 nano seconds. Did you think that maybe, instead of consuming constant content nonstop, you should actually take the time to feel something, process your emotions? Like cool in 1000s of years it can learn to replicate human emotion and passions and you can write a precise prompt or whatever, but like.... Why? So you can go "hehe, nice, next, hehe, nice, next, hehe, nice next", yall are becoming emotionally stunted, it's literal brain rot. Cool, you made a device that can replicate human emotion in 1/100000th of a second and make 1000000s of deep emotional art in a blink of an eye.... For what? What's the point? So you can get a constant dopamine rush like a little baby who cries if it doesn't have entertainment in front of it every second? Just make a machine to dangle keys in front of your face, it's just as effective with the given goal. Ai can make a pocket dimension with infinite posabilaties for all I care, if all you are gonna use it for is pump your cains full of happy chemicals 24/7. Imagine a species making a device that can create infinite energy, but only for itself... It's cool, but why make it if it doesn't help the species what so ever. Why are you making art? To look pwety? The you don't even need complex prompts and the ai to mimic human emotions, just have it make abstract colors and that's good enough for your goals
youtube 2025-11-17T17:5… ♥ 2
Coding Result
Dimension       Value
Responsibility  user
Reasoning       virtue
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgykvjnG81pasfNVRa94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxHGIOEXkoAvfgc-oN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugykopv-9NTyanlT6G94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy5PibtdmsbJ1D55jJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwxNP_ZxJHfc0UhbWd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwS-8MNcCFaVkMA4Xp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwFRnO8BDf2JsSTKG94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwujPr21PBbvZYTe1J4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwCt49d6uEkpUJL5b94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzD-szxCIjtRVApgMx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
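The raw response is a JSON array with one code object per comment, keyed by comment id. A minimal sketch of how such output could be parsed and validated is shown below. The allowed value sets are inferred from the values visible in this batch, not from the tool's authoritative codebook, and the `parse_codes` helper is hypothetical:

```python
import json

# A one-record excerpt in the same shape as the raw response above.
raw = '''[{"id":"ytc_UgwFRnO8BDf2JsSTKG94AaABAg",
           "responsibility":"user","reasoning":"virtue",
           "policy":"none","emotion":"outrage"}]'''

# Allowed values per dimension, inferred from this batch (an assumption,
# not a definitive codebook).
DIMENSIONS = {
    "responsibility": {"user", "ai_itself", "none"},
    "reasoning": {"virtue", "deontological", "consequentialist", "mixed"},
    "policy": {"none"},
    "emotion": {"outrage", "approval", "indifference", "resignation", "fear"},
}

def parse_codes(raw_text):
    """Parse the LLM output, keeping only records whose values are in-set."""
    valid = []
    for record in json.loads(raw_text):
        if all(record.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            valid.append(record)
    return valid

codes = parse_codes(raw)
print(codes[0]["emotion"])  # prints: outrage
```

Dropping (or flagging) out-of-set records rather than accepting them guards against the model inventing labels outside the coding scheme.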