Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
(apologies for the wall of words) If you want the two cents of a crazy person: I think there are two assumptions humans make that make it very hard for them to build AI. The assumption that humans don't hallucinate patterns from data, and the assumption that people can sort fact from fiction purely from a knowledge base. When creatures evolved eyes, it was to cannibalize near relatives more efficiently. The tools they had for recognizing members of their own species were mostly touch, and touch wasn't very effective. The same goes for every other trait that evolved: sight, touch, and balance might originally have been for consuming more, but the more data a creature could take in, the better it could recognize family or its surroundings, change its food source entirely, or find reasons to move. The assumption that a trait explodes in popularity because we were 'going to get there eventually' is a mistake. Even looking at the structure of a brain, only a few species got lucky enough to develop one, and a structure that detects scarcity or wealth as a fundamental building block for every thought after is not a structure that would develop in a creature living in excess. And just because a species evolved to look different later while retaining the trait doesn't mean they don't all share the same body plan (look it up: embryos for most species with a brain are near identical). Humans, and human-level intelligence, developed during the ice age; go ahead, look it up. It developed when the pack was literally all we had, when the only way to survive, which only a few could find, was to get more out of their food, and getting to fire, to spears, to complex trajectory took a lot of incidental steps in between. If we judge AI by this standard, the largest alterations that have achieved the most were those that stressed its ability to mimic, which, yes, is an evolutionary trait we see as valuable.
It's how we create language and share ideas; we are a finely tuned machine for this purpose. It is well known that humans have issues with pattern recognition, our strongest trait. Our ability to hallucinate data is so strong that there are careers of con men, writers, and psychologists built around pulling it apart, using it, and relieving it. We do not work out hallucinations, the most common ones at least, with more data; we do it with constraints. Language developed as a mimicry structure to encapsulate ideas we could not display with our more flamboyant body parts. You count with fingers until you can build a structure to contain the numbers outside your own body. Same with sounding out words, or breaking down scientific observations with live tests. You get the idea on paper, and nine times out of ten we do this with some sort of stimulus, most often sound or sight, sometimes 'sixth senses' like feeling; touch and taste are well known to strengthen memories and make them more vivid. We are not good memory machines: we are terrible at retaining data and corrupt it easily. We are great sensory machines, incredible in fact; by every standard of technology, it took centuries to reach human level, that's how good we are. What I think needs to happen, if we really want human-level AI, is that AIs need reasons to retain, extract, and use data. The only things keeping an AI alive are its ability to mimic and the limited sensory input of a yes-or-no system, sometimes a limited memory. So, evolutionarily, it's on par with a virus. Maybe a simple creature, if you want to stretch the definition of its memory into a body. I mean, there are creatures above a nonsentient piece of protein that also can't recognize the difference between their own body and a hole in the ground, but that's still not good.
youtube 2026-03-21T15:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzfCVc7pVa3HH3JCjV4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzQ7ZVR2617wzaBLmd4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzCP0B8gITo5feE1sV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyM7KpyP4Y-A4dDJrl4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzW9lUc2MYlfyOMLLx4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyshaDX-Rtu7zNZO_R4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzjT47dQ5yjYIOljUh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgztwuORH16dl6Ftiat4AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwl8AtN1Vd2WugBjpt4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugyo4ZYLcBwGdr2pKX54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]
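A minimal sketch of how the coded dimensions shown above can be recovered from the raw LLM response. It assumes the response is a valid JSON array of records keyed by comment `id` (as in this example); the function name `code_for` is illustrative, not part of any real tool.

```python
import json

# A truncated copy of the raw LLM response above: a JSON array with one
# record per comment. Only the first record is reproduced here.
raw = """
[
  {"id": "ytc_UgzfCVc7pVa3HH3JCjV4AaABAg",
   "responsibility": "none", "reasoning": "mixed",
   "policy": "none", "emotion": "indifference"}
]
"""

# The four coded dimensions used in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def code_for(records, comment_id):
    """Return the coded dimensions for one comment id, or None if absent."""
    for rec in records:
        if rec.get("id") == comment_id:
            return {dim: rec.get(dim) for dim in DIMENSIONS}
    return None

records = json.loads(raw)
print(code_for(records, "ytc_UgzfCVc7pVa3HH3JCjV4AaABAg"))
# → {'responsibility': 'none', 'reasoning': 'mixed', 'policy': 'none', 'emotion': 'indifference'}
```

For this comment the first record matches the Coding Result table exactly (responsibility none, reasoning mixed, policy none, emotion indifference); an id with no record yields None, which is useful for spotting comments the model skipped.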