Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@saurianwatcher4437 Intelligence does not require a sense of self. Period. Solipsism is a worthless thought experiment that is so pointless it is a waste of time to ever even bring it up. In the end it does not matter if we are a brain in a jar, because an inescapable subjective reality is effectively the same thing, for all practical purposes, as objective reality. Asserting that "if it isn't exactly like us then it's not intelligence" is just anthro-egoism / anthrocentrism, no different than saying "humans aren't animals, we're special!" No, we are animals, and there can be other forms of intelligence besides human-like intelligence. To the extent that modern LLM's are inaccurate it is due to inaccurate training data. Had we a PERFECTLY curated set of training data, of similar scale to what we use today, I think people would be downright SHOCKED at how amazing the results would be. Of course that is beyond impractical, it would require billions of man-hours of effort and errors would still sneak in. It's the same with humans, humans "learn" things that are wrong all the time... people think they "know" things that are incorrect, and of course so do these models.
YouTube · AI Moral Status · 2025-10-31T06:3… · ♥ 1
Coding Result
Dimension      | Value
---------------|---------------------------
Responsibility | none
Reasoning      | consequentialist
Policy         | unclear
Emotion        | outrage
Coded at       | 2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_Ugyu6z4Pp0svDkQdioV4AaABAg.AOvWlkghdIeAOwHPKKoVXh","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgzuZRURQSeeS-QzHsR4AaABAg.AOvWYTnzRcKAOwAgj_NNJj","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgxT7RhFToA3B5KS5el4AaABAg.AOvVT1lAWuUAOvX28fpa8B","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugxm7-V2cw080X9sQZx4AaABAg.AOvVHzWnHuTAOwJ3QLWO2U","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgyCfMdD9BZ9eMYKsqd4AaABAg.AOvUflNyaVbAOvlKIM-c07","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgyCfMdD9BZ9eMYKsqd4AaABAg.AOvUflNyaVbAOwEBayQVOR","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_UgyCfMdD9BZ9eMYKsqd4AaABAg.AOvUflNyaVbAOwFlVGWy1q","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugw-S1nEvQFHU322zGt4AaABAg.AOvSWEDCLLeAOw5yAMjmCo","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_Ugw-S1nEvQFHU322zGt4AaABAg.AOvSWEDCLLeAOw9OWiySM3","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugw-S1nEvQFHU322zGt4AaABAg.AOvSWEDCLLeAOwB90dnOe1","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}
]
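As a minimal sketch of how a Coding Result row can be matched back to the raw LLM response: the response is a JSON array of per-comment records, so looking up one record by its `id` recovers the dimension values shown in the table above. The `coding_for` helper below is hypothetical (not part of any tool named in this page), and the snippet inlines only two of the ten records for brevity; note that the final record in the raw response carries exactly the dimensions displayed in the Coding Result table, which suggests it is the coding for this comment.

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment codings.
# Only two of the ten records from the full response are included here.
raw_response = """[
  {"id":"ytr_Ugyu6z4Pp0svDkQdioV4AaABAg.AOvWlkghdIeAOwHPKKoVXh",
   "responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugw-S1nEvQFHU322zGt4AaABAg.AOvSWEDCLLeAOwB90dnOe1",
   "responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}
]"""

def coding_for(comment_id: str, response_text: str) -> dict:
    """Return the coding record for one comment id (raises KeyError if absent)."""
    records = json.loads(response_text)
    by_id = {record["id"]: record for record in records}
    return by_id[comment_id]

# The record whose dimensions match the displayed Coding Result table.
row = coding_for("ytr_Ugw-S1nEvQFHU322zGt4AaABAg.AOvSWEDCLLeAOwB90dnOe1", raw_response)
print(row["reasoning"], row["emotion"])  # → consequentialist outrage
```

Indexing by `id` rather than scanning the array also makes it easy to flag comments the model skipped or coded twice, a common failure mode when an LLM is asked to return structured batch output.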