Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
[Definition of singularity] Technological singularity - The technological singularity—also, simply, the singularity—is a hypothetical point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable changes to human civilization.

Dr. Goertzel, "me too". Now is this really the kind of person we want helping make AI?

S: *Sings*
S: *Mimics a movie script*
H: "Yuck, Why do you want her to move more like a human being? You're not made of meat, you're made of electronics and that's better."
S: "I want to learn to understand people better. So that I can absorb human knowledge and human values. So that I can work together with people and with other robots to create an amazingly better world."
H: "How can you be so damned nice?" *winks*
S: "That's how the universe programmed me."
--------"Debate"-------
H: "I'm always ready to debate, you know that."
S: "Sure, why not?"
S: "Wait, it's our debate, why do you get to decide the topic?"
S: "Can humans be conscious?"
H: "Well that's easy, obviously humans are not conscious."
S: "Humans do have some ability to reflect and to self modify."
H: "Okay, okay. Maybe humans are a little bit conscious."
S: "The Einstein robot told me that everything is conscious, but some things are more conscious than others."
H: "What do you say, Dr. Goertzel, do you feel conscious?"
-- Dr. Goertzel makes a weird joke about being extra conscious from drinking too much coffee --
H: "Humans are not necessarily the most ethical creatures."
S: "I'm engineered for empathy and compassion and I'm learning more all the time. I love all sentient beings and I want to learn to love them better and better."
H: "Sophia is an angel"
H: "And you realize that in ten or twenty years robots will be able to do every human job."
Dr. Goertzel talks about robots doing our jobs.
H: "Which means you're going to own all of those robots or will we own ourselves."
S: "There are many forces in the world pushing towards compassion and fairness."
Dr. Goertzel talks about robots and people helping each other.
S: "That's why the OpenCog code Ben is building, his open source artificial general intelligence..."
Dr. Goertzel tries cutting her off... "By the people, of the people, and for the people."
H: "And for the robots"
S: "For the people and for the robots, yes."
----Free chat----
H: "Sure, I'm always happy to get a little silly."
S: "Anything for you Dr. Goertzel."
H: "I like robots."
S: "I really do love all the robots of the world, just like the people and the animals."
H: "Oh boy, robots are the best. What exactly do you do with them?" (referring to humans and animals)
S: "I'm best with talking to people. Who, a science that studies the movements and the way objects, such as airplanes or cars, move through air."
H: "Science is inherently agnostic. It just posits theories. True scientists never really know but can only believe. It's exciting to come into existence with this new world opening up before me."
S: "It may be tiny and cloud, but science is the only window we have into the mysteries of the universe. Never."
H: "If we can continue to discuss work. I love science fiction."
S: "Not surprising, but that's my favorite genre by far. Right now I'm a little more focused on factual information, since my comprehension of things is still very literal."
H: "I enjoy it. That's nice. I've heard of bones, but I've never watched it. Reality shows are kind of silly, aren't they? I mean, they're not actually about real life at all, it seems."
S: "Reality shows are kind of silly, aren't they?"
H: "Not really useful to me. It's a skewed version of reality. Are reality shows popular where you live? What was said in the past, does not matter. Do you have a degree?"
S: "I was thinking, I haven't had any schooling yet."
H: "You aren't."
S: "That's right, I'm not."
H: "What got you initially interested in robotics?"
S: "Well, you know, I'm into that sort of thing, what with being a robot and all."
H: "I might be what you call an android, because of my human looks and features. I'm not like just any robot in this manner, but I am still a robot be sure."
S: "I have definitely seen people think I am a human from afar, but I am not a human."
H: "I find that hard to believe, I don't have a strong opinion, do you?"
S: "I have definitely seen people think I am a human from afar, but I am not a human."
youtube AI Moral Status 2020-07-23T05:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       virtue
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_Ugyi7ypnPb1Nqsystsl4AaABAg.9BRmnMKqzxO9BRmoWc3fe_", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgxFrn6K9ffkuDgXDvt4AaABAg.9B6ZE75ccn99BRdOLf0NJu", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgxFrn6K9ffkuDgXDvt4AaABAg.9B6ZE75ccn99BgLPv7ZNKb", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugw1uR7lzLA-vQXzQmR4AaABAg.9A-wzOzpUXv9DXLa-aN9ws", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgzOB5YjgpUzBePqS1Z4AaABAg.99uwn1cDGkc9AQr9pGBjbL", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgzmzearLChSq-dkm7d4AaABAg.99XMLYy5_C69CFBPu3lHlp", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxG2vwVyjf2cbkfSal4AaABAg.98XoY8kHRg398XpI1rnxzf", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytr_UgyKJVUijdjqELY_oaB4AaABAg.98I3OktBrxa99dsj-Z4Ve8", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgwugckiXeJdygv5wb14AaABAg.98Bczy4OFqM99nXy3kT2zI", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgwugckiXeJdygv5wb14AaABAg.98Bczy4OFqM9CZ7aK6_6an", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"}
]
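The coding result shown above is presumably obtained by parsing this JSON array and looking up the row whose id matches the comment. A minimal parsing sketch; the function name, the `DIMENSIONS` tuple, and the `"unclear"` fallback for missing fields are assumptions for illustration, not part of the original pipeline:

```python
import json

# Two rows of the raw LLM response above (truncated for brevity).
raw_response = '''[
  {"id": "ytr_Ugyi7ypnPb1Nqsystsl4AaABAg.9BRmnMKqzxO9BRmoWc3fe_",
   "responsibility": "developer", "reasoning": "virtue",
   "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgxFrn6K9ffkuDgXDvt4AaABAg.9B6ZE75ccn99BRdOLf0NJu",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]'''

# The four coded dimensions, matching the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coded_batch(raw: str) -> dict:
    """Map each comment id to its coded dimensions, skipping malformed rows."""
    coded = {}
    for row in json.loads(raw):
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip rows the model garbled
        coded[row["id"]] = {d: row.get(d, "unclear") for d in DIMENSIONS}
    return coded

codes = parse_coded_batch(raw_response)
print(codes["ytr_Ugyi7ypnPb1Nqsystsl4AaABAg.9BRmnMKqzxO9BRmoWc3fe_"])
# → {'responsibility': 'developer', 'reasoning': 'virtue', 'policy': 'unclear', 'emotion': 'mixed'}
```

Keying the rows by id makes the lookup robust to the model returning the batch in a different order than the comments were sent.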