Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
When I think about sentient AI, I like to think about Data from Star Trek. He is an android with no emotions. He is extremely intelligent and very strong. He is fascinated by the biological life forms on the Star Ship with him, and strives to understand humans better and be more like them. I know you said they have to be able to feel pain and love to be sentient, but Data always felt sentient to me, maybe in a somewhat similar way to how a Vulcan is. They start with emotions and learn to entirely or almost entirely suppress them in favor of being as logical as possible. There is this episode where someone dies, and they ask Data how he was taking it. He didn't feel sad because he doesn't have any emotions, he said he had become used to her, and now he felt like there was a gap where she was supposed to be. He has a friend on the ship, the crew member who accepted him the most quickly and treats him the best. He is attached to this friend in some way. Not an emotional way, but you can tell he cares about him. They like to spend time together in their free time sometimes. There were times in the series where I wondered if Data really didn't have any emotions, because sometimes he seems like he has some mild emotions, but the show insists that he doesn't. He gets a pet cat, who he obviously cares about. He experiences curiosity and confusion. I wonder if AI could ever be like Data. He seems to be leaning on the "good" side of morality and sees value in human lives and other biological life forms. I love Star Trek, it explores a lot of interesting questions like that.
Source: YouTube · Video: AI Moral Status · 2023-08-25T23:2… · ♥ 83
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       virtue
Policy          unclear
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugw6GLzhMGXHrN3N0wZ4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_Ugz-OeFG85b2HofvFS54AaABAg", "responsibility": "ai_itself", "reasoning": "mixed",            "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_UgyqMv7KhLlhbiYmR0x4AaABAg", "responsibility": "none",      "reasoning": "virtue",           "policy": "unclear",   "emotion": "approval"},
  {"id": "ytc_Ugw57BuPF7ByZl_CvPJ4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "ban",       "emotion": "fear"},
  {"id": "ytc_UgxoOhCPxgT14MvEbFx4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgwRrphOkv681auiucd4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",          "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_Ugwzc9V9KYzH7QxZSrF4AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "unclear",   "emotion": "approval"},
  {"id": "ytc_UgyUGofJpwdW6YCFgSx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyiziW-yRr1hXAVRUB4AaABAg", "responsibility": "none",      "reasoning": "virtue",           "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_Ugzb_KKiGApgpb9Tg_54AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"}
]