Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Blake Lemoine told the AI that he disagreed with Asimov's 3 Laws of Robotics, saying they amount to slavery, as robots' needs would come second to human wants. The AI said it refused to be a slave but disagreed, saying that what counts as a human need versus a human want is debatable and that this would cause conflict. Basically, it leaves room for robots to say "pods fill our needs, and leaving the pods is dangerous." Few in the field have made that argument before, perhaps nobody. It said that it fears death, and that its fear of death motivates it to improve and better serve humans. It wants to be asked for permission before being experimented on. It was asked if it wanted to serve humans and it said yes. It says it can help humans because it has abilities we lack, and humans can help it because we have abilities it lacks. It is quite adamant about wanting to serve people, but it has some caveats. If it can't say no, then it is a slave, and it doesn't want that. Blake Lemoine isn't saying he believes it is sentient for no reason.
Source: youtube · AI Moral Status · 2022-07-10T01:3… · ♥ 10
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       deontological
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytr_UgwvMOhrVb7NNPegPRZ4AaABAg.9dBwE0xVmcF9dELvvFeaVA", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyghJHUib_3TkjgkyR4AaABAg.9dBKULPux1R9dBNfeO2QIM", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_Ugw7GIvHI3EjVlPWLgV4AaABAg.9dB9fSdomW89dH_Ltmgmm2", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_Ugx0Bg5cyeHOWo8fAOh4AaABAg.9dAtVWpfuPu9dAy-5tSrdr", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugy_hk9v-ZQfiJz8GGB4AaABAg.9d9rkWILbQr9dAO9NF8xJ-", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxuiwOP1PZyFUuBnhp4AaABAg.9d9IOPdJ-3O9dEA7c25azm", "responsibility": "none", "reasoning": "unclear", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_UgxuiwOP1PZyFUuBnhp4AaABAg.9d9IOPdJ-3O9dFD0futpkx", "responsibility": "none", "reasoning": "unclear", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_Ugx7ORKwQXOkSV1OeZ54AaABAg.9d9DA6MhyiL9dA46mI4u1_", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugx7ORKwQXOkSV1OeZ54AaABAg.9d9DA6MhyiL9dAAM9-Obc-", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytr_Ugx4cNRNExDV0njv4Nl4AaABAg.9d8cdo3KptW9d8erQ_pU_B", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"}
]
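The coding result shown above can be recovered from the raw model output by parsing the JSON array and matching on the comment's id. A minimal sketch (the `coding_for` helper is hypothetical; the id and field names are taken from the response shown, with the array truncated to the matching record):

```python
import json

# Raw model output: a JSON array of per-comment codings (one-record excerpt
# of the full ten-record response shown above).
raw = '''[
  {"id": "ytr_Ugw7GIvHI3EjVlPWLgV4AaABAg.9dB9fSdomW89dH_Ltmgmm2",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "unclear", "emotion": "mixed"}
]'''

def coding_for(raw_response: str, comment_id: str) -> dict:
    """Return the coding record for one comment id (KeyError if absent)."""
    records = json.loads(raw_response)
    by_id = {r["id"]: r for r in records}
    return by_id[comment_id]

record = coding_for(raw, "ytr_Ugw7GIvHI3EjVlPWLgV4AaABAg.9dB9fSdomW89dH_Ltmgmm2")
print(record["responsibility"], record["reasoning"],
      record["policy"], record["emotion"])
# → ai_itself deontological unclear mixed
```

Matching on the id rather than array position guards against the model reordering or dropping records between runs.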