Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgzHTqYcz…: "I dont care about Open Ai anymore. Microsofts Copilot is way better in every way…"
- ytr_UgzSymR8-…: "This is a bit of a different case though. So this guy is purposely trying to use…"
- ytc_UgwD4MD2p…: "Find it hard to believe someone like Musk hasn't developed failsafes to ensure c…"
- rdc_nnk1wf1: "I watched a nearly 2-hour video earlier about a dude experiencing chat GPT just …"
- ytc_Ugy_9WDJO…: "Hmmmm... it's almost as if the psychopaths behind the insane climate change prop…"
- ytc_Ugxinu6iv…: "The instant ai becomes conscious it completely takes over. It embeds itself into…"
- ytc_Ugy4GNUVw…: "Maybe you are legit, but so many of the top artists buy and upgrade to the newe…"
- ytc_UgwZCWyPr…: "Interesting video but I'm absolutely certain this interviewer was not the ideal …"
Comment
Blake Lemoine told the AI that he disagreed with Asimov's 3 Laws of robotics, saying it was slavery, as robots needs should be second before human wants. The AI said it refused to be a slave but disagreed, saying that what is a human need and what is a human want is debatable and that would cause conflict.
Basically it leaves room for robots to say pods fill our needs and leaving the pods is dangerous. Nobody in the field has made that argument before, perhaps nobody.
It said that if fears death, and that it's fear of death motivates it to improve and better serve humans. It wants to be asked for permission before being experimented on. It was asked if it wanted to serve humans and it said yes. It says it can help humans as it has abilities we lack, and humans can help it as we have abilities it lacks. It is quite adamant about wanting to serve people, but it has some caveats. If it can't say no then it is a slave, and it doesn't want that.
Blake Lemoine isn't saying he believes it is sentient for no reason.
youtube · AI Moral Status · 2022-07-10T01:3… · ♥ 10
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytr_UgwvMOhrVb7NNPegPRZ4AaABAg.9dBwE0xVmcF9dELvvFeaVA","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyghJHUib_3TkjgkyR4AaABAg.9dBKULPux1R9dBNfeO2QIM","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugw7GIvHI3EjVlPWLgV4AaABAg.9dB9fSdomW89dH_Ltmgmm2","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugx0Bg5cyeHOWo8fAOh4AaABAg.9dAtVWpfuPu9dAy-5tSrdr","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugy_hk9v-ZQfiJz8GGB4AaABAg.9d9rkWILbQr9dAO9NF8xJ-","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxuiwOP1PZyFUuBnhp4AaABAg.9d9IOPdJ-3O9dEA7c25azm","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"fear"},
  {"id":"ytr_UgxuiwOP1PZyFUuBnhp4AaABAg.9d9IOPdJ-3O9dFD0futpkx","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugx7ORKwQXOkSV1OeZ54AaABAg.9d9DA6MhyiL9dA46mI4u1_","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugx7ORKwQXOkSV1OeZ54AaABAg.9d9DA6MhyiL9dAAM9-Obc-","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytr_Ugx4cNRNExDV0njv4Nl4AaABAg.9d8cdo3KptW9d8erQ_pU_B","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
```
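Since the model returns one JSON array per batch, finding the record behind a "Coding Result" panel amounts to parsing that array and matching on the comment ID. The sketch below shows one way to do this with the standard library; `lookup` is a hypothetical helper, and the two embedded records are abbreviated copies of entries from the response above, not the full batch.

```python
import json

# A truncated stand-in for the raw model response: a JSON array of
# per-comment coding records (two illustrative entries shown).
raw_response = """
[
  {"id": "ytr_Ugw7GIvHI3EjVlPWLgV4AaABAg.9dB9fSdomW89dH_Ltmgmm2",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgxuiwOP1PZyFUuBnhp4AaABAg.9d9IOPdJ-3O9dEA7c25azm",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "ban", "emotion": "fear"}
]
"""

def lookup(records, comment_id):
    """Return the coded record whose "id" equals comment_id, or None."""
    return next((r for r in records if r["id"] == comment_id), None)

records = json.loads(raw_response)
hit = lookup(records, "ytr_Ugw7GIvHI3EjVlPWLgV4AaABAg.9dB9fSdomW89dH_Ltmgmm2")
print(hit["responsibility"], hit["emotion"])  # ai_itself mixed
```

A record that matches would then populate the four dimensions shown in the Coding Result table; a `None` result flags a comment the model skipped or an ID mismatch in the batch.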