Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Don't worry we will not need drivers in long run, not because AI will replace th…" (ytc_UgzDOHhxX…)
- "I went to MIT and saw all these guys in their pajamas as well. Many come to MIT …" (ytc_UgyDZ4vyq…)
- "Glad you found it amusing! The interplay between Sophia and the AI highlights so…" (ytr_UgxvSVS3Y…)
- "I am gonna be real. I don’t feel sorry for them. Because the artist agreed and d…" (ytc_UgzJ7OJmw…)
- "I don't know if it's just me, but the AI arts now became a sore of eyes.…" (ytc_Ugyi1k4Ui…)
- "What do Elon Musk's kids have to worry about? he's the richest man in the world.…" (ytc_Ugzp0dvbb…)
- "What does poisoned mean? I assumed it was going to be deliberate anatomical mist…" (ytc_Ugxzh8szE…)
- "You realize that I said some of these right. Not all of them were developed …" (ytr_UgyFM8ck-…)
Comment
> The real issue is that you can't even tell them not to use the destroy the world button in their brain.
> You can't even tell them it's there.
> because if you tell them not to use it, all you've done is introduce it
> if you tell them to destroy it, they have a broken thing in their brain
> if you tell them to never even think about it, they have a little hole in their brain that will niggle at them
> If science fiction has taught me anything about this sort of thing is that it's basically impossible not to accidentally incept "Destroy the World" into a robot brain.

youtube · AI Moral Status · 2025-11-03T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
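A table like the one above can be produced directly from one coded row of the model's JSON output. This is a minimal sketch (the function name and the fixed dimension order are assumptions, not the tool's actual implementation):

```python
def coding_table(row: dict, coded_at: str) -> str:
    """Render one coded row as a markdown dimension/value table.

    `row` is a single object from the raw LLM response; `coded_at`
    is the timestamp recorded when the row was coded.
    """
    lines = ["| Dimension | Value |", "|---|---|"]
    # Fixed order matching the four coding dimensions shown above.
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        lines.append(f"| {dim.capitalize()} | {row.get(dim, 'unclear')} |")
    lines.append(f"| Coded at | {coded_at} |")
    return "\n".join(lines)
```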
Raw LLM Response
```json
[
  {"id":"ytc_UgwKBnOek438mAagMAd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzxYQRVAegFgHXg7Xx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy1Uh_2A6Hmqz2zX3N4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgyWlZdsdRzUsOyBErZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyhMjYKq1Cxw9NDepx4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxRPbCIL-qFBAmgtih4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw6tbpjSp5ybqmD2ON4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgzOZfFGG5Nz-yNf8cx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxNbi0qF58Lo_Arj2B4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzuuXOlamvh4ku8XWV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"}
]
```
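A raw response like this has to be parsed and checked before its rows land in the coding table. The sketch below is one plausible way to do that, assuming the response is a JSON array of objects with an `id` plus the four dimensions shown above; the allowed values are inferred from this sample and the real codebook may contain more categories:

```python
import json

# Allowed values per dimension (assumed from the values visible in this
# sample response; the full codebook may define additional categories).
SCHEMA = {
    "responsibility": {"developer", "company", "government", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only rows that carry a
    comment ID and stay inside the assumed value sets for every dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not row.get("id"):
            continue  # a row without a comment ID cannot be joined back
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid
```

Dropping off-schema rows (rather than coercing them to `unclear`) makes schema drift in the model's output visible instead of silently absorbing it.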