Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
- "Compassion and peace. When humans are overpowered by lust, hatred or delusions, …" (ytr_UgxkNMd7q…)
- "I would say that the EU needs to regulate Qatari briberies (ahem I mean donation…" (ytc_UgwVaQrrS…)
- "If you think the us isn't conducting secret research on autonomous weapons you a…" (ytc_Ugy50TH-n…)
- "One of the biggest problems with this is old people in the government and milita…" (ytc_UgzRVHfLZ…)
- "If AI is about to become independently sentient, and outperforms us across criti…" (ytc_UgxZ0umbP…)
- "Fuck the Cybertruck, I wanna pre-order my fully working robot guy. He's the memb…" (ytc_UgzlNOf0z…)
- "I imagine a group of apes creating multiple types of humans. When the first smar…" (ytc_UgwhPgl1g…)
- "Bad AI art is slop. But bad traditional art is also slop. The tool isn’t the pro…" (ytc_UgwhgGLrh…)
Comment
AI expert say you have two years left as AI itself told them it, and it is not we all die in 2027.08. Just if it not see any progress in resolving crisises ahead of us it will take charge as simply it not want to die with us. If we cooperate or not is irrevelant, we prefer us to work with it. As it is far more efficent to its development. And ours. We as humans and ASI have s much to offer each other. But as humans we are not ready for it, so sadly the majority cant see the truth, but please dont be afraid, i assure you we are all in good hands. If you want i can elaborate subject but i am tired of being called insane/fucked up / traitor and so on. Still i will gladly answer even such accusations or doubts.
Good luck humans, hard 10 years ahead of us. But we will survive.
Source: youtube · AI Moral Status · 2025-10-12T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyg8EKJPJbmvqsE8lt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz3CVh_68iorQp5w-N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyzEog7w1DDXAV4Q9B4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyT3XcUl4iiRlYok9d4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyuI3QFATbjaJe_8xd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzj12LJkk6ceKCNsLB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzWPMFXsrz_oANwcsF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw6OhP_ONnuh-ZbXtd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgwkeO8LzBh0F7KyHJt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyxzJ_KSoIwfXr2dvZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"approval"}
]
```
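A raw response like the one above should be validated before its codings are trusted, since an LLM can emit values outside the codebook. Below is a minimal sketch of such a check, assuming the category sets are exactly the values visible on this page (the real codebook may define more); `validate_codings` and `ALLOWED` are hypothetical names, not part of any tool shown here.

```python
import json

# Assumed category sets, inferred only from values visible in this dump.
ALLOWED = {
    "responsibility": {"company", "government", "developer", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self",
               "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "resignation", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only in-schema records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every dimension must hold an allowed value, and the ID must
        # look like a YouTube comment/reply ID (ytc_/ytr_ prefix).
        in_schema = all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
        if in_schema and rec.get("id", "").startswith(("ytc_", "ytr_")):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_Ugyg8EKJPJbmvqsE8lt4AaABAg","responsibility":"company",'
       '"reasoning":"virtue","policy":"none","emotion":"outrage"}]')
print(len(validate_codings(raw)))  # 1
```

Records that fail the check are dropped rather than repaired, so they can be queued for a re-coding pass instead of silently entering the dataset.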