Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The thing is in reality AI is always evaluated. Killing someone to avoid a shut …" (`ytc_Ugw5le5_1…`)
- "your a cracked pot if you say one word to your chat bot or AI ever!! It is disg…" (`ytc_UgyHvzgnk…`)
- "I'm not afraid. Humanity will destroying itself, with or without the help of the…" (`ytc_UgyugmQdI…`)
- "THE GOVERNMENT SUPPORTS IT. THE GOVERNMENT WILL GIVE AI MORE RIGHTS THAN YOU. AI…" (`ytc_UgxFfAwuq…`)
- "Minor crashes that you mention are from waymos that are fraction of teslas drive…" (`ytc_Ugw-pW48t…`)
- "This guy understands the training routine better than anyone I've yet heard desc…" (`ytc_UgyJxsH41…`)
- "@musicinthemachine Your right that it can't stop me from creating. There's a lo…" (`ytr_UgzQ51Ckq…`)
- "We doesn't need to be training robot anymore to pay tax, let's create líderes th…" (`ytc_UgwRt_E51…`)
Comment
chapters 2-3 of the book were very informative, but the rest of it reveals more about the authors' unexamined biases about society and nature than it does about the future of ai/asi. it reads like someone who was so tired of hearing about roko's basilisk that they decided to start arguing that it will just kill us. I'm shocked you would give such a sensationalist, anxiety-inducing book a platform in these times, especially as someone who typically stays grounded within the realm of science and not speculation. You can't ever know the future. I love your stuff and have seen enough to know that you mean well, but it was irresponsible to give this book such a large platform in these dire times.
youtube · AI Moral Status · 2026-01-19T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyfae7JM2NFGpNqOph4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxda7UNUfMY8mjmRLZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyOYoOBz88wGknMTO94AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxno6iz7TcwSJe1DAJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwl0SqevS0B_tb85ZN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwbU53TwUJrNiuxsbd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwXqv1G7HsI1irMuhd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyrfnMcyG-h7tMBwQN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx5QLirjyv9R_UOmCl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx2InAft9y9NCnCfAZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
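A raw response like the one above can be looked up by comment ID once it is parsed and indexed. The sketch below is a minimal, hypothetical illustration: it parses the JSON array, validates each record against the dimension values that actually appear in this page's data (the allowed-value sets are inferred from the visible codings, not taken from an authoritative schema), and builds an ID-keyed dictionary.

```python
import json

# Allowed values per coding dimension, inferred from the codings visible
# on this page -- an assumption, not the tool's official schema.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records) into a
    {comment_id: coding} dictionary, rejecting out-of-schema values."""
    indexed = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
        indexed[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return indexed

# Usage with a single hypothetical record:
raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"}]'
codings = index_codings(raw)
print(codings["ytc_example"]["emotion"])  # outrage
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: each coded comment resolves to its record in constant time rather than by rescanning the raw response.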