Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
When an AI can, on it's own, create an AI better than itself - I think there's n…
rdc_kk2p6ks
Artificial Intelligence or AI will take over a lot of jobs, done so far by human…
ytc_UgykAIQlK…
Thank you. I’m a long-time listener and really value your curiosity, passion, an…
ytc_Ugz_SVHba…
How do you define thinking?
It’s clear AI can learn and make neutral connection…
rdc_mzvxxwo
You never talked to ChatGPT, did you? This is entirely on par with what neural n…
ytr_UgzxZhqbk…
I have a strange feeling that more and more women will become involuntary single…
ytc_Ugwyl90EP…
Ai might not be able to create a different original like that but if trained on …
ytc_UgzWpuBbD…
I love using AI, but developers need put in harsh safety measures in place to pr…
ytc_UgxBDQ5f7…
Comment
Ezra has a point about humans being in continual negotiation with AI. But he's contemplating a widespread effort with the spectrum of mankind having representation at the table. Peter Thiel on the other hand doesn't seem to want to bother with most of the human race, and in fact has publicly referred to Yudkowski as "the antichrist." And these are the people who will be doing all the negotiating with AI.
youtube
AI Governance
2025-10-16T02:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxYZZUWf1e0BmiKVjB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugyj61IC9y4O1eajFIx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugw3j7ix_m4O6fjeX954AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy4-TMZuxbJYngQ8Mx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw58XvKpbBYlzWFchJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyGHqp3D-7GTb5h_Id4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwzjuhdXejZ9g-vYPJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxEeUhfSG0D_ImVweV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzohPviAeIyf6Vgdm54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxjhUMpviZsQdzeHKJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
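A raw response like the one above can be checked before its rows are accepted into the coded dataset. The following is a minimal sketch, assuming the allowed value sets inferred from the labels visible in this sample (they are assumptions, not the pipeline's authoritative coding scheme); `validate_batch` is a hypothetical helper name.

```python
import json

# Assumed coding scheme, inferred from the values visible in this sample.
# The real pipeline's schema may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "approval", "resignation", "mixed",
                "indifference", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with an id and
    on-scheme values for every coding dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # cannot join back to a comment without an id
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_UgxEeUhfSG0D_ImVweV4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"regulate","emotion":"fear"}]')
print(len(validate_batch(raw)))  # → 1
```

Dropping off-scheme rows rather than repairing them keeps the coded table clean; rejected rows can be queued for a retry prompt instead.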