Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Pluto is passing through Aquarius for the next 20 years. Have a look at both Plu…" (ytc_UgxtwVvU7…)
- "Its just souless, also ai almost never actually makes exactly what you want, its…" (ytc_Ugwhp16Np…)
- "I dont know but sam has something off in this video and i noticed a big cut in t…" (ytc_UgzCVRQTO…)
- "You can argue against AI and lament and you are right so in doing it. But the bo…" (ytc_Ugxs8_X1r…)
- "So...be wasteful and stupid and still get paid for your errors? Where do I sign …" (rdc_ckq6i9m)
- "You can recreate it using the AI and avatar noted on screen. Answers will vary, …" (ytr_UgyKe7KRd…)
- "I'm one of the human's who absolutely doesn't trust Ai and I know it's not 100% …" (ytc_UgwHzUUS7…)
- "It would be nice to have an international treatise requiring the development of …" (ytc_UgyfnBJ2M…)
Comment
Ezra Klein does a great job of critiquing this guy's arguments. There is a gap here between AI can become misaligned and AI will kill us all. Huge gap. More likely, in my opinion, is that humans will use AI to do very bad things. That's why we should be concerned that militaries and powerful corporations are racing to build super intelligent AI.
youtube · AI Governance · 2026-01-06T02:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzesVC2fNbzorJKwEZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy3xQz8zKD8ApbXqrB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyx48xRodopeGKrvFh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzqugwni3NHFQBVHFJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz_giddtszVAZIAyeV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwo8TctWPQa3lB_whx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy6vSx83zfa_kPliLF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxqStuTdkR57OWbGiB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz0PS3HBIl8EDk2F354AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz7bkNKsymd12Igu4B4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
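The lookup-by-ID view above can be sketched in a few lines: parse the raw LLM response (a JSON array of per-comment codes) and index it by comment ID. This is a minimal illustration, not the tool's actual implementation; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown above, while the variable names are assumptions.

```python
import json

# Hypothetical sketch: a raw LLM coding response, as in the batch above
# (abbreviated here to one entry for readability).
raw_response = """
[
  {"id": "ytc_Ugz7bkNKsymd12Igu4B4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Parse the JSON array and build an index keyed by comment ID.
codes = json.loads(raw_response)
by_id = {row["id"]: row for row in codes}

# Look up the coded dimensions for one comment ID.
code = by_id["ytc_Ugz7bkNKsymd12Igu4B4AaABAg"]
print(code["responsibility"], code["emotion"])  # → user fear
```

Indexing once into a dict makes each subsequent ID lookup O(1), which matters when inspecting individual comments out of a large coded batch.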