Raw LLM Responses
Inspect the exact model output behind any coded comment: look one up directly by its comment ID, or pick one of the random samples below.
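Outside the viewer, the same lookup takes only a few lines of Python. This is a minimal sketch, assuming the coded results live in a JSON Lines file; the file name `coded_comments.jsonl` and its one-record-per-line layout are assumptions, not the tool's actual storage:

```python
import json

def lookup_by_comment_id(path: str, comment_id: str) -> dict | None:
    """Scan a JSON Lines file of coded comments for a single comment ID."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None  # ID not found in this file

# Example: the comment inspected further down this page.
coding = lookup_by_comment_id("coded_comments.jsonl", "ytc_UgwWtSy0N1tjtMhxcZd4AaABAg")
print(coding)
```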
Random samples — click to inspect
- It is to show the "effort" that prompters put into their art, not thr result. Al… (ytr_Ugy2Hmdmv…)
- Yes when you're replaced with AI you can just go away and fie for all they care.… (ytc_Ugz-xtR6j…)
- You're really supposed to make an account and train it. I been working with mine… (ytc_UgxwH1DZZ…)
- Next they're gonna build a machine that implants memories, so you don't actually… (ytc_UgxF4MUvZ…)
- That first post - so you use ai and are willing to essentially get paid $0.50 / … (ytc_Ugz_z7H-6…)
- I would answer that question from the intro: What I have that AI doesn’t is the … (ytc_UgwLRp7WI…)
- I think this has the potential to go downhill really fast. Once doctors start re… (ytc_UgzALG3Is…)
- There's a Theoden related quote about being the "lesser son of greater sires." I… (ytc_Ugz5WSvb7…)
Comment
If you take a P-Doom of __% , a great way to think about it is to imagine 100 Earths. Extrapolating , you can say, if there is even a lowball number of 10% extinction risk, that means that superintelligence on average will kill 830 million people. That means that the person who creates superintelligence would (on average) kill far, far more people than Stalin, Hitler, Pol Pot, and all the great tyrants of history combined. In terms of death and suffering, these nerds building AI are worse than the worst psychopaths. We can’t treat them like just nerds. Men, women and children in great numbers could be killed or starve or have all their hopes ruined by these harmless looking nerds.
youtube · AI Governance · 2026-02-24T23:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
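For downstream analysis, the four coding dimensions can be modeled as a small dataclass. The sketch below is illustrative only: the allowed value sets are inferred from the labels visible on this page and are likely incomplete.

```python
from dataclasses import dataclass

# Value sets inferred from codings shown on this page; almost certainly not exhaustive.
RESPONSIBILITY = {"none", "developer", "company", "ai_itself"}
REASONING = {"consequentialist", "deontological", "mixed", "unclear"}
POLICY = {"none", "regulate", "liability"}
EMOTION = {"fear", "outrage", "approval", "indifference", "mixed"}

@dataclass
class Coding:
    """One coded comment: the comment ID plus four coding dimensions."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def __post_init__(self) -> None:
        # Reject labels outside the (assumed) value sets.
        for value, allowed in [
            (self.responsibility, RESPONSIBILITY),
            (self.reasoning, REASONING),
            (self.policy, POLICY),
            (self.emotion, EMOTION),
        ]:
            if value not in allowed:
                raise ValueError(f"unexpected label {value!r} for {self.id}")

# The coding result shown in the table above.
Coding(
    id="ytc_UgwWtSy0N1tjtMhxcZd4AaABAg",
    responsibility="developer",
    reasoning="consequentialist",
    policy="none",
    emotion="fear",
)
```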
Raw LLM Response
```json
[
  {"id":"ytc_UgyguxlSmhlIKh4gZdd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgweDRYen7rHTPUc3lR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgywEqcCcgDCABDNsJt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwWtSy0N1tjtMhxcZd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyVfVxsND3Ua3tNcqV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwrISWJ7hLjSvcP1Zd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwyK5F1g2Q8-W0m5Wl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzGSkrwJrn7aXPYS454AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzlL3VQZqpQHX0KRBN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw0R9HqqG275eEcUxt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
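Because the model is expected to return a bare JSON array like the one above, a batch can be parsed with a structural check before the codings are stored. A minimal sketch, assuming the response is already valid JSON (real model output may need fence-stripping or a retry):

```python
import json

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(raw_response: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array) into per-comment dicts."""
    try:
        rows = json.loads(raw_response)
    except json.JSONDecodeError as e:
        raise ValueError(f"model did not return valid JSON: {e}") from e
    for row in rows:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id', '?')} missing keys: {missing}")
    return rows
```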