Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
And here we go again. Just because it's a new technology doesn't mean people hav…
ytc_UgxjFmnE2…
The weakest spot in this is that AI's don't vote (yet?), so representative shoul…
ytc_UgwJeOGs6…
Weird theory but bear with me: If Suchir had some underlying reason to commit su…
ytc_Ugz0bOjPg…
Remember folks, if someone tells you that AI art is better than actual art, draw…
ytc_UgxSS4xpy…
The big beautiful bill prevents any regulation of AI for 10 years. Throughout th…
ytc_UgxJOnPDe…
I’m an amateur artist but I’m still slowly improving. I’ve considered using AI t…
ytc_UgzUAoiYi…
I could write a book on the complete disregard of basic employee rights by Amazo…
rdc_grlpf6l
I laugh at the fact that people get so antsy over AI. Show me a machine that plu…
ytc_UgwvruGaS…
Comment
the ridiculous thing to me about this is that super intelligence should not be posited as the threat. like, yes, obviously, if anyone develops that we all die
but i suspect we are all dead if any of these systems, even non-AGI systems, develop any amount of real autonomy, we are all dead
i mean, a few hackers in one building in russia basically tore apart the entire electoral system in the US by trolling on FB. the reality is, we want to kill each other
we don't need some fckn super-intelligence at all. just a version of ChatGPT that one day tells people there is an outgroup and they need to kill that outgroup
youtube
AI Governance
2026-02-08T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxuSEuDMEC0tjiJE9V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyTCxNwGq_lgqDh3Kx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyAx2Qpr6NczK02Snl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxNxLLZ3dY_a9Gt6Dp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzinWPBk9jl7p9eQvZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxX6Vu6aMBrr_4GDv14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzdCq2DGG1FE1ibytx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwWW2Il3p8Faim5A6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzsHTOyhAvQG85uZP54AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx849mvh7WM2eMtkQ54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"approval"}
]
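The raw response above is a JSON array with one row per comment, each row carrying the four coding dimensions. A minimal sketch of how such a response could be parsed and sanity-checked before the rows are stored — the allowed labels below are inferred from the values visible in this section, not a confirmed codebook, and `validate_coding` is a hypothetical helper:

```python
import json

# Labels observed in the raw response above; the real codebook
# (assumption) may define additional categories.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "company", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose labels
    match the schema and whose IDs look like comment IDs."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        labels_ok = all(
            row.get(dim) in allowed for dim, allowed in SCHEMA.items()
        )
        # Comment IDs in this dump are prefixed ytc_ (YouTube) or rdc_ (Reddit).
        id_ok = str(row.get("id", "")).startswith(("ytc_", "rdc_"))
        if labels_ok and id_ok:
            valid.append(row)
    return valid
```

Rows with an out-of-schema label are dropped rather than repaired, so a malformed model output surfaces as a shorter result list instead of a corrupted record.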