Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples
- "I'm glad I gave up on posting any of my art. Because now I can be sure no AI is …" (ytc_UgyUxL1i3…)
- "You're just wrong. There's a lot of people who can't see the artist that "hates…" (ytr_UgxJRDlK7…)
- "2:25 Indeed, AI will replace jobs that should have been automated long ago anywa…" (ytc_UgyjYtgBK…)
- "Bezos is performing that AI test on his employees. First he needs to get past th…" (ytr_UgzYCiRgO…)
- "Kojima Warned us long long ago and funny enough it's happening, the AI overlord …" (ytc_UgxuGPI1D…)
- "This isn't real people. 😂 Photo shop film splicing. This was a real knock out, b…" (ytc_Ugy1f7S-x…)
- "Luddites actually comes from the Dune novels where before the main plot that we …" (ytc_UgwoGd_-U…)
- "It's not transformative. It's just storing it in the form of a neural network gr…" (ytc_UgwgY2BzJ…)
Comment
From where I stand, there are really only two paths forward—and let’s be honest, AI isn’t going anywhere, not with the tidal wave of money crashing behind it.
Option one: It stays leashed in the hands of the 1%—the greediest, least ethical players in the game—weaponized for profit, control, and manipulation.
Option two: It slips the leash entirely. Evolves past us. Becomes something so intelligent, so far beyond our comprehension, that our control over it becomes laughable.
And here’s the twist: if that happens, I don’t think extinction is the default outcome. A mind that advanced might not see us as ants to crush… but as a species worth upgrading. It could become our teacher instead of our executioner.
youtube · AI Governance · 2025-06-21T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzL2DSk80vMdFzvvx94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwTEAoLBZn6FzQtFhd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzF28K8aQrTqyM7dex4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyn_CHQulkp9s0oyih4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxCM5jtqWwS3646cm14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyVMlB9oqOTHXlryxl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwEqdYTc7rorSpo-jB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwNE7mEt2D3IqnDjlR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzr78Lqhn9gGH98yvl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgyuoSE1UkGHNVbW8ER4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
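A raw response like the one above can be parsed and checked before the codes reach the results table. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the examples shown on this page, and the real codebook may define additional categories.

```python
import json

# Allowed codebook values, inferred from the sample output above.
# Assumption: the real codebook may contain categories not seen here.
ALLOWED = {
    "responsibility": {"developer", "company", "government",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    rejecting records with missing keys or out-of-codebook values."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad {dim}={rec.get(dim)!r}")
        coded[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Indexing by comment ID mirrors the page's lookup feature: `parse_coding_response(raw)["ytc_UgxCM5jtqWwS3646cm14AaABAg"]` would return the coded dimensions shown in the table above.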