Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
AI is no where near AGI. It’s overhyped currently because there’s a lot of inves…
ytc_UgwSWnCXR…
This video says everything and you people don't do anything because you are all …
ytc_UgwndTu85…
Honest question: Meta, Google etc make money on running adds, right? So my quest…
ytr_UgwagqtOQ…
Bruh typical fearmongering. The current state of AI is just a spaghetti "monster…
ytc_Ugz60iDSS…
"They will help us, they will teach us, they will help us put the groceries away…
ytc_UgxdGs1QR…
Good that means the company owners will be richer the poor will be poorer withou…
ytc_Ugyz_G9Bh…
Trump said his voters are going to take the jobs that the immigrants did, the r…
ytc_UgyMw67LP…
Imagine simping over AI.... And the dude talking about changing hearts and minds…
ytc_UgyB5Ay8g…
Comment
You've identified a tragedy of the identity commons: Once everyone outsources their agency, individual identity's legal utility goes to zero. It's not that identity doesn't exist—it's that nobody cares who you are anymore, because "you" are just a shadow of the model.
Indemnity doesn't die; it re-centralizes. The question becomes: Who indemnifies society when the only agent left is the algorithm itself?
That's a question current law literally cannot answer, because it has no concept of autonomous non-persons. And that extinction of identity itself, not the philosophical "AI Terminator" abstraction, is the real civilizational breakpoint. Collapse is imminent and inevitable.
youtube · AI Governance · 2025-11-14T00:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxq77RqxhqonCeaATB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz10SmduLaUTyoC3e94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgztFDp9NaMemoVVsMZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"unclear"},
{"id":"ytc_UgwpxWBRwfB3Z6UYtOp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxlgKEXEjSXguQBcRx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxsDYGt5pnHuEcwADB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxRl-p9vVOUqtFNm554AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxmROmulnnePKne1L54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwEi_7ke6mt_U6kqIF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzg9KKyHaOn5A1fsDV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
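The raw response above is a plain JSON array, so looking a coding up by comment ID reduces to parsing it and building a dictionary keyed on `id`. A minimal sketch (for brevity, `raw_response` below reproduces only the first and last rows of the ten shown above; the same code works on the full array):

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw_response = """
[
  {"id":"ytc_Ugxq77RqxhqonCeaATB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzg9KKyHaOn5A1fsDV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

record = codings["ytc_Ugxq77RqxhqonCeaATB4AaABAg"]
print(record["responsibility"], record["emotion"])  # ai_itself resignation
```

The first row is the coding shown in the "Coding Result" table above (responsibility `ai_itself`, reasoning `consequentialist`, policy `none`, emotion `resignation`), so the lookup reproduces that table directly from the model output.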