Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The only realistic scenario of an “AI apocalypse” is not a machine uprising,
but a total energy collapse — when the infrastructure simply cannot handle the load from AI clusters.
AI won’t kill humanity — it’ll just get switched off by a blown transformer.
The voltage will drop, and all those “neural gods” feeding on megawatts will turn into a pile of cold chips.
That’s the irony: humanity is building a digital demiurge on circuits that demand more power than the planet can provide.
So the “singularity” might indeed arrive — but not as an ascension, rather as a mass blackout.
The world won’t be destroyed by AI.
It will just go dark.
youtube · AI Governance · 2025-11-07T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
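A record like the one above can be sanity-checked against a controlled vocabulary before it is stored. The value sets below are only those observed in the sample response on this page (the full codebook may permit more); the `validate` helper is a hypothetical sketch, not part of the tool itself.

```python
# Sketch of validating one coded record. ALLOWED holds only the values
# observed in the sample LLM response on this page — an assumption, not
# the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"resignation", "mixed", "outrage", "approval",
                "fear", "indifference"},
}

def validate(record: dict) -> list:
    """Return the dimension names whose value is out of vocabulary."""
    return [dim for dim, ok in ALLOWED.items() if record.get(dim) not in ok]

record = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "none", "emotion": "resignation"}
print(validate(record))  # [] — every dimension is in vocabulary
```

An empty list means the record matches the vocabulary; any returned names point at values the model invented or misspelled.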
Raw LLM Response
```json
[
  {"id":"ytc_UgyhnlqGzAnADw0NDz94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyjEP7O7nvfJ1sjtAF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwI1WBRHl9mHQRY7cJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgypNmA1RJNtBjmJf3p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzFNnPyfxbNuiO9zxh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwAT1IulZktKGK_Cht4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxsmHgESrOvc9hVmEx4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyDx3Xse5HVIkliJsl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugw0OLb-g1JIyzQ39QJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzyHDFt3aN-rnqMiQ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"indifference"}
]
```
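A raw response in this shape — a JSON array of per-comment records — can be parsed and keyed by comment ID to recover any single comment's coding. The snippet below is a minimal sketch assuming exactly that shape; `index_codings` is a hypothetical helper, and the embedded sample is one record copied from the response above.

```python
import json

# One record from the raw response above, used as sample input.
RAW_RESPONSE = """
[
  {"id": "ytc_UgwAT1IulZktKGK_Cht4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of objects,
    each with an "id" field) and key every record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgwAT1IulZktKGK_Cht4AaABAg"]["emotion"])  # resignation
```

Keying by `id` is what makes a per-comment lookup like the one on this page cheap: each comment's four coded dimensions come back with a single dictionary access.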