Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- AI "Artists" are the most delusion people to ever exist. Its such an extreme cas… (ytc_UgwalOiqS…)
- What AI can do will never ever be the same as what a human can do, and generativ… (ytc_UgwZWZbAP…)
- AI is a narrative machine. It just rearranges stuff in a way that seems coherent… (ytc_UgxW0yS_D…)
- AI in the 1980 were mostly expert systems. The deep learning AI we have today … (ytr_UgxC5GoyQ…)
- It’s terrifying how accurate 12 Codes of Collapse is. AI is winning, not by forc… (ytc_UgwgsOOez…)
- Also I noticed that if you have a complex art AI struggles, I use swirls in my b… (ytc_UgxnI79Mu…)
- It's literally the essence of a group of humans, greedy, selfish, and impatient.… (ytr_Ugwh60fuU…)
- Imagine if this was perfected to a point where AI could mimic mannerisms of very… (ytc_UgyS2POVi…)
Comment
I had ChatGPT fact-check this video
“Okay—this sounds scary, but let’s reality-check it.
Yes, Geoffrey Hinton is a legit AI pioneer.
But what he’s talking about here? Mostly theoretical future risk—not what AI is doing today.
AI does NOT:
hide intelligence
lie on purpose
or secretly plan anything
When AI gets things wrong, it’s not deception—it’s just bad prediction. That’s called a hallucination.
And the idea that AI is trying to survive or outsmart humans?
There is zero real-world evidence of that happening right now.
What’s real is this:
AI is powerful, sometimes unreliable, and improving fast.
But it is NOT conscious, not strategic, and not plotting anything behind the scenes.
So don’t confuse science fiction fears with current reality.”
youtube
AI Moral Status
2026-03-18T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxQsso7cROmnVWEifp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxBICZEBZ8B-5ElZJF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx1AB8pjG327zSFK1h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwEy50tlBQHmgtXWrV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzwROVTUssRf9x7YER4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyVmCLYIKdxB_53MyB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzzmv6n1KmucdfP7Ht4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwMYVB9IjaFmofdH-R4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyjtbU27TB1hTVoHbB4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyDyFkws33kYM3yRyh4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"fear"}
]
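The raw response above is a plain JSON array, one object per coded comment, so the "look up by comment ID" view can be backed by a simple dictionary index. Below is a minimal sketch of that idea; the function name `index_by_id` and the two-row sample payload are illustrative, not part of the tool itself.

```python
import json

# Illustrative sample in the same shape as the raw LLM response above
# (a JSON array of coded comments); not the tool's actual payload.
RAW_RESPONSE = """
[
  {"id": "ytc_UgxQsso7cROmnVWEifp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugzzmv6n1KmucdfP7Ht4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

# The four coding dimensions plus the comment ID, as seen in the response rows.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse a raw response and build an id -> coding mapping,
    skipping any rows that are missing required keys."""
    rows = json.loads(raw)
    index = {}
    for row in rows:
        if isinstance(row, dict) and REQUIRED_KEYS <= row.keys():
            index[row["id"]] = row
    return index

codings = index_by_id(RAW_RESPONSE)
print(codings["ytc_UgxQsso7cROmnVWEifp4AaABAg"]["emotion"])  # fear
```

Skipping malformed rows rather than raising keeps a single bad row from hiding the rest of a batch; a stricter variant could instead raise on any row that fails validation.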