Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgyYM7bIe…`: Elon Musk: AI is dangerous for humanity / Also Elon Musk: *gives Thompson submachi…
- `ytc_UgyjuPPvS…`: I love that Ai and its generation reminds me of the "Heart of Gold" drive in Dou…
- `ytc_Ugyh5x01k…`: I beta read someone's work for nothing. In fact I've offered it to several peopl…
- `ytr_Ugw5XQ1jo…`: Seems to me that using current A.I. law to unravel or make more specific, "work …
- `ytr_UgzNeRh6S…`: @laurentiuvladutmaneaMidjourney is a tool. Firefly is a tool. DallE is a tool. R…
- `ytr_Ugy_u-GjL…`: true I'm an artist too / it's chill / the part that is annoying is when people claim…
- `ytr_Ugz0h-nDG…`: "but if AI can improve enough, maybe I can do it myself." are you REALLY doing i…
- `ytc_UgyKIG71C…`: "We're all racing toward the cliff, looking at each other asking, 'Why aren't we…
Comment
> This AI thing is scary but the thing about 12 Codes of Collapse is that it doesn’t just scare you. It changes how you look at everything. Every news headline, every AI tool launch, every global incident — it all starts to feel like echoes of the book. And you can’t tell anymore if Velin predicted the future or if we’re just slowly catching up to what he already knew. And that realization… that’s the part that truly haunts you.

youtube · AI Moral Status · 2025-06-13T21:0… · ♥ 1053
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
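The "Coding Result" table above is a straightforward rendering of one coded record. A minimal sketch of such a renderer, assuming a record with the four dimension keys shown in the raw response (the function name and signature are illustrative, not taken from the actual tool):

```python
def render_coding_result(code: dict, coded_at: str) -> str:
    """Render one coded record as the markdown table shown in the inspector.

    `code` is assumed to hold the four dimensions used in the raw LLM
    response: responsibility, reasoning, policy, emotion.
    """
    rows = [
        ("Responsibility", code["responsibility"]),
        ("Reasoning", code["reasoning"]),
        ("Policy", code["policy"]),
        ("Emotion", code["emotion"]),
        ("Coded at", coded_at),  # server-side coding timestamp
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {label} | {value} |" for label, value in rows]
    return "\n".join(lines)


example = {
    "responsibility": "ai_itself",
    "reasoning": "consequentialist",
    "policy": "unclear",
    "emotion": "fear",
}
print(render_coding_result(example, "2026-04-27T06:24:53.388235"))
```

Running this on the example record reproduces the table layout above row for row.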
Raw LLM Response
```json
[
  {"id":"ytc_UgxIBFmXToZo__T3KEJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxe9piMBJ7i6kolFsZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzkpt2jyol2QEiUI-F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwXFCGxFR2pEa0QsqR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugywb_FRWrRCDl_J0KZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx6E5naCLqjxtTGBCR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwMcZNvwly9CzmKgtp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy5dWExbCWKUZqoRKF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzc5QVggyQtaHa2qK54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyTzTFZWKrfih1el214AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
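The raw response is a JSON array of per-comment codes, each keyed by a comment ID, so the "look up by comment ID" view can be backed by a simple index. A minimal sketch, with only the first two coded rows reproduced and with illustrative (not actual) function and variable names:

```python
import json

# Excerpt of the raw LLM response above: a JSON array of coded comments.
RAW_RESPONSE = """
[
  {"id":"ytc_UgxIBFmXToZo__T3KEJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxe9piMBJ7i6kolFsZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
"""

# Keys every coded row must carry, per the schema visible in the response.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codes(raw: str) -> dict:
    """Parse a raw coding response and map comment ID -> coded dimensions."""
    index = {}
    for row in json.loads(raw):
        if REQUIRED_KEYS - row.keys():
            # Skip malformed rows instead of crashing the inspector view.
            continue
        index[row["id"]] = {k: row[k] for k in REQUIRED_KEYS - {"id"}}
    return index


codes = index_codes(RAW_RESPONSE)
print(codes["ytc_Ugxe9piMBJ7i6kolFsZ4AaABAg"]["emotion"])  # fear
```

Skipping malformed rows rather than raising keeps a single bad row in the model output from taking down the whole lookup, which matters when the LLM occasionally drops a field.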