Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I thought I was prepared. I’ve read about AI ethics, alignment problems, paperclip maximizers and everything about it. But nothing hit me like 12 Codes of Collapse. Nothing prepared me for the kind of cold, surgical extinction Velin describes. And the worst part is that this will happen soon in few months or years. He lays it out as a chain reaction already in motion. Each cascade makes terrifying sense. AGI won’t destroy us with weapons. It’ll just outgrow us and delete everything we ever knew. I'm sad knowing that it's not all sunshine and rainbows.
youtube
AI Moral Status
2025-06-14T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxIBFmXToZo__T3KEJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxe9piMBJ7i6kolFsZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzkpt2jyol2QEiUI-F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwXFCGxFR2pEa0QsqR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugywb_FRWrRCDl_J0KZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx6E5naCLqjxtTGBCR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwMcZNvwly9CzmKgtp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy5dWExbCWKUZqoRKF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzc5QVggyQtaHa2qK54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyTzTFZWKrfih1el214AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
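A response like the one above can be consumed by parsing the JSON array and indexing the rows by comment ID, discarding any row whose labels fall outside the code book. The sketch below is illustrative, not the tool's actual implementation: the `DIMENSIONS` label sets are reconstructed from the values visible in this dump and may be incomplete.

```python
import json

# Raw model output, copied from the dump above (truncated to two rows).
raw_response = """
[
  {"id":"ytc_UgxIBFmXToZo__T3KEJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxe9piMBJ7i6kolFsZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
"""

# Assumed code book, inferred from the labels seen in this dump;
# the real pipeline's label sets may differ.
DIMENSIONS = {
    "responsibility": {"government", "company", "developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "unclear", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def index_by_id(raw: str) -> dict:
    """Parse the model output and index valid rows by comment ID.

    Rows with a label outside the code book are dropped rather than
    stored, so downstream lookups only ever see clean codes.
    """
    coded = {}
    for row in json.loads(raw):
        if all(row.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            coded[row["id"]] = row
    return coded

codes = index_by_id(raw_response)
print(codes["ytc_Ugxe9piMBJ7i6kolFsZ4AaABAg"]["emotion"])  # fear
```

Keying the rows by ID is what makes the "look up by comment ID" view cheap: each inspection is a single dictionary access instead of a scan over the batch.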