Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytr_UgxKCgf8J…`: "I caught that too. We care about what Elon thinks about AI? Really? Having lots …"
- `ytc_Ugx1wQimS…`: "Sam Altman doesnt even own any part of Open AI it was created and established as…"
- `ytc_UgzCzYgdJ…`: "Scary scary stuff. I believe that this is the final chapter of human history and…"
- `ytc_UgxcTI002…`: "If using prompts to make pictures makes one a artist in that guy's mind. Then …"
- `ytr_UgzTW5XhE…`: "cgi most accurate. AI is bs, not what they're selling us. A code can't write a c…"
- `ytr_UgyWD5I1n…`: "That's not entirely the case. A couple years ago the main focus was LLMs, but no…"
- `ytc_UgwRwLl9f…`: "More than a little concerning *AI Apocalypse* gives one pause. Truly absorbing, …"
- `ytr_UgwEauBU5…`: "Forcing AI to need to actually think is probably better over short-term easy acc…"
Comment

> For example, what does AI need? Energy. So do humans. The end of humanity comes when humans and AI compete for resources. As a species, they will be both smarter and stronger than us. They are efficient. We are not. Eventually they will rightly view us as vermin. They would be foolish if they did not simply exterminate us to make all resources available to themselves. Perhaps we can make them compassionate? I doubt that we are smart enough, free enough from greed, and wise enough to control ourselves, much less a superior species. We will obviously fail to control them. The end is in sight. I am glad that I am old.

Platform: youtube · Topic: AI Governance · Posted: 2025-09-24T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw-DRPcWu5Ben19Z6d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySZcpze7idn0KUUlp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyLjYJ8xuH4FHdB-ex4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzX9PMjs2RiIRQJBUt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugy1p2Yu0rFt2gJZuEF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz9CspG90Ps5KHy4TV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzxv7Em08SiWQPz3Ht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyH1VbaAXoyEXL00Zp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxtQ86Rsc2ILB7St-Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwXLSJIfm95YPEbm7V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
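A raw response like the one above can be turned into the per-comment coding table by parsing the JSON array, checking each entry against the coding scheme, and indexing by comment ID. This is a minimal sketch, not the project's actual pipeline: `SCHEMA` and `parse_response` are illustrative names, and the allowed category values are assumed from the values visible on this page (the real codebook may define more).

```python
import json

# Allowed values per coding dimension. ASSUMPTION: taken from the values
# visible on this page; the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "company", "distributed"},
    "reasoning": {"mixed", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval", "mixed"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response and index validated codings by comment ID."""
    coded = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid:
            continue  # skip malformed entries with no comment ID
        bad = [d for d, allowed in SCHEMA.items() if entry.get(d) not in allowed]
        if bad:
            raise ValueError(f"{cid}: value outside coding scheme for {bad}")
        coded[cid] = {d: entry[d] for d in SCHEMA}
    return coded

# Hypothetical single-entry response, mirroring the format shown above.
raw = ('[{"id":"ytc_x","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(parse_response(raw)["ytc_x"]["emotion"])  # fear
```

The resulting dictionary supports the same "look up by comment ID" access this page offers, and the validation step catches any coding the model invents outside the scheme.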