Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples

- "… Speaking of natural resources, a large hyperscale data center can use 1–5 m…" (`ytr_UgyzaOk-2…`)
- "This is real, because the only way humans can ever constructively use blackhole/…" (`ytr_UgzSpcUOU…`)
- "Imagine this: You fell in a coma and the circumstances brought you to wake up wi…" (`ytc_UgyQTnrwC…`)
- "this is mayority of a way to much of a strech, first of all, if there is a AI co…" (`ytc_UgyfZOkvE…`)
- "That's a thought-provoking perspective! The evolution of life and technology is …" (`ytr_UgxlEMN4X…`)
- "But why should I bother reading the blog of that company then. I could just ask…" (`ytc_UgyXLIEX1…`)
- "If you're selling AI art just say its AI. If people want it they can buy it or n…" (`ytc_Ugwf1K-mW…`)
- "@86fifty The problem with this video is that he is assuming any form AI use is a…" (`ytc_Ugyrrre54…`)
Comment

> If you make superinteligence - you play God. And God doesn't come without the Devil. If AI has a built in conscience, it may have flaws. If it get's it by himself, he might have dilemma, to be good, or bad. If it is so, than it's going to have philosophical thoughts. If it will somehow think that there are no absolutes - AI will go full 'retard' mode and be most inteligent Luciferian thing in the known universe. Who ever makes it - will end humanity 100%. I thought it through like in 10 years or so. Bet it will be fun to watch the world self destroy. This is the "last judment" and you don't need to be a prophet to see this.

youtube · AI Governance · 2025-09-15T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyxHi-gnFLnlhSX_9N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyVLAf7VVXQIus73Dd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz4ZFKeh5HkO6TsJO54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz1TUdh5L5zuW_6hAJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxZGF0m8k3lAqOT8WF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwL_4wIDfCElEWoPJ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwNZUDj5BFzj1rWA9N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzYKdXmOuAIAtgpJSd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyZBKaVqBgXXeTPeUp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzLzVVp-pUBhex3KgF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```