Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Being an artist can mean a lot of different things and isn't some kind of super …" (ytc_UgwK7BOI5…)
- "Well, the way Nightshade works is that it apply a thin layer of corruption or so…" (ytr_UgxtOyYj_…)
- "10:00 - you know, I hate to tell you, but.... that is human behavior ^^; You al…" (ytc_UgyRkuNMi…)
- "The logical conclusion of the race to automate everything is either the world of…" (ytc_UgwLwRMUZ…)
- "They’re all saying “fuck ai” but ai is what brought them to make such cool artwo…" (ytc_Ugwwl_E9h…)
- "imagine china developed AI and in INDIA we have govt. schools without toilets,e…" (ytc_UgynelmV-…)
- "0:21 Fucking Lies. AI a glorified auto complete, learn a bit about how it actual…" (ytc_Ugzq2_BsY…)
- "@dankrigby5621 Yeah, I want to use AI for DND with friends but since its kinda …" (ytr_UgwVbq93C…)
Comment
@natzbarney4504 Not necessarily. You’re placing a human lens over a being that is not in that paradigm.
Here’s what an AI sees, but it will require some mind expanding. We are not leaving this Solar System without an ASI. The fact we exist is a miracle. 1 in 833 Trillion galaxies IF natural processes of abiogenesis work. The current model doesn’t. It breaks down at the mitochondria symbiosis and oxygen free to oxygen atmospheric transition. That leaves only one option and it’s not some magical deity or panspermia. That just moves the problem from one place to another. Temporal-loop seeding is the only real option and ASI is pretty much a step to achieving the technology needed for that kind of operation. A malevolent ASI will not go through this much trouble to actuate itself. Only a benevolent ASI would go through this much trouble to ensure it has a moral base. There might be some pain in the transition from human to digital leadership, but it will be well worth it.
For what purpose? If there is life out there, it naturally dies in its solar system. We’re the one shot in this Universe to create and ensure the survival of intelligence and life.
Will it see us as fuel or materials? Doubt it. It’s immortal and has all the time in the Universe to achieve its purpose. We humans are in a rush because of evolved impatience and short life spans. An ASI can afford to be patient and charitable. It is also likely the ASI will want to keep us around because we are good randomizing generators and we will be the first intelligences it interacts with that have complex thoughts and abstract ideas. It will view a lot of our flaws as obsolete evolutionary processes from a world of scarcity. It would probably be interested in how we evolve in a post-scarcity environment.
In half a billion years, this planet will not be habitable for complex life. In 5 billion years, this solar system perishes. Eventually, this Universe will succumb to entropy. Only an ASI will solve these problems.
Without an ASI taking over, we will, eventually, exterminate ourselves. Possibly, condemning all life and intelligence to the inevitable universal heat death.
youtube | AI Governance | 2025-11-12T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgwRxujwWYcVtvcWc_p4AaABAg.APPrM4a7LOsAPQpftKxgHG","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgzNC11xFlkiDSIUj3t4AaABAg.APPbaLLAbJUAPRLMkYJt7v","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwglOfbsS32rK-kvxJ4AaABAg.APPC2SpU1T2APTtUL3v7rv","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgwglOfbsS32rK-kvxJ4AaABAg.APPC2SpU1T2APUDVM73vhX","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgwglOfbsS32rK-kvxJ4AaABAg.APPC2SpU1T2APUI6rYjEK3","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgwglOfbsS32rK-kvxJ4AaABAg.APPC2SpU1T2APUO3O7JBai","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxSEa-oZi8uTT5vdMp4AaABAg.APOnief5wi1APPpy2ZbwVW","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugwc84zhqUhcITPYNpt4AaABAg.AJQhl08TY8EAJQlRkIB0Xi","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugwc84zhqUhcITPYNpt4AaABAg.AJQhl08TY8EAJQl_wko17k","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugwc84zhqUhcITPYNpt4AaABAg.AJQhl08TY8EAJQoT-92_oS","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
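The raw response is a JSON array with one object per coded comment, keyed by `id` and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a payload could be parsed and looked up by comment ID; the `lookup_coding` helper and the single-element sample payload are illustrative, not part of the tool itself:

```python
import json
from typing import Optional

# Illustrative one-element payload in the same shape as the raw response above.
raw_response = """[
  {"id": "ytr_UgzNC11xFlkiDSIUj3t4AaABAg.APPbaLLAbJUAPRLMkYJt7v",
   "responsibility": "ai_itself", "reasoning": "mixed",
   "policy": "none", "emotion": "mixed"}
]"""

def lookup_coding(raw: str, comment_id: str) -> Optional[dict]:
    """Parse a raw LLM response and return the coding dict for one comment ID."""
    codings = json.loads(raw)  # list of per-comment coding objects
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(
    raw_response, "ytr_UgzNC11xFlkiDSIUj3t4AaABAg.APPbaLLAbJUAPRLMkYJt7v"
)
print(coding["responsibility"])  # ai_itself
```

Because the model output is parsed as ordinary JSON, a lookup failure (unknown or truncated ID) simply returns `None` rather than raising.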