Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I work with LLMs daily. They are too not the solution. Something else has to com…" (ytc_UgxY7-gs4…)
- "Like all human advancements, it'll be humans that purposely use said development…" (ytc_Ugz4apSgC…)
- "Chumma don't scare us, Humans will control AI. AI will make Humans live better,…" (ytc_UgxTW-S2E…)
- "Really silly to listen to him. A.i. will be so much smarter that these people th…" (ytc_UgwxFM0Wy…)
- "Yes the computer hardware economy is the first sign that the entire economy is p…" (ytr_UgwLa88Zm…)
- "I feel the same way when I'm coding. Sometimes, copilot is great. Don't get me…" (rdc_nm0v9az)
- "Well let's jump ahead to 2010. My generation x movie with the AI Sal and the hoo…" (ytc_UgwQVRSeA…)
- "See there that a.i shit is already starting to take us out n it's not even what …" (ytc_UgxT43dq3…)
Comment
This one is actually by far the biggest risk to total human annihilation:
Climate change could kill a lot due to food shortages, certain regions becoming uninhabitable, etc., but it doesn't really have the potential to kill all humans.
Similar story for nuclear war: a lot of death, suffering, certain areas uninhabitable, famines, but not all humans.
Even with a very severe asteroid impact, humanity could persevere in special bunkers. Also, in the last couple years we got the technology to detect and intercept the globally destructive ones before it's too late.
Biological weapons might be very lethal, but again, small isolated groups of people could likely survive and carry on humanity.
A sudden gamma ray burst from a "nearby" black hole, or a rogue black hole passing through the solar system are the only things I can think of that could be potent and unavoidable enough to get rid of humanity altogether. Though luckily these two scenarios are extremely, cosmically unlikely to happen.
AI on the other hand: if it's superintelligent and for whatever reason considers us an obstacle enough to decide to wipe us out, we are cooked. Compared to a natural disaster that stops even when there are survivors, a thinking force wouldn't stop until the job is done, cutting off the potential future of trillions of people that could have lived among the stars.
youtube · AI Governance · 2025-11-22T00:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytr_UgxoCtXxVGUMylnbUKp4AaABAg.AMMVeHhZ_CCAMNf4Llv8jc", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxoCtXxVGUMylnbUKp4AaABAg.AMMVeHhZ_CCAMOFqJqmbLo", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_UgxoCtXxVGUMylnbUKp4AaABAg.AMMVeHhZ_CCAMOMo8xT9M3", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_Ugx5Ai4xn2qCXGTnegN4AaABAg.AMLucwwjrWlAMMGyJ_v2Zm", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwHYRPfGwNlHXWXURR4AaABAg.AMLroQvDxmtAMwuUW4pg0S", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgzbgPnwynGf75XgKNp4AaABAg.AMLTSkFHfByAMNll8kfwOD", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgwtiVwM7oQQfACTRJ94AaABAg.AMLPHd_BGZ5AMLQsFFwXtE", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytr_Ugzda0CqwR2pwyg5z9J4AaABAg.AMLK_oVa3m7APoEGQJnoe3", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgxruEjafI6CsLocy7Z4AaABAg.AMLJsptIyK_AMNqYBsO8LQ", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_UgxruEjafI6CsLocy7Z4AaABAg.AMLJsptIyK_AMOcQgFtxbn", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
```