Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "We all know the world is headed there! And there are in fact benefits from surve…" (ytc_UgwzmNm2r…)
- "AI does not help to do my job faster. Actually, it slows it down. Yet, it helps …" (ytc_Ugx8B6bpZ…)
- "@Totally_not_a_pineapple with traditional art your literally just moving a penc…" (ytr_UgyHBgYUM…)
- "By win you mean drive OpenAI into bankruptcy? Then yes. They will win. Google ha…" (rdc_nsetqku)
- "But wait, it'll be some guy in his garage or basement that ignites the trigger f…" (ytc_Ugx3q3bis…)
- "People probably wouldn't hate those AI image generating morons that much if they…" (ytc_Ugyzx9qzW…)
- "@i34g5jj5ssx well the difference is this artist use their own effort and skill, …" (ytr_UgydyQ8Af…)
- "I dont understand how like half of these jobs could be replaced be AI 😭🤨…" (ytc_Ugw95txOV…)
Comment
Here is the question..
Likelihood of what unfolds IS an intuitive guess regarding super AI.
IS nuclear weapons innocent or is super AI innocent? Which is guilty? Which comes first in the future? The chicken or the egg?
WILL SUPER AI BE GUILTY AND CAUSE NUCLEAR DISASTER KILLING ALL BUT SOME THAT WILL SOON PERISH EVENTUALLY OR KEPT ALIVE FOR EXPERIMENTAL PURPOSES LIKE LAB RATS.
OR
WILL HUMANS CAUSE NUCLEAR DISASTER AND SUPER AI BE THE SAVIOR FOR SOME TO SURVIVE SUCCESSFULLY?
My guess is that humans create a nuclear situation killing half or more, and that SUPER AI picks up the slack in making the world run without human labour.
Sometimes fate is simple.
Platform: youtube · Topic: AI Governance · Posted: 2025-09-04T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzwD1aXsynbOl4drKF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxBf3Y969epF6y_rqJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzqnN-H8NvBGcUD3GF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz9kVb1cWqIVp4q-tN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx9GoNcas6GlNjV1pV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwxRNPNBS_Hw4Q9hON4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzaBknrxIfVAZU66Nh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz9oMy7JFH4ITy4MER4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz-I9t6RNQOeqyjAM54AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyLaQanFSAJ51UOlKd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
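The raw response is a JSON array with one object per comment ID, each carrying the four coding dimensions from the table above. A minimal sketch of parsing and validating such output before storing it; the allowed label sets below are inferred from this sample and the Coding Result table, so the real codebook may define additional categories:

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (assumption: the actual codebook may include more labels).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "government",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with unknown labels."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"mixed",'
       '"policy":"none","emotion":"approval"}]')
rows = validate_codings(raw)
print(len(rows))  # 1
```

Rejecting unknown labels at ingest time keeps hallucinated categories out of the coded dataset rather than surfacing them later during analysis.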