Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I’m a seasoned software engineer with over 20 years of experience, having worked…" (ytc_UgygHuWJd…)
- "2024: eh we will be fine / 2034: *shoots robot* THINK TWICE BUDDY / Ok I know it’s…" (ytc_UgwlEm4z-…)
- "There is probably a lot of truth in the story. Apparently, a service will emerge…" (ytc_UgytbmE4b…)
- "Considering the parameters for chatGPT was blurred out I am likely to assume you…" (ytc_UgzrVizHd…)
- "To be fair. I use like 5 different LLMs and all of them will claim to be from Op…" (rdc_kcp3868)
- "The question that the jury will ultimately answer is not whether this is fair us…" (ytc_UgxMKa3Sy…)
- "The rich are going to be the ones buying stuff we are entering the age of techno…" (ytr_UgxT5qHmC…)
- "Thank you for your comment. If you have any questions or need clarification abou…" (ytr_UgyxgYgi6…)
Comment
Back when the French Revolution happened, even having the very best equipment money could buy wouldn't let you beat even 5 people in battle (assuming civil war in the streets, not battlefields), but now military hardware would let a few thousand crush tens of millions equipped only with simple pistols/rifles. And if we can get AI to control these military resources? Then it will actually become possible for, theoretically, *a single person* to win even against a hundred million revolting civilians.
So, a few thousand of the ultra-rich and top politicians, controlling an AI-operated national army, could actually hold power no matter what the people do. That is kind of my nightmare scenario. In part because, with these people typically being utterly heartless, I can actually see them allowing the vast majority of humanity to starve to death in order to _fix the climate crisis_.
youtube
Viral AI Reaction
2025-12-13T16:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy74BzUiJhZ5w8EIDl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxg2u6uVRzOsoSr6rF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwr_Gq2h8nCEixNMyB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzLkP60sKN-NfuZpbF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyTb_6EoASHDVZgALJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyP_YZmSkpwxFUiCy14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJSVylsO3a_cDpWTB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxRHpK3Vmja6cSku3x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgylBL2ayunyVQuV9ER4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgysrGDRvGZwUeGNfvp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
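The raw response above is a JSON array of per-comment codings, one object per comment ID, which is what makes the "look up by comment ID" view possible. A minimal sketch of that lookup in Python (the two entries are copied from the response above; the helper name `index_by_id` is hypothetical, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codings
# (abridged to two entries from the response above).
raw_response = '''[
  {"id": "ytc_UgzLkP60sKN-NfuZpbF4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyTb_6EoASHDVZgALJ4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "regulate", "emotion": "outrage"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse the model output and index each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw_response)
coding = codings["ytc_UgzLkP60sKN-NfuZpbF4AaABAg"]
print(coding["responsibility"], coding["policy"], coding["emotion"])
# government regulate fear
```

The printed values match the "Coding Result" table above, since that comment ID carries the government/regulate/fear coding in the raw response.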