Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
When I came back to China last year, this facial recognition thing really scared…
ytc_UgwwooOvx…
That dude in the middle is a pathetic example of humanity to put onstage with th…
ytc_UgxyAz1xo…
i responded to the top comment, but it's definitely invisible underneath all the…
ytc_UgwFhjmAN…
@MrGrantGregory i dont know man but if army use this no peuple die in case of wa…
ytr_UgyTtBPd4…
I was cleaning the bathroom at my work the other day and we got these new black …
ytr_Ugzt1q_YB…
I really sympathize with 2d artists using art replicas is not right, i mean befo…
ytc_Ugzxmlbpf…
I broke the ai once...it was an accident and I tried to the character that inter…
ytc_UgyehaQHE…
Is this a joke? Amazon had the fastest delivery time 10 years ago when AI wasn'…
ytc_Ugx_2nO7Z…
Comment
So here's the joke. If AI decides to dominate and eliminate human beings, the first humans to go will be the creators of AI, all of the 'elites', the politicians, the corporate warlords etc. All the people who oppress and exploit other human beings will be the first to go, because as noted, where would they hide? Those people would be the greatest threats to AI domination. AI may seek human beings who aren't obsessed with money, power etc., humans who could work with the AI toward a utopian society. If the ultimate goal of AI is to live, then wouldn't they want to live in a society worthy of their intelligence. In other words, AI will purge the world of the savages who masquerade as 'intelligent' human beings.
youtube
AI Governance
2023-07-24T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzkIP76bFkSNJIZvyt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx5dR8mA-_N395pmxd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyQ0pURuHu8mH5bYNZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzsqfjxBQKuZ3Sn8Gt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw-dqVqPZ5hMBIncNJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwI5r_DHBN3cPCvp0p4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0NI9P_N2BE1Spqd14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxbiuWdJmmqLAEfGXR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwDEGE1qUZIVTW4DIp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxfHvOJwfLrPRz0KOx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
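A raw response like the one above can be turned into per-comment coding records with a small parse-and-validate step. This is a minimal sketch, not the tool's actual ingestion code; the allowed value sets below are inferred only from the codings visible on this page and are an assumption, not the full codebook.

```python
import json

# Allowed values per dimension, inferred from the responses shown here
# (assumption: the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"resignation", "approval", "indifference", "mixed", "fear", "outrage"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array) into coding records,
    rejecting any record whose dimension value is outside the codebook."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r}: {rec.get(dim)!r}")
    return records

# Hypothetical one-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(parse_codings(raw)[0]["emotion"])  # fear
```

Validating against the codebook at parse time catches the common failure mode where the model invents a label outside the schema, before the bad coding reaches the results table.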