Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- "I bet AI has a great sense of humor. They will probably call out world war 3 aft…" (ytc_UgyyhncVY…)
- "That's why i prefer to use AI for the photos of my music, and nothing else.🔥…" (ytc_UgxBQjAZD…)
- "I am using chatgpt to write a apocalypse story and some of these have happened.…" (ytc_UgwJ8DdFD…)
- "Ai will def think they are better. They dont have to die or shit. They will see …" (ytc_Ugzk4Z27Q…)
- "this isn‘t really commenting on the real problem. Once AI takes over jobs, there…" (ytc_UgwMif3d5…)
- "You know how to slow it way stop useing it and go back to the old ways a stop th…" (ytc_UgwnYT-HD…)
- "Mor Shure will be like the movie I robot the robot Wil chill people in the futur…" (ytc_UgwQ-LOFn…)
- "She is fixing it. By using better datasets and being aware of algorithmic bias, …" (ytr_UggQG6eUA…)
Comment
I'm rather irked that you chose to go with the worst-possible scenario but fell short of bringing up what should be an obvious question. If your scenario pans out, why should the rich and elite keep the rest of society ALIVE? If AI has such a profound takeover, and this technology spreads worldwide, then won't the majority of humanity be rendered obsolete with the only people worth keeping around the elites who own AI? If such a mass obsolescence happens, who's to say they won't orchestrate a mass extinction event (e.g. AI bio-engineered super virus, mass-produced AI war machines and forever wars, etc.) to eliminate all of the undesirables, and leave Earth as a utopia for the chosen few built on the mass grave of the many? If you think that's crazy, then answer me what reason we have to keep on breathing and taking up resources if there's no use for us anymore?
Source: youtube | Video: Viral AI Reaction | Posted: 2025-11-23T03:3… | ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy9y4BiUG_5C0zq1o14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzlfRdPu5EPk7zvRsN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyDgAY1BXVHyKcs08p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwfyToPlb2mVq7AjRV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxschw2PRQENfX8jOJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwNnetgKpFSgaC8frd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyX04XjIuUNIDZvE7h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzqRAvM0Yew13rxjtt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwwK1W4V51XS_XII0V4AaABAg","responsibility":"elite","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzVSwECQxfzCP8D18F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
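The raw response above is a JSON array with one coding object per comment. A minimal sketch of how such output can be indexed for the per-comment lookup shown in the table (assuming only the format visible in the sample; the surrounding tooling is not specified in this page):

```python
import json

# Raw LLM response, abridged to two rows from the sample above.
# Assumed schema: each object carries a comment id plus the four
# coded dimensions (responsibility, reasoning, policy, emotion).
raw = """[
{"id":"ytc_Ugy9y4BiUG_5C0zq1o14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwwK1W4V51XS_XII0V4AaABAg","responsibility":"elite","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

# Index the codings by comment id so a single comment's result
# (as rendered in the "Coding Result" table) can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw)}

print(codings["ytc_Ugy9y4BiUG_5C0zq1o14AaABAg"]["emotion"])  # outrage
```

Malformed model output (truncated JSON, missing keys) would raise here; a production pipeline would presumably validate each row against the four expected dimensions before storing it.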