Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Now's the real problem: imagine an AI say "screw you and your instructions, I'll…
ytc_UgwllwvHz…
Bro, I bet that comment was ai, there’s no way someone who depends on ai to be a…
ytc_Ugzne6SWH…
In summary, Stephen Bartlett and all time wasting long form interviews are numbe…
ytc_UgySB-42p…
Those are people's jobs. That's not funny.
What about the problem with AI and …
ytc_UgyF55nRa…
The issue of AI depends on the social hierarchy in which you move and also on th…
ytc_UgxsTJcLF…
25% of the energy will be used by AI systems by 2030? That is insane, there goes…
ytc_UgzPp_Awt…
A company that makes an AI releases a non-peer-reviewed study that "proves" that…
ytc_UgzEY0yU1…
A super intelligent AI will be able to invent energy sources for itself that we …
ytc_UgzMVGmKY…
Comment
I think a big and easy way to stop them, is to ignore them altogether. Cuz if they don't have an enemy to defeat they start losing their main foundation/goal. Instead, we should educate anyone we know to stop using them too (to not watch any AI slop in the net or download some weird AI prompters).
But it would be cool tho, to have a robot companion who can guide and help you, not all this shi+ going on.
youtube
Viral AI Reaction
2025-11-11T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzlGizd5LZh6XWCuvZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzHooFt1tlh3dS1DUF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw48-AjfxCjLfL_5CV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgznphEdlfAg6e5O0Ql4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyZVEefvdisa-RB3Jx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy7PmZ9CY55Z3WBaVx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgygUVN2-o0q9qhcMYx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz8abwb5XmJ-kCaw5V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyMwhbiVjPWsYs2j_F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxbFOizsr4TW-OfjcR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
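The raw response is a JSON array with one coding object per comment, so it can be indexed by `id` to support the look-up-by-comment-ID workflow above. A minimal sketch, using two entries copied from the response (field names as they appear there):

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment
# (truncated here to two entries from the response above).
raw_response = '''
[
 {"id":"ytc_UgzlGizd5LZh6XWCuvZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgzHooFt1tlh3dS1DUF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
'''

# Index the codes by comment ID for constant-time look-up.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Fetch the coding result for a single comment.
code = codes_by_id["ytc_UgzlGizd5LZh6XWCuvZ4AaABAg"]
print(code["emotion"])  # resignation
```

The same dictionary can back the "Coding Result" card: each key in a coding object maps directly to a row of the dimension/value table.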