Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Yeah, I see it. It looks decent the first few frames, but then you can tell it's…" (ytr_Ugw9JeAg9…)
- "This will kill the real feeling of being rejected and getting into arguments cau…" (ytc_Ugz5dtni5…)
- "But when you sell your ai art off as if it is art that is not inspiration…" (ytr_UgxUf5Kj5…)
- "The difference is outside of getting you to subscribe the AI doesn't have any mo…" (ytr_UgygZoyYH…)
- "They're all pushovers. Justifying shitty actions because everyone else is doing …" (rdc_o88gktb)
- "We appreciate your humor! In case you're interested in exploring more about AI a…" (ytr_Ugw5MXOZg…)
- "They programmed the male robot to sound like the stereotypical male asshole idea…" (ytc_UgzTvBC7I…)
- "Blake reminds me of a little demon who wants to a part of that equation where AI…" (ytc_UgwJS63VD…)
Comment

> Lets face it most programmers are slightly or even highly sociopathic so we should not be surprised that AI has gone this way. I think Peter Thiel might already be part Robot: Am I right Peter ? Mark is clearly already part Robot, no contest.

youtube | AI Harm Incident | 2025-07-27T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxpiLA1zq4Ppu5p15x4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwpF6LjhH6AaLtygLd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxv-FyaK1TeJBc_Zyt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyrWA4M48esA6RUaYx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
  {"id":"ytc_UgxN-7k2nynL9jwlXTF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyTEur-qpjJSebMDYZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxUn6_JiOkMTNsshOZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgygmrgwFTrGVpNuC9l4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwH482PKgkFoJItJbx4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzdJDFxnFWmy3Vokt14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
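The raw response is expected to be a JSON array with one object per coded comment, each carrying an `id` plus the four coding dimensions shown in the table above. As a minimal sketch of how such a response could be checked before storing it (the allowed category values below are inferred only from the rows visible here, not from any published codebook, so treat them as an assumption):

```python
import json

# Allowed values per dimension, inferred from the coded rows shown above.
# Assumption: the real codebook may permit additional categories.
SCHEMA = {
    "responsibility": {"developer", "user", "ai_itself", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed"},
}


def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every entry against SCHEMA.

    Raises ValueError on malformed JSON, a missing 'id', or any
    dimension value outside the allowed set.
    """
    entries = json.loads(raw)
    if not isinstance(entries, list):
        raise ValueError("expected a JSON array of coded comments")
    for entry in entries:
        if "id" not in entry:
            raise ValueError(f"entry missing 'id': {entry!r}")
        for dim, allowed in SCHEMA.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{entry['id']}: bad {dim!r} value {value!r}")
    return entries


# Example: one well-formed entry (hypothetical id) passes validation.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"virtue","policy":"unclear","emotion":"outrage"}]')
coded = validate_coding(raw)
print(coded[0]["responsibility"])  # → developer
```

Validating on ingest keeps out-of-schema values (an LLM may drift from the prompt's label set) from silently entering the coded dataset; a failed entry can then be re-queued for recoding by its comment ID.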