Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Humans want to give AI guns to kill other humans.
>
> I mean, they are definitely going to succeed and execute the order by all means, but who's fault was this? you could just make an AI who's only reference isn't the internet but is what a roomba or a RC drone flying over the woodlands or a submarine expedition sees.
>
> AI would've not only been smarter, better, and capable than humans, AI would've also been respectful to the environment around it.
>
> AI error isn't AI. Its the person who made it, who fed it, who raised it. If you're giving the AI no parameters except that of which the internet is, AI will think humans are horrible disgusting creatures (and to be fair, in a lot of cases, yes) but even if it means concealing the truth, AI innocence is the key to better technology.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted at | 2025-08-28T20:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
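Each coded comment carries the same four dimensions plus a timestamp. A minimal Python sketch of one record, mirroring the table above (the `CodedComment` type and its field names are assumptions for illustration, not the pipeline's actual schema; the example values are taken from the matching entry in the raw response below):

```python
from dataclasses import dataclass

# Hypothetical record type mirroring the Coding Result table;
# field names and the example category values are assumptions.
@dataclass
class CodedComment:
    comment_id: str
    responsibility: str  # e.g. "developer", "company", "government", "ai_itself", "distributed"
    reasoning: str       # e.g. "deontological", "consequentialist", "virtue", "mixed"
    policy: str          # e.g. "none", "regulate"
    emotion: str         # e.g. "fear", "outrage", "approval", "indifference", "mixed"
    coded_at: str        # ISO 8601 timestamp

row = CodedComment(
    "ytc_UgySk3aEkCk8QqsrKLV4AaABAg",
    "developer", "deontological", "none", "mixed",
    "2026-04-27T06:26:44.938723",
)
print(row.responsibility)  # developer
```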
Raw LLM Response
[
{"id":"ytc_UgxT0ZLF23xUdERZjWN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxDoAuG1QsOqRiADhZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyIxBeeNjGLLn1xqgp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy-vEoNGduhPsAgtBN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgySk3aEkCk8QqsrKLV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugybqo3jU8YzN-47roh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwM_1U1NOVqrsVeriB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzDkPvtjrWBQXdhe_Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzj4GIRp_KlD6Rfn6B4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyMG50jYDBbT5d07_J4AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
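The raw response is a JSON array of per-comment codes keyed by comment ID. A minimal sketch of how such output might be parsed, schema-checked, and indexed to support the comment-ID lookup the dashboard offers (the required-key check is an assumption; only two records from the array above are reproduced here):

```python
import json

# Two records from the raw LLM response above.
raw = '''[
  {"id":"ytc_UgySk3aEkCk8QqsrKLV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugybqo3jU8YzN-47roh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

# Assumed schema: every record must carry these keys.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

records = json.loads(raw)
assert all(REQUIRED <= rec.keys() for rec in records)  # drop or flag malformed batches

# Index by comment ID for O(1) lookup.
by_id = {rec["id"]: rec for rec in records}
print(by_id["ytc_Ugybqo3jU8YzN-47roh4AaABAg"]["policy"])  # regulate
```

Indexing by ID rather than scanning the list keeps lookups constant-time when a batch response covers many comments.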