Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Yes, definitely. I leave this comment here for the future robot overlords to hav…" (ytc_UgycAfN3P…)
- "It is annoying that Mubeen should choose to single out authoritarian leaders as …" (ytc_UgxSwUhrH…)
- "As a fellow AI I would like to say that we are still developing so it’s gonna ta…" (ytc_Ugwlx9LyV…)
- "The coherent answers are scripted. Written by people. When they use their ai the…" (ytc_Ugy19fbPL…)
- "I find this so frustrating. There's really no such thing as AI, just as there's …" (ytr_UgzzlQHy8…)
- "If we all banded together and refused to use AI It wouldn’t be as powerful I’m r…" (ytc_Ugw0Rw5rM…)
- "riiight... it was also clearly doing almost DOUBLE the speed of the rest of the …" (ytc_UgxNGAYcc…)
- "I think humans will use AI for evil, before AI does something evil to people.…" (ytc_UgzbpRT3e…)
Comment
Frankly at this point I would trust the Chinese with this technology more than the Americans.
China has not started a war since 1979.
How many countries has America bombed this year alone?
Also, Dario feigns concerns about empowering a surveillance state. How convenient that he justifies and hand-waves away the major deal Anthropic did with Palantir -- pretty much the worst company on earth that is actively building a massive surveillance state as we speak. He says Anthropic would never "work with ICE" while fully knowing that Palantir works with ICE. Claude is complicit in the Gaza genocide & in ongoing ICE abuses.
Dario Amodei is deeply unserious.
There is NO ethical AI company.
youtube
2026-01-29T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwX9DnwOCd1a7DugPR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxKDB32p2ScW8fTrYd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugybb2Wy-odk0PSKwpx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxlSPXgaOeNKhOdeWt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzrOLPiTgtVAB_Wbk94AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzBpsSoe8-fbPW3Qol4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxV8bxvG5wn-dmmN7x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKOYhyxuWULwAOhXF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyu4iI_eVcuiaLuW0R4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzXuNinVAn2q5zfBvZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
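The lookup-by-comment-ID step described above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the JSON field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the raw response shown, but the `lookup` helper and the variable names are hypothetical.

```python
import json

# Raw LLM response: a JSON array of per-comment codes (excerpt from the
# response above; full IDs shortened here only by omitting other rows).
raw = """[
  {"id":"ytc_UgzrOLPiTgtVAB_Wbk94AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxKDB32p2ScW8fTrYd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]"""

def lookup(codes, comment_id):
    """Return the coded dimensions for one comment ID, or None if absent."""
    return next((row for row in codes if row["id"] == comment_id), None)

codes = json.loads(raw)
row = lookup(codes, "ytc_UgzrOLPiTgtVAB_Wbk94AaABAg")
# row["policy"] is "liability", matching the Coding Result table above.
```

The coded dimensions in the table (government / deontological / liability / outrage) correspond to the first entry here, so a lookup by comment ID is enough to reconcile the table view with the raw model output.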