Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its ID.
Random samples

- "And also, not every form of art has to involve drawing. Something like worldbuil…" (ytc_UgxTS4fxv…)
- "i havent seen the section on reference yet, but ai as a reference is SUCH a bad …" (ytc_UgzIWEstt…)
- "@BaconBaron52 bro it's not like it was a battle of races ai makes content based…" (ytr_Ugyoa2EFU…)
- "My thing is the real art LOOKS WORSE than the ai art the composition is worse th…" (ytc_UgxGMijM9…)
- "Even the robot doesn't place his hands on the trigger until he's ready to fire. …" (ytc_Ugzz8enep…)
- "Great. So give artists a little bit of money so we can train AI to replace them.…" (ytc_Ugx75Os3F…)
- "lol, yall think your fighting AI art but you just did the same thing it did. too…" (ytc_UgwDDpV6g…)
- "LLMs are a great help. I don't think we need self-aware super clever AI. It does…" (ytc_UgwGmSx3j…)
Comment
The problem should be obvious: the entire goal of Artificial Intelligence, writ large, is to emulate human intelligence. Humans are too often shortsighted, self-serving, egotistical, tribalistic, and psychopathic.
If you want to teach a computer to emulate human intelligence, the bad of humanity is coming with it. So, of course, it's difficult to achieve alignment with AI, because humans aren't even in alignment with humans!
Source: youtube · AI Harm Incident · 2025-07-27T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzu9bIWv0MDR9ccUgJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy5zseSndz-2F46yAp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx2ZkvEg_BfdV3aIQl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyalSvZp29RIIvr29J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwYdFpPVoAOm2OwaYl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyAIUFE3cePT4mnRUN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwu8YxWUZGSyPxxLDx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzyIejAxU9rbmfJP514AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxG6o8-2BSh-tiNdrl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwR9L3RNSSPZ4onwil4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
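The raw response is one JSON array with one object per comment, so the lookup-by-ID behavior described above can be implemented by parsing the array and indexing it by the `id` field. A minimal sketch, assuming the raw response is available as a string; the names `raw_response` and `codes_by_id` are illustrative, not part of the tool, and only two rows from the response above are reproduced here:

```python
import json

# Raw LLM response: a JSON array of per-comment codes (two rows shown).
raw_response = """
[
  {"id":"ytc_Ugzu9bIWv0MDR9ccUgJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyalSvZp29RIIvr29J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
"""

# Index the coded rows by comment ID so any coded comment can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up the comment shown in the Coding Result table above.
code = codes_by_id["ytc_UgyalSvZp29RIIvr29J4AaABAg"]
print(code["responsibility"], code["emotion"])  # distributed resignation
```

Indexing once into a dict keeps repeated lookups O(1), which matters if the tool serves many inspect requests against the same response.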