Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "We need a law that says AI can only be usaed to put the White Bopping cat head o…" (ytc_UgyXnUWyp…)
- "Dear god, look I’ll admit I’ve used AI art for like chatbots, but are they reall…" (ytc_Ugw5zL3jN…)
- "Elon made AI Grok on X reverse its stance on Israel. Grok said it is a genocide …" (ytc_UgxY39y-J…)
- "How long until AI artists make up their own disability called creatively impaire…" (ytc_UgxwNiHPc…)
- "your just stoping the inevitable , the programers are just gonna make that the …" (ytc_UgzMyHpeV…)
- "EVERY BEING ON THIS PLANET WILL CHOOSE (not ai art) AI CUZ THEY AIN’T SMART…" (ytc_UgyAyYMeC…)
- "Elon Musk disagrees with Sam Altman a lot on ethics of AI so it has to be Sam.…" (ytc_Ugw-rv7F7…)
- "@enigmaticnomadics That's not how any of this works. This is the perfect example…" (ytr_UgxOOVH4G…)
Comment

> The most important thing to remember is that AI, like anything designed by humans, will always be limited by the blind spots of those that created it. Like a colorblind painter, they don’t know if the leaves they paint are green or autumn red until someone else tells them. AI specifically was invented and coded by people so lazy that they made complex computer models to do tasks they considered menial or redundant, while never actually understanding the nuances of those roles they sought to replace. That laziness and in-attentiveness is quite literally “built in.” They believe they can fix everything despite knowing nothing about the things they think they can fix. That arrogance and refusal to put in the effort to actually learn enough to properly address the problem is exactly why AI falls short in the details, it’s just following the example of its creators.

youtube · AI Responsibility · 2025-10-09T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxsE_ZAH41uzg02NhB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwtYkEqIyJO6vVzrQh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwcbqtuzLuvuxddwEh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwbxu6FYkoii9bC9nh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCJt6Uq1HX9fJt2AJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxvukkloriiM6Oq92V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyfJ5HfWl3-PZfpcCt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwyuPDcK1OWGamkiah4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzzdH-JlwhHVISRONJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzVOpsiBNWIZnqaw7N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
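The lookup-by-ID flow described above can be sketched as a minimal parser over this batch format. The field names and IDs below are copied from the raw response shown; the required-field check and the `index_by_id` helper are illustrative assumptions, not part of the actual tool:

```python
import json

# Two records copied verbatim from the raw batch response above.
# Each record codes one comment along four dimensions.
raw = '''[
{"id":"ytc_UgxvukkloriiM6Oq92V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyfJ5HfWl3-PZfpcCt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(payload: str) -> dict:
    """Parse a batch response and index records by comment ID,
    skipping any record that is missing a required dimension."""
    records = json.loads(payload)
    return {
        r["id"]: {d: r[d] for d in DIMENSIONS}
        for r in records
        if all(d in r for d in DIMENSIONS)
    }

coded = index_by_id(raw)
print(coded["ytc_UgxvukkloriiM6Oq92V4AaABAg"]["responsibility"])  # developer
```

Indexing by ID mirrors the "Look up by comment ID" feature: once parsed, each coded comment is retrievable in constant time by its `ytc_…` identifier.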