Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
"Use AI or be unemployable" great, create a product that has no tangible benefit…
ytc_UgwqhNISW…
Art isn't about being exceptional, it's about making art. This 'exceptional peop…
ytr_UgxnXiNYw…
@johnbrown1867well that's usually the problem. You don't see it until it's too …
ytr_UgxFm4vgg…
You can just call me arthur maxson, because id rather destroy the ai than use it…
ytc_UgwTihrgP…
@Bexx74 Eh... I prefer to avoid talking about the "originality" and IP Law a…
ytr_Ugz2kfkgO…
There is evidence from evolutionary psychology that helps explain this. People a…
ytc_UgwK8Wqew…
So why the fk didn’t you think of this in the first place genius? Fk you wizards…
ytc_UgwSzkizD…
One. Million. Percent.
Part of the reason these corporate overlords love AI is…
ytr_UgwN4JXE1…
Comment
Good video, but it feels like it misses the bigger point. Nobody serious thinks today’s LLMs are the path to AGI—they’re just the current step, and of course they have limitations.
But companies are preparing to spend $500B on next-gen AI infrastructure through projects like Stargate. You don’t build something on that scale just to make a better text generator. That kind of investment is about what comes after this paradigm—new architectures, real-world interaction, continual learning, etc.
So when people judge the future of AI only by looking at today’s models… am I missing something? It’s like judging aviation by the Wright Flyer. We’re clearly in the transition phase. The real breakthroughs come next.
youtube
2025-11-17T22:1…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id": "ytc_UgwmqSE39iyLF9Mzemh4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzYAerrrQN1cFGzuv54AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxKQbGcwkQCiN22Ibx4AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw-jbwUikWyu42QQLh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwkmE8elRaDem8h43h4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyAA6Cclk9ioTzhY3l4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzgsddE9q6ecLOmpZF4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzEUX_kG29sX3WlNcF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugz9ByUof-k4ASQ3TYl4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx9X-CkWUq6Km9w8zd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]
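A raw batch response like the one above can be parsed and validated before the codes are stored. A minimal sketch, assuming the allowed values per dimension are only those visible in the responses here (the real codebook may define more categories; `parse_batch` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed values per dimension, inferred from the coded responses shown above.
# Assumption: the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response into {comment_id: codes}, dropping invalid rows."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id", "")
        codes = {dim: row.get(dim) for dim in ALLOWED}
        # Keep the row only if every dimension holds a recognized value.
        if cid and all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded
```

Rows with an unrecognized value in any dimension are silently dropped here; in practice such rows would likely be queued for re-coding rather than discarded.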