Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- @AlexReynard You're not tricking anyone, buddy. Generating AI "art" is just as m… (ytr_UgwYyzj8c…)
- Whatever is done by the human thinking is unfortunately destructive. It has a di… (ytc_UgwpGAsSE…)
- @alex-rs6ts I understand how it can seem hypocritical, but when learning art an… (ytr_Ugz7ZHyk1…)
- If tens of millions of people lose their jobs to AI and robotics then they wont … (ytc_Ugx1JTSP7…)
- That's an interesting perspective! The name Sophia indeed carries a lot of depth… (ytr_UgynuPjM3…)
- As an artist myself, I will sometimes use ai for concepts and potential composit… (ytc_UgxGeJFLX…)
- They'll do whatever they're programmed to do and be whatever they're programmed … (ytc_Ugx_Ubhus…)
- This is more scary than nuclear war.these robots are already built in huge numbe… (ytc_UgxLp9EU4…)
Comment
> LLMs will not replace humans. Even if a company becomes super efficient by employing AI, they may lose to other companies that create innovative solutions that make their efficiently created product obsolete. General AI is a fever dream that requires something completely different than an LLM. so far we have only observed one entity in the universe that achieved economically sustainable GAI - Humans. And don’t forget: curreng LLMs are unsustainable. Chip production will decline soon due to collapsing globalisation. So if they don’t pull out an every day quantum computer that runs at room temp on the cost of a normal computer, don’t expect anything resembling real AI take over your job. Sam Altman just wants to keep the hype alive bc his company (and probably life) depends on it. Sunk cost fallacy. When youre too deep in it to pull out anymore and you just keep going in the hope of finding a solution along the way.
youtube · Viral AI Reaction · 2025-12-14T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
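The four dimensions in the table above can be modeled as a small record type. As a minimal sketch, the allowed value sets below are inferred from the responses visible on this page — they are assumptions, not the authoritative codebook:

```python
from dataclasses import dataclass

# Value sets inferred from the responses shown on this page
# (an assumption, not the official codebook).
RESPONSIBILITY = {"none", "ai_itself", "company", "user", "unclear"}
REASONING = {"consequentialist", "virtue", "unclear"}
POLICY = {"none", "regulate", "unclear"}
EMOTION = {"approval", "fear", "outrage", "indifference"}


@dataclass
class CodingResult:
    """One coded comment: an ID plus the four coding dimensions."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> bool:
        """True if every dimension holds a recognized value."""
        return (
            self.responsibility in RESPONSIBILITY
            and self.reasoning in REASONING
            and self.policy in POLICY
            and self.emotion in EMOTION
        )
```

For example, the row shown above would map to `CodingResult(id=..., responsibility="none", reasoning="consequentialist", policy="none", emotion="approval")`, which validates cleanly.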
Raw LLM Response
```json
[
  {"id":"ytc_UgwlY7Qx6BdmYV9-QDt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxqdjY0BAhSbm4r2PJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwInOO_p6tYVzWnoPF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyOtrbFRidp_Ait7aR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSB_G0qLNrp4mutI94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz5LyQXbaoXxgOr8aV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgztVDbgjYLpy9Y-3tV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzEgbg14TG8-CLW_xx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwVAte6Hr3fya4fEdB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzd91-FYRk6DWe9BoN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
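A raw batch response like the one above is a JSON array keyed by comment ID. A minimal sketch of turning it into a lookup table — the required-keys check is an assumption about what a well-formed record must contain, based only on the records shown here:

```python
import json

# Keys every record in the batch response carries on this page
# (assumed required; not confirmed by an official schema).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_batch(raw: str) -> dict[str, dict]:
    """Parse a raw batch response into a dict keyed by comment ID.

    Raises ValueError if any record is missing a required key, so a
    malformed model output fails loudly instead of silently dropping rows.
    """
    records = json.loads(raw)
    out: dict[str, dict] = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {sorted(missing)}")
        out[rec["id"]] = rec
    return out
```

Keying by ID makes the "look up by comment ID" inspection above a single dictionary access once the batch is parsed.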