Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
If you have a chance you can also check out the documentary [Trophy](http://www.…
rdc_deuomii
From the Article
>The new research, published in the British Journal of Opht…
rdc_ire78at
11:47 is the one modified by ChatGPT. Markings on the microwave aren’t as sharp …
rdc_oi1aq7p
Thank you for your question about the meaning of the name Sophia mentioned in th…
ytr_UgxCCPSpp…
Isn't this the guy who wants to actually put the ai in side us ? 😅😅😅 wake up yal…
ytc_Ugz3PqkvW…
As a parent it's kinda frustrating that my kids don't just begin life with every…
ytc_UgwtNnLXJ…
The world being built isn’t for the masses. It’s being built for the elite. With…
ytc_UgzIRfMcW…
Yeah basically he calls AI a tool to achieve the art, while all he's doing is ju…
ytc_Ugz1ELrM5…
Comment
Yeah, this is fairly off base. Producing more lines of code is not equal to actual production. LLMs are very verbose in the code they write, and every line of code is a liability. Don't stop learning how to code. You have to know what is going on with the code you're shipping, not just how something seems to function. If you don't know how to code, you are going to push out some harmful code, and it is going to fall on your shoulders. LLMs are inherently flawed, and nothing they produce should be trusted. We need actual good engineers. Even good tools in mediocre hands are dangerous.
youtube
Viral AI Reaction
2026-03-06T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwOZexr6ZUSxm1cNDh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzhpYfjwwVx5RODrEZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxhueAHQ7F8snrI5CZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyrzxvNS39C4gLfFPR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxIyUrZDtqIC04tPox4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgycIR4rGlV3wSC_C5l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzobKxIUck3ID-dOkJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgysBWv15LLCHyvJhWp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwYL024bP3j6Drqo5t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy6FwvbA1MCLvfgtaJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
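A response like the one above can be turned back into per-comment records with a small parser. The sketch below is a minimal example, not the tool's actual implementation: the allowed values for each dimension are inferred only from this one sample (the full codebook may define more), and the `parse_coding_response` helper is a hypothetical name.

```python
import json

# Allowed codes per dimension, inferred from this sample only;
# the real codebook may permit additional values.
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"fear", "indifference", "outrage", "approval", "resignation"},
}


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records)
    into a dict keyed by comment ID, failing loudly on malformed
    records or unrecognized codes."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded


# Usage with a one-record response (hypothetical ID for illustration):
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_example"]["emotion"])  # fear
```

Validating against a fixed vocabulary matters here because LLM coders occasionally emit values outside the schema; rejecting those records at parse time is what makes a "Coded at" timestamp trustworthy.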