Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples
- AI generated code is destroying open source. It’s a copy machine there too. Some… (ytc_UgxBN7J2q…)
- I don't know why the media is all over this guy. He's just peddling the tech-cen… (ytc_UgwEK0HM8…)
- The skin needs to be slightly translucent to let light reflect beneath it, and t… (ytc_UgySCc0kC…)
- Eliezer Yudkowsky does not really understand what he is talking about, because h… (ytc_UgyMkp_eH…)
- scariest thing is that AI thinks humans are terrible and need to be eliminated, … (ytc_UgwiWZLBj…)
- "as a way to learn HOW to think" thats the tragedy of Ai, it inhibit mental grow… (ytc_UgxR5iFVK…)
- Was asked what does Sophia mean and the robot was more concerned about how we sp… (ytc_UgwjQLfDb…)
- 02:10 🇪🇺 The EU proposed the AI Act to regulate AI for societal benefit and to p… (ytc_Ugw65SpPC…)
Comment
Context engineering is stupid.
99.9% of the time at your job, you're not building something isolated, you're not working on greenfield.
If you're adding a feature, or fixing a bug in a codebase millions of lines large, with dozens of microservices, to explain the context properly to teh AI will take WAY more time than you doing the job yourself.
youtube · AI Jobs · 2026-03-25T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
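
Each dimension takes one of a small set of categorical labels. As a minimal sketch, the check below validates a coding against the label sets visible on this page; the allowed values are inferred only from these samples and the real codebook may define more categories.

```python
# Category sets observed in the sample output on this page; the actual
# codebook may include additional labels (these sets are an assumption).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "user", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"outrage", "indifference", "approval"},
}

def validate_coding(coding: dict) -> list[str]:
    """Return a list of problems with one coded comment (empty if it looks valid)."""
    problems = []
    for dimension, allowed_values in ALLOWED.items():
        value = coding.get(dimension)
        if value is None:
            problems.append(f"missing dimension: {dimension}")
        elif value not in allowed_values:
            problems.append(f"unexpected {dimension} value: {value!r}")
    return problems
```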
Raw LLM Response
[
{"id":"ytc_UgwFmcHhxgBSEMnUAMx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwqTRFub-fw8u-J9hB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw_ThSO_rGHcKiZAid4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzw8ZeUsr6LYsOuN9R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgycbXa3rKDyjp0PLft4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzYLyowJg65dFG6dHl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyeegU1orFw9Q_B8dd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzuabj2vYZ6rGLoCoZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSIvHEeT0fSiiT7TV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwzH8MOrutAsTfnw8p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
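
The raw response is a JSON array with one object per comment, keyed by comment ID, so looking a comment up amounts to parsing the array and matching on `id`. The sketch below assumes that format; the file name `raw_llm_response.json` and the helper `lookup_coding` are illustrative, not part of the tool.

```python
import json

def lookup_coding(raw_response: str, comment_id: str) -> dict | None:
    """Return the coding for one comment from a raw LLM response.

    Expects a JSON array of objects, each carrying an "id" plus the coded
    dimensions (responsibility, reasoning, policy, emotion).
    """
    codings = json.loads(raw_response)
    for entry in codings:
        if entry.get("id") == comment_id:
            return entry
    return None  # ID not present in this batch

# Example: find the coding for the comment shown above.
# (The path is hypothetical; it stands in for a saved copy of the raw response.)
with open("raw_llm_response.json", encoding="utf-8") as f:
    print(lookup_coding(f.read(), "ytc_UgwFmcHhxgBSEMnUAMx4AaABAg"))
```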