Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “The ai system was thoroughly tested before it was released... This was intention…” (ytc_UgxlFoFvm…)
- “Also you should not be encouraging people to go without therapy when they have a…” (ytc_Ugx6YafPz…)
- “Problem isn't using ai tools but using ai entirely to create some by only writin…” (ytr_UgwJ7B0ST…)
- “AI is not a person. That comparison is like saying you didn't drive the nail in…” (ytr_Ugx-cyiJj…)
- “@3dChris We can say "never" if something is truly impossible. An object that isn…” (ytr_Ugws29neE…)
- “To me, the point of being an artist is enjoying the process of honing a skill. …” (ytc_UgzKBWTQ9…)
- “If you understand, how LLM works it is not scary at all. It is total BS. It is L…” (ytc_UgwGDUqhV…)
- “Robots are programmed, and their for have no choices… don’t be dumb when judging…” (ytc_Ugwk3sbMb…)
Comment
Here’s the thing I always wonder: don’t you want to be able to understand your product technically? Don’t you want to be able to have someone who can reason about the ease of adding a new feature, swapping a dependency, updating a dependency, …?
Tech CEOs seem to be suggesting a graph in which they are one node and the LLM they use to generate source code is the only other node.
Each node then becomes a single point of failure.
The end result will be software that is no longer understood by any human, maintained by an AI agent incapable of original thought and bloating up the codebase.
And that, I fear, will be the fate of many domains if we don’t rethink our relationship to AI.
As humans, we need to maintain our ability to think for ourselves and pride ourselves on understanding.
youtube
AI Jobs
2026-01-04T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxIGGSa64q_qtRXWYZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwSaWvBw0q1q_pEm0J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyxUcOZHXRkuHuVrJR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwyVnsxyVmXCfZMHS94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx9IF1y_rU_tdctNfl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxMlu5yzeefoK4sNCl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyvHcSsUwrl656yEcx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwyy3DC-iR6aBrvV9V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxWa_n1e67qGku7ttx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzqNF9Jr4X-CTKDavN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
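As a rough sketch of how such a raw response can be consumed, the coded rows can be parsed and validated before they are indexed by comment ID. The field names and allowed values below are inferred from the examples shown here; the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension (inferred from the sample output above;
# the actual codebook may permit more categories than these).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none"},
    "emotion": {"fear", "indifference", "mixed", "approval"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response and index the rows by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        # Reject any value outside the expected category set.
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"none","emotion":"mixed"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["responsibility"])  # company
```

Validating against a fixed category set at parse time catches the common failure mode where the model invents an out-of-codebook label, so bad rows fail loudly instead of silently skewing the counts.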