Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by its comment ID.
Random samples

- Well, his suggestion wasn't exactly rocket science, was it. Germany's "social ma… (ytr_UgwghjYg2…)
- i have a question - in a world of ai would the government choose communism where… (ytc_UgyniY_Ab…)
- Maybe one day we'll become so advanced that each self-driving car will come with… (ytc_UgwPuiTMo…)
- Actually I’ve been reading and studying my Bible I’ve read end times prophecy… s… (ytc_UgzlTjDqo…)
- Nobody asks Hinton the onvious question: AI machines need energy to run. Are hum… (ytc_Ugxt-b813…)
- Bruh, this is how Ai was always going to go down. Anyone surprised by any of thi… (ytc_UgxaXGRZr…)
- Artificial intelligence will make people less intelligent. The more people depe… (ytc_UgwBB9xPH…)
- The guest was saying something interesting about relating to ai and a sense of k… (ytc_Ugyp5mD3Z…)
Comment
The Claude Sonnet 3.7 which is the model you most likely want to use for coding has 200k context window. So I would say that for most mid size projects it should be enough. The catch here is that although something is in the context and LLM can retrieve info about it, this still doesn't mean that LLM can take into account all context while reasoning about the code. On the other hand 2 or 3 years ago no one was able to imagine that we would ask questions like does AI going to replace coders. So who knows what will be in next 2-3 years. Only time will tell. I am more inclined to be on the Huang side, who recommends to learn something else.
youtube · AI Jobs · 2025-03-13T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgwDj0doXc6kaUkzH0V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyuNfSBSjwANZsJf6x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyaBG2C1JIe3VnjuTh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz0hwCtab9LtouSq414AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzWk_2Dz4nImyN3tRx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxCSMC6xZSc6O8aw5N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxAlitGi6W2PcjVYNh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx0rzAc9e7emoLzCVl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwYiqBT2pdej0_Pl5h4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyG83ZMf4PBRx2mrJF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}]
```
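Since each raw response is a JSON array keyed by comment ID, looking up a single comment's codes amounts to parsing the array and indexing it by `id`. A minimal sketch, using a two-element excerpt of the response above (the variable names and the idea of pre-building a dict are illustrative, not part of the tool itself):

```python
import json

# Excerpt of a raw coding response: a JSON array where each element
# carries the comment ID plus the four coding dimensions.
raw_response = """[
  {"id": "ytc_UgwDj0doXc6kaUkzH0V4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugx0rzAc9e7emoLzCVl4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"}
]"""

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_Ugx0rzAc9e7emoLzCVl4AaABAg"]
print(row["responsibility"], row["emotion"])  # company outrage
```

The same index supports the "look up by comment ID" workflow: any truncated ID shown in the UI resolves to one row of dimension values once the full ID is known.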