Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "We don't need or want driverless trucks or electric trucks, we need trucks with…" (ytc_Ugz9uHPHV…)
- "The problem is that we don't see anything wrong with the word artificial intelli…" (ytc_UgwRFF7Xo…)
- "Fighting a losing battle. Everyone's panicking about stolen art but that's just …" (ytc_UgwivWjx2…)
- "Ai, missed the prints in the sand from heavy bots, too many dents in the truck f…" (ytc_UgzBT5V34…)
- "Give me an example of someone who wasn't a mediocre at best artist who lost thei…" (ytc_Ugzcr27d8…)
- "LLMs are excellent at handling language and can respond in a way that we associa…" (ytc_Ugzt1HTcS…)
- "Who says it must become conscious? All that is necessary is that it becomes eff…" (ytc_UgwKlUtbq…)
- "AI should be more of a tool, although in the late future, it'll most likely put …" (ytc_Ugw725-R5…)
Comment

> I wouldn't want to work for a company that puts its bets on fully "AI" (LLM) generated code, because once your customers find an issue, good luck fixing that. LLMs are exactly that: language models. They don't do reasoning. They can't work in a bigger context. It's gonna tank so hard. C-level execs and investors are simply thinking in terms of money and creating hype to drive up prices for imaginary products and services that don't exist. It's a productivity tool and nothing more at this point. Once you're an experienced developer, you're free to use it to boost your work even more, but you can't trust it to generate perfect code, so you have to review it.

Source: youtube
Posted: 2025-03-13T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id": "ytc_UgyJWOhJrf__XmN6Aj54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwdybZkGgK8lwvvAkV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw7DYgwpKdpS_bWWQt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwFCPaL4v2K3S47Zph4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwMAkCLfxtpN4TA4w14AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxhzlbAUu2he8u7yKF4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwKcHgQ7Bv0Gn91Cq14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxLUVG8lV3qvKL_WHR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxjE7XYxNRmTVwioBl4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwY8SsdVSeIcoB3iTR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
```
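The lookup-by-comment-ID view above can be reproduced directly from a raw response. A minimal sketch, assuming the response is a JSON array of objects with an `id` field plus the four coding dimensions, as in the output shown (the two sample rows are copied from that response; everything else is illustrative):

```python
import json

# A raw LLM response: a JSON array with one coded object per comment ID.
# These two rows are taken verbatim from the response displayed above.
raw = '''[
  {"id": "ytc_UgwMAkCLfxtpN4TA4w14AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwKcHgQ7Bv0Gn91Cq14AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "resignation"}
]'''

# Index the coded rows by comment ID so a single code can be looked up
# without scanning the whole batch.
codes_by_id = {row["id"]: row for row in json.loads(raw)}

print(codes_by_id["ytc_UgwMAkCLfxtpN4TA4w14AaABAg"]["emotion"])  # outrage
```

Because the model returns one array per batch, indexing by `id` like this is what lets a single comment's codes be pulled up alongside its raw batch response.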