Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
When Trump allocated 500 billion of tax dollars for AI development, he failed to…
ytc_UgyvEyd1d…
AI cannot really replace human writing, those same companies will come running b…
ytc_Ugz9lVzGv…
Missing is the economic scenario with majority of population out of work, hence …
ytc_UgyY9m8ym…
Thats knowledge not intelligence. Intelligence is brain power. The ability to …
ytc_Ugx0uN65e…
Bro I heard once that an ai as a test was given free will
Mf wasted NO TIME he w…
ytc_Ugyt-THMY…
They were talking big about self-driving cars in the 2010s. Now the poor things …
ytc_Ugzs1vdzd…
I really don’t like what AI represents. I will admit it has potential and genuin…
ytc_UgzOXdLUo…
Once we creat AI it will possibly have the ability to update and improve itself …
ytc_UggSkZsWg…
Comment
this is stupid. i’ve worked across every type of tech company (big tech, faang, startups, quant) and it’s obvious that ai is going to and will eat coding. developers will become architects of systems vs just programmers. the current models are great and are only getting better. even on the google codebase engineers don’t even hand code for most things now, they use ai because it’s good enough for even the most pessimistic engineers to trust after setting up a rules engine and your workflows.
my ego was definitely bruised bc all the skills i spent years learning were useless, but learn how to use ai or be left behind.
youtube
AI Jobs
2026-02-06T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwZ4x1eItE29XAgQAJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgylOl4zXTYPPU-bC394AaABAg","responsibility":"investor","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyST_gIiO9ZiI-SIBl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxoPHErRN9t9BTTJVV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzDCZtZXx3N3ngWN4d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz53rpVk8DjQkuKVKJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxtfv8-4V3S7B1Zqmd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwuUF0tpzqtWhUUQ7h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx5IZy8KwGzHbO69Xd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgySLzq3sZgY_7B43kB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
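The raw response above is a JSON array with one coding record per comment, keyed by `id`. A minimal sketch of the "look up by comment ID" flow, assuming the response is available as a JSON string (the two records here are copied from the array above; the indexing logic itself is an illustration, not the tool's actual implementation):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment
# (two records copied from the response above for illustration).
raw_response = """[
  {"id": "ytc_UgwZ4x1eItE29XAgQAJ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzDCZtZXx3N3ngWN4d4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]"""

# Index records by comment ID so any coded comment can be inspected directly.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

coding = records["ytc_UgzDCZtZXx3N3ngWN4d4AaABAg"]
print(coding["emotion"])  # outrage
```

Building the dict once makes each subsequent ID lookup constant-time, which matters when spot-checking individual comments against a large batch response.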