Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Asserting that there is racial bias in AI implies that it is deliberate. This is…" (ytc_UgzlnP2_Z…)
- "At one point it was believed that as many as 80% of jobs could be lost to AI acc…" (ytc_Ugz7VYjqK…)
- "We're sorry to hear that you found the content disgusting. If you have specific …" (ytr_UgxyDyHF_…)
- "Machines can make music and have been able to do so for a long time but people s…" (ytc_Ugwuyfyme…)
- "Goldman Sachs is now predicting 300 million jobs will be lost to AI. Anyone ca…" (ytc_UgyNQruvl…)
- "As a med student, i can confirm that chat gpt 4 can answer about 90% of the stan…" (ytc_UgwlH0Uxp…)
- "A.I. is just mimicking human beings the good The bad the ugly. and human beings …" (ytc_Ugw2IM5kI…)
- "AI is the Beast system that comes from the Antichrist so he can establish one …" (ytc_Ugw15aj7o…)
Comment
Writing code fast has never been the issue; that's what AI does, and that's what managers don't seem to get.
Architecture, design, ease of maintenance, and many other things are the pillars of software quality. AI does none of that, at least not to a good enough degree, because it has no actual knowledge of what it's doing. It's great at replicating algorithms it has already seen, because that's the only thing it does: predict what comes next.
Programming in natural language has been an idea for many decades, and a very stupid one, because natural language has many ambiguities, so you'll always reach a point where resolving and specifying those ambiguities costs you more than if you had specified the problem in a more formal way.
youtube · AI Jobs · 2026-02-04T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyayATpiD5Nr3GApHB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzYW1C-9iYqEnKdndt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwmbJ0O9YjU69EsjWt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy6UKgs-7Y2OJBghX94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwtAakuk3deHloLdXx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwSbXDiIVchDXgJJWV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwzbRE3WNhEjR49rc14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxtGGCVxgJo7VnYUHF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz6Gul5LeJU3payyOF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyC4uVWY0JPe_zHNdF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
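A minimal sketch of how a batch response like the one above can be parsed and indexed for lookup by comment ID. It assumes only the shape shown here (a JSON array of objects, each with an `id` and the four coding dimensions); the two rows in the sample string are copied from the response above, and any pipeline would substitute the model's actual output.

```python
import json

# Two rows copied from the raw LLM response shown above; in practice this
# string would be the model's full output.
raw_response = """[
  {"id": "ytc_UgyayATpiD5Nr3GApHB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy6UKgs-7Y2OJBghX94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "none", "emotion": "mixed"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

codes = json.loads(raw_response)

# Sanity-check each row before trusting it: every coding dimension present.
for row in codes:
    missing = [d for d in DIMENSIONS if d not in row]
    if missing:
        raise ValueError(f"row {row.get('id')} missing dimensions: {missing}")

# Index by comment ID so a coded comment can be looked up directly.
by_id = {row["id"]: row for row in codes}

row = by_id["ytc_Ugy6UKgs-7Y2OJBghX94AaABAg"]
print(row["reasoning"], row["emotion"])  # deontological mixed
```

Keying on `id` is what lets the tool match each coded row back to the original comment even when the model returns the batch in a different order.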