Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- Saying that "the AI images required human input and is therefore art and I am an… (ytc_UgwTaagJ5…)
- Testing alternative paths via decommission vs killing someone will only train th… (ytc_UgwtrUH9l…)
- All artists practice to get good so y’all doing ai, get good and put in the work… (ytc_Ugw9fid_2…)
- We wont. Not with the models being developed at the moment. Various researches h… (ytc_UgzYNXmT-…)
- I think context in what the prefer is important. Statistically white men have mo… (ytc_Ugx3GuniN…)
- “As an AI language model I will not condone hate speech in anyway especially dea… (ytr_Ugwzq8Y8h…)
- The movies about Ai were all made to forshadow the future, so there is no stoppi… (ytr_Ugx6zN6S0…)
- That's stable diffusion which is pretty primitive compared to DALL-E 3. It will … (ytc_UgxgRPkOb…)
Comment
Yeah, now start trying to comprehend a large codebase that's been "engineered" by 10 developers using AI. Now you are developer 11 tasked with creating a new feature or even worse: fix a bug. The only way to make sense of this complete mudball of a codebase is to ask another LLM to do it for you. It's compounding the mess and it only gets worse. Up to a point no sane person in the world will ever be able to fully make sense of your codebase, making it not even qualify the term "software" anymore. Remember that the SOFT in software was supposed to mean it's easy to change.
youtube · AI Jobs · 2026-03-18T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgwUfs2JVQXz6_eT_Dp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxnhQ1jqNHVzHXa_yp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxYKJpZhZA2NyDJs654AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwEdGljEssG3ptBBYx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"fear"},
{"id":"ytc_Ugzd06y2PTExOj9Apkt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw3ZONu-InUDq2zwzR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzE7Ylbb21u5lXRSJZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzjPQiqs_2VQE8RhkV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzzCaYUb0FIX0vefGx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxJBNx58SI-2Qemwe14AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"}]
```
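The raw LLM response is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of the "look up by comment ID" step, assuming only the schema visible in the response above (the `index_by_comment_id` helper name is hypothetical, and the two records are copied from the response for illustration):

```python
import json

# Two records copied verbatim from the raw LLM response above;
# a real batch would contain one object per coded comment.
raw_response = '''[
 {"id": "ytc_UgzE7Ylbb21u5lXRSJZ4AaABAg", "responsibility": "distributed",
  "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
 {"id": "ytc_UgwUfs2JVQXz6_eT_Dp4AaABAg", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]'''

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and index each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgzE7Ylbb21u5lXRSJZ4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # distributed fear
```

Indexing by ID also makes it easy to cross-check a record against the rendered Coding Result table for the same comment.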