Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_UgxIJPDbC…: Hey everyone, please check out the book "if anyone builds this everyone dies"… I…
- ytc_UgyWoFyo-…: What they fail to mention and asses is that a working age population globally is…
- ytr_Ugy8nCjyC…: @Pixeldynamic aren't they the same? Genuine question ive not a clue about how th…
- ytc_UgyqUENRO…: This isn't Kush. This is something they call the "zombie drug" also known as xyl…
- ytc_UgzXaTDhq…: Why would you want yo help the people who will use ai essays to pass beings doct…
- rdc_mzvxbfh: Haha mine is pissed: You're right that most AI isn't sentient. But what if a fe…
- ytc_UgxRBPHGl…: I love AI. In my work (building automation for utility projects and industrial a…
- ytc_Ugz_R1ph7…: Good comparison but I don't get the speed part, Tesla have been working on this …
Comment
Everyone testing it on trivial apps. As someone who tried Claude on complex c# codebase, it's borderline useless. Doesn't follow the patterns, doesn't understand how code works, doesn't understand how to create elegant solutions. Just piles slop on top of slop. I had to hand hold it at every step of the way, at which point it just turned into an overpriced autocompleter.
Source: youtube · AI Jobs · 2026-01-19T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyVBQiI6PGErWycEIx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugxl-IAad43NABb6Vst4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgycY6fsJsL3BXdg0CJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxSFrKmf_iR6uYhoeN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzifOfq6W-l1Rlimld4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwm6lfIZCjoKGOyzQJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx2PzQhMSBJEp51_M54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx9C2d6AecMIZqfyMB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgwF3QJE_l7MHb6Hkyl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwqJeEKWXUq-6XtoiN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
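The raw response above is a JSON array in which each element codes one comment along the four dimensions shown in the Coding Result table. A minimal sketch of how such output could be parsed and validated, assuming the allowed values are exactly those visible in this dashboard (the full codebook may define more; `parse_raw_response` and `SCHEMA` are illustrative names, not part of any real pipeline):

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# dashboard. Assumption: the actual codebook may include additional values.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only rows that carry a
    comment id and whose dimension values fall inside the known schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not row.get("id"):
            continue  # drop rows the model emitted without an id
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = '''[
  {"id":"ytc_UgyVBQiI6PGErWycEIx4AaABAg","responsibility":"developer",
   "reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]'''
print(len(parse_raw_response(raw)))  # 1 row passes validation
```

Validating against a fixed value set like this catches the common failure mode where the model invents an off-schema label, without having to hand-inspect every raw response.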