Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "We're at a point now that the ai platforms are training based on their own, NEW,…" (ytc_UgyZ7w9zl…)
- "There was a self driving car without a driver outside of Ghirardelli Square. Tra…" (ytc_UgyDTzAgH…)
- "He says this but uses an AI version of Alan Watts in his podcast to sell nicotin…" (ytc_UgwLnh9hv…)
- "It's honestly what I hoped would happen. With proper competition, AI should now …" (rdc_m9hegpt)
- "@angxls_real so, why not return to hand made physical 2d? And remove all the dig…" (ytr_UgyHzzw_U…)
- "UBI Will not happen long term. It will only happen until the corporations and th…" (ytc_Ugx6h1wRo…)
- "I'm less concerned about AI taking over like the matrix or terminator....my prim…" (ytc_Ugw9PE4CZ…)
- "AI can't survive on its own. It only exists and persists because of humans. If …" (ytc_UgxecW9VC…)
Comment

> So engineering sense, domain expertise, and deep understanding is still useful? Who knew. I use LLMs quite a lot now but my workflow has eventually evolved into mostly just automating the actual writing code, summarising code execution paths for me, and automating the writing of implementation plans. Every time I left it to its own devices it just goes off the rails. I'm still doing all the reasoning, reading, designing, and learning of the tech stack I've chosen, it just lets me evaluate the effects my decisions have much faster.

youtube · AI Jobs · 2026-04-16T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugyxv4NWlvR1u3hen9F4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxZScjnF43IFzlyBq54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx5YVnaTqSlBIB5Wc94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwdRlqSfCyxxA9B0xZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxKqCA0qhkcOgRLlkp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxRN8olBbRN8fpAcnJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzWxoxMdbOrrRC5XTJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyAV5sYPIaDAAZhn3p4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugy622Tf-IH_uVY9sy94AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwwmAMi33Yd3BiHD414AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
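A batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, not the tool's actual pipeline: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown, but the allowed value sets are only those observed here; the real codebook may permit more.

```python
import json

# Value sets observed in the sample response above (assumption: the
# actual codebook may define additional values per dimension).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "approval", "resignation", "indifference", "mixed"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw model response and verify every record's fields."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing 'id'")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

# One record from the response above, used as a smoke test.
raw = ('[{"id":"ytc_Ugy622Tf-IH_uVY9sy94AaABAg","responsibility":"user",'
       '"reasoning":"virtue","policy":"industry_self","emotion":"indifference"}]')
batch = parse_coded_batch(raw)
print(batch[0]["policy"])  # industry_self
```

Validating at parse time catches the common failure mode where the model invents an off-codebook label; rejecting the whole record (rather than silently coercing it to `unclear`) keeps the coded dataset auditable.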