Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytc_UgyTTsOfF…: Also when you post AI images AI learns from it's self... Also any mistake then w…
- ytc_UgxA6n1X7…: To all of you saying please and thank you to AI because you are afraid of what t…
- ytc_Ugy5WQfhp…: Ai art at the level where you can't tell if a human made it or not is perfectly …
- ytc_UgyXSOi7C…: Your example is flawed. Don't tell me that because AI can now do customer servic…
- ytc_UgxmQppx4…: "Luddite’s was a term used to describe textile workers in England at the beginni…
- ytc_UgwX1uJPw…: Amazing statistic only one Waymo? 1000 cars an hour go through red lights illega…
- ytr_Ugx5M2XSS…: If you’re a ChatGPT user then you’re not smart and you suck at research 😂😂…
- rdc_lzagxkl: I'm very concerned, too. It's not just the AI girlfriend; I see issues with digi…
Comment
Actually it makes sense that context engineering is the optimal way to use LLMs for coding: specifying and designing the right software for the job are the two most critical steps in the process of writing software. These are the hotspots an LLM can't help with; it can only suggest the most generic code that reflects the level of abstraction it manages to deduce/infer from the data in your prompt, and if it can't deduce right, it "hallucinates". A S.M.A.R.T. set of contexts to structure the LLM agent coding process sounds all right...
Only experienced programmers and software engineers possess this acquired ability to think in terms of technical abstraction; they know by experience what is doable, interesting, or damn crazy when it comes to coding. And you become an experienced coder by... coding, so you can shed a bit of your ignorance before you start becoming a productive coder.
youtube
AI Jobs
2026-02-16T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugxyhx3fvlybv6u8Ejl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugzqb1F9GuqzqiNyIxZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugz5_EZKGyZdpQb8_EN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwjB8EpPS8He733kwl4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxHMX9Tosz6oYZ-2wV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzylKRYTSm9jA_Dy3Z4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyk1Ie-SDSCIDO-_TB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzqmCR-4Ywp4frWKqJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwM-XBY5eyEywAD1kN4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgygwQ8ZG_C70IIk_n94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
```
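A raw response in this shape can be parsed and indexed by comment ID for the lookup described above. The sketch below is illustrative, not the tool's actual code: the field names come from the response shown here, but the `index_by_id` helper and the required-key check are assumptions about how one might validate such a batch.

```python
import json

# Two records copied verbatim from the raw LLM response above;
# the full batch would contain all ten.
RAW = """[
  {"id": "ytc_Ugxyhx3fvlybv6u8Ejl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgygwQ8ZG_C70IIk_n94AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]"""

# Keys every coded record should carry, per the response format shown above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse the JSON array and index records by comment ID,
    silently dropping any record missing a required key."""
    records = json.loads(raw)
    return {r["id"]: r for r in records if REQUIRED_KEYS <= r.keys()}

coded = index_by_id(RAW)
print(coded["ytc_UgygwQ8ZG_C70IIk_n94AaABAg"]["emotion"])  # prints "fear"
```

Indexing by ID makes the "look up by comment ID" view a single dictionary access; a real pipeline would likely also reject records whose values fall outside the codebook's allowed categories rather than only checking for missing keys.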