Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Over years of mentoring software engineers I've noticed that there is a cohort of them that does something akin to advanced pattern matching. They see a process that needs to repeat and they write down the loop syntax that they've memorized, then if there is an off-by-one error they try to subtract 1 from the loop bounds and run it again to see if that fixes it. Sometimes I call it 'guess and check' programming. Contrast this with how I and others I've seen write code. I first build a mental model of the 'machine' that I want my code to be, and only once I'm convinced it will work do I translate that mental imagery into source code. It seems to me that what GPT and other LLMs are doing is much more akin to the first process I described of pattern matching, albeit at a superhuman level. So if that's the type of programmer you are, I might be worried for your job. That isn't to say that the approach, once scaled, couldn't replace us all. I think of chess engines, which until recently didn't have the human skill of intuition; they were simply doing what human players call "calculation" at a very superhuman level, and that was enough to beat even the best chess players.
youtube AI Jobs 2024-01-14T20:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgxqRvqM2vDPvOxSNuR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwcPYm0srN2_on4Arp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz1Rwm3dZ82Z3ABhLJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxyAomzqeN-i047UvN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyQIfEEvoPB58IC6_Z4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyP61WQ0lj2AreoGkl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxgZcxOchwQivOumFF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzkGFf3nO6FVcWLxAN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy6M56vkxxcV_b9h3x4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwVCEtU0pmVNLtPs1x4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
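A raw response like the one above can be checked before the coded values are trusted. Below is a minimal sketch of such a validation step; the allowed-value sets are assumptions inferred from the values that appear in this record, not a confirmed schema of the coding tool.

```python
import json

# Assumed allowed values per dimension, inferred from this record only.
ALLOWED = {
    "responsibility": {"none", "developer", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "indifference", "resignation", "mixed"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and return records with unexpected values."""
    records = json.loads(raw)
    bad = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                bad.append(rec)
                break
    return bad

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"indifference"}]')
print(validate(raw))  # → [] (all values fall within the expected sets)
```

Records flagged by `validate` could then be routed back for re-coding rather than silently stored.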