Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
No, it's *worse than that.* It's not only because Copilot was trained on poor code. It's also because *Copilot has no fucking idea what you're trying to optimise*, and *average is not what you want.* The average of a good solution to problem A and a good solution to problem B is generally *not* a good solution to problem (A+B)/2. Averages are *compromises*, and compromises are generally *bad* in algorithms. Failing to distinguish the *result* on one hand from the *method to get the result* on the other hand is a typical *plain noob error.* You fundamentally *cannot write good code by language imitation* in general, because the test functions you need to solve your problems in a given language are *never ever in the AI's training data!* So such an AI can always learn the language you want to use, but it *cannot learn your problem*, by design. A specialized AI could be used to solve all your programming problems, but *not* an AI designed for statistical language imitation. What you need is a totally different kind of AI. More specifically, you need an *algebra solver* and a *machine simulator.* The only part where an LLM might be useful is translating your request, written in silly human language, into a logical problem to solve.
youtube AI Jobs 2024-06-15T23:0…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxU-slTqYtBSe52fnh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyAlS3pDzuhNqy16EB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxN_vF4zaf5t_pedeV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx2kNNG7JKc-NsFeaZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyuRPbyjoH2w2kXIXp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxdd1wSiJe9yZWaumR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwU5I9g-QNLzF0rSVl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "frustration"},
  {"id": "ytc_UgwKHnlnzoObFNGwsMh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwVe-t2OXIYqSUcKuB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwdy6uwOyKB4uJLxqZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
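A raw response in this schema is plain JSON, so it can be checked and tallied with the standard library. The sketch below is illustrative only (the two inline records use shortened, hypothetical ids, not the ones above); it parses the array and counts the `emotion` dimension:

```python
import json
from collections import Counter

# Two records in the same schema as the raw LLM response above
# (ids are shortened placeholders, not real comment ids).
raw = '''[
  {"id": "ytc_a", "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "outrage"},
  {"id": "ytc_b", "responsibility": "company", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]'''

records = json.loads(raw)

# Tally one coded dimension across all records.
emotion_counts = Counter(r["emotion"] for r in records)
print(emotion_counts["outrage"])  # 1
```

The same pattern extends to the other dimensions (`responsibility`, `reasoning`, `policy`) by swapping the key passed to the generator.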