Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I know how to program in C because of microcontrollers, but not in JS for a web app :,D But I needed one for my job, something to make tracking reports easy, and the good thing about HTML+JS is that you can fix it on the fly. So I started to learn it. The PC at work runs Win11, and since I don't know that much, I ask Copilot to make parts of the code, but I strictly define "this is the input, this is the output I need". Many times I lost a lot of time; even when the AI can give a lot of code, I have to fix 60% of it, but not knowing how to make that 80% myself, it is still a save. Last time, I was trying to make a template system: you write a text as a template, use some markers as variables, the app loads it all, and reuses the variables if you change the template. But the AI was using RegEx to recover the vars, and it was a whole mess. In the end I fixed it using just strings and parsing manually, like I have done a lot on my MCUs with serial comms: search for the start marker, search for the end, verify to check for false results, and add to the array. More work, but I had lost like 2 hours trying to fix it using Copilot and RegEx :,D AI is nice, but you still need to check what it is doing; it's like a hyperactive child trying to help but making a mess. Btw, you need to add another critical check to the AIs: ethics. Someday someone will try to put a worm or trojan inside your code if you let it write everything.
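The marker-scanning approach the commenter describes (find start marker, find end marker, verify, append to an array) can be sketched roughly as below. The `{{name}}` marker syntax and the `extractVars` name are assumptions for illustration; the comment does not say which markers the actual app uses.

```typescript
// Extract template variable names by manual string scanning instead of
// RegEx, in the spirit of the serial-comms parsing the commenter mentions.
// Assumption: variables are delimited as {{name}}.
function extractVars(template: string): string[] {
  const vars: string[] = [];
  let pos = 0;
  for (;;) {
    const start = template.indexOf("{{", pos); // search for the start marker
    if (start === -1) break;                   // no more variables
    const end = template.indexOf("}}", start + 2); // search for the end marker
    if (end === -1) break;                     // unmatched start: stop
    const name = template.slice(start + 2, end).trim();
    // verify: skip false results (empty names, nested start markers, duplicates)
    if (name.length > 0 && !name.includes("{{") && !vars.includes(name)) {
      vars.push(name);
    }
    pos = end + 2; // continue scanning after the end marker
  }
  return vars;
}

console.log(extractVars("Report {{date}} by {{author}}, re {{date}}"));
// -> [ 'date', 'author' ]
```

One reason this beats a quick RegEx here is that each rejection rule (empty name, unmatched marker, duplicate) is an explicit, debuggable branch rather than a pattern tweak.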
youtube AI Jobs 2026-03-11T01:5…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       virtue
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgyVcQ5cmAL3fb8ij_t4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzr02reWKGQltTRV_94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxSQcSjCvFd54xhyeV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx9SZQzLNAffawvmO94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzsLODuJQJoRxUi85p4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz9waY-DD8Jam4sq4t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxTCJT-Tu3U48vJzOJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyM4SQcbRNZhZkKyQN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwuf_g9RZT7zCST1E54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxnZ4s_0wysjZ0bi614AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]