Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
My thoughts are that neural networks are, at their very heart, imprecise entities. Working in meteorology, the thought of anything being represented by an 8-bit floating-point number seems farcical at best. That works great for interaction and things like ChatGPT, where the bulk of the produced output is literally copypasta from Stack Exchange and diverse tutorial websites. Not unlike how most programming is done by people today, LOL. I see the future of innovative code as always being people-based; an AI has months of training, whereas Mother Nature has had millions or billions of years to train the data in our heads.

I think we are at a crossroads, or will be soon, where a new technology based again on what worked for nature is going to come to light: self-modifying code. Instead of emulating the human brain, emulate the genetic code that goes into every organism; that is how we got to the brain in the first place. This should be most successful, IMO, not on textual source code but on machine instructions themselves: give a scoring system based on a desired output for a given input, modify the machine code randomly bit by bit until an optimal solution is reached, then disassemble from there. GPUs processing thousands of generations of code on an emulated processor per microsecond could tackle the most complex queries we can throw at them in short order, thanks to the principles of natural selection and evolution.
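The evolutionary scheme the comment sketches (score candidates against a desired output, flip bits at random, keep the fitter variant) can be illustrated with a toy hill-climber over a plain byte string. This is a minimal sketch, not real machine-code evolution: the "program" is just bytes, and the hypothetical fitness function counts bytes matching a fixed target.

```python
import random

random.seed(0)  # deterministic run for illustration

TARGET = b"hello"  # stand-in for the "desired output for a given input"

def score(candidate: bytes) -> int:
    # Higher is better: count positions already matching the target.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate: bytes) -> bytes:
    # Flip one random bit, mimicking bit-level mutation of machine code.
    data = bytearray(candidate)
    i = random.randrange(len(data))
    data[i] ^= 1 << random.randrange(8)
    return bytes(data)

def evolve(generations: int = 200_000) -> bytes:
    best = bytes(len(TARGET))  # start from all-zero "code"
    for _ in range(generations):
        child = mutate(best)
        if score(child) >= score(best):
            best = child  # keep the fitter (or equally fit) variant
        if best == TARGET:
            break
    return best

print(evolve())
```

With a generous generation budget the walk almost always reaches the target; real genetic programming adds populations, crossover, and a sandboxed execution step that this sketch omits.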
youtube AI Jobs 2024-01-17T10:3…
Coding Result
Dimension      | Value
Responsibility | ai_itself
Reasoning      | consequentialist
Policy         | none
Emotion        | resignation
Coded at       | 2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugx41Wa78wXM3Vr_mGd4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwT91joY2v157zT2CR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw8ZMU-BJir8NuCZ3V4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwqzyiYuGOoJ6zbNaV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx4gAU7cervqGV7ytt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz1vk-GF9zjK7Uto7d4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwv19KB-_D4gfC4Hg54AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyRgLAOmQgTNHpjdgJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy_xfgl2jcPjfp29vB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyyN2dPRYBJrX-8DhN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "fear"}
]
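The raw response is a JSON array with one code record per comment id, and the table above is just the record matching this comment. A minimal sketch of that lookup, using field names exactly as they appear in the payload (the one-record `raw` string here is an abbreviated stand-in for the full array):

```python
import json

# Abbreviated stand-in for the full raw LLM response shown above.
raw = (
    '[{"id":"ytc_Ugy_xfgl2jcPjfp29vB4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]'
)

# Index records by comment id so any comment's codes can be looked up.
records = {rec["id"]: rec for rec in json.loads(raw)}

code = records["ytc_Ugy_xfgl2jcPjfp29vB4AaABAg"]
print(code["responsibility"], code["emotion"])  # → ai_itself resignation
```

Indexing by id rather than scanning the list keeps the lookup O(1) when the batch contains many coded comments.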