Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@Alanpoeta The fact alone that you liken it to neural connections shows that you don't understand what an LLM is on a fundamental level. They're nothing alike. For example: an AI model can not take training data and input from users, then give a reasoned response. It can make a picture based on one it was shown in training, but it can not create something wholly original. It can solve problems it has been trained on, but if a scenario was not in its training data, it can not find a solution. Even if it is blatantly obvious. AGI under current tech, is an impossibility. And every expert in the industry will tell you that unless they have something to sell you. And the only solution the tech industry has found so far is to try and train on bigger data sets, build bigger datacenters and dump even more hundreds of billions down the drain in the hope that it will somehow get accurate enough to replace humans. Spoiler alert, it won't. Not for jobs that require thinking or encounter any unpredictable or complex problems on a regular basis. Aka developers are as safe as can be
youtube AI Jobs 2026-01-28T09:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgzXIWB_S9KUa1QatYl4AaABAg.AQiXMN_TktuAQma1POoxJ6", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzXIWB_S9KUa1QatYl4AaABAg.AQiXMN_TktuAQmpIsup1IT", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgykJNJEQWPlmNFKdud4AaABAg.ATuRJYvnOAAAUV7-nuZRuh", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytr_UgwVois_AViyVgTVCQh4AaABAg.ASXQy45FmUUASZ-tpbQorq", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugwx0u6rYosvvnwnOMp4AaABAg.AS1bP3PnM1xAS35E18A7c8", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_Ugwx0u6rYosvvnwnOMp4AaABAg.AS1bP3PnM1xASSY1-aXbAG", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugwx0u6rYosvvnwnOMp4AaABAg.AS1bP3PnM1xASUeETu6-C9", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugwx0u6rYosvvnwnOMp4AaABAg.AS1bP3PnM1xASWeg9L2KoT", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwIE769G9YK8lRiyoF4AaABAg.ARyFXysXknrASZGoCG86vJ", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgwIE769G9YK8lRiyoF4AaABAg.ARyFXysXknrASZHoaCOPi6", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
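The raw response is a JSON array with one record per comment, each carrying an `id` plus the four coding dimensions. A minimal sketch of how such a response could be parsed and sanity-checked in Python; the field names come from the response above, but the set of allowed values per dimension is an assumption inferred from the codes observed here (the real codebook may define more categories), and `parse_codes` is a hypothetical helper, not part of the coding pipeline:

```python
import json

# Allowed values per dimension (ASSUMED from the codes observed above;
# the actual codebook may permit additional categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "ban"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: {dimension: value}},
    rejecting any value outside the assumed codebook."""
    codes = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        dims = {k: v for k, v in rec.items() if k != "id"}
        for dim, val in dims.items():
            if val not in ALLOWED.get(dim, set()):
                raise ValueError(f"{cid}: unexpected {dim}={val!r}")
        codes[cid] = dims
    return codes

# Example with a made-up ID, shaped like the records above.
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"fear"}]')
print(parse_codes(raw)["ytr_example"]["emotion"])  # prints: fear
```

Keying the result by comment ID makes it easy to join the model's codes back to the original comments when inspecting disagreements.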