Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Well, I'd say it has always been obvious that, when AI would've been invented, something like this had to happen. That's one of the motivations why it should've never go public (even though I find it would've been worse the other way). In general, since we invented AI in the form it currently exists, we were able to distribute it to the general public, and that means there are people that will talk to it as if it is human, people that will worship it, as well as (lots of) people that will talk to it not understanding what they are talking with and how it works, and believing it is doing things it is not actually doing... I mean, I am a technician that normally works with AI. But I don't think you need to be an expert to understand that 'your gpt' is the same exact thing every other user of that platform has, and maybe with at least a bit of studying, you'll end up understanding that the only things that makes your chatbot act different than others, is because it is working on a small box called context that gives it all of the data it has to infer with, and that makes "act" as it does. And that, friends, is because the AI state of the art is a monolithic object that DOES NOT evolve after its training. But I guess there is no way to explain that to people who won't listen... Still please keep trying, knowledge is our only way out of madness
youtube AI Moral Status 2025-07-09T17:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  distributed
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgywHTgtYMdl57YV8cR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxjgFuVIJltjFAuFOl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxaSLngq8R2zAlUtLZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwyvBK3oRbI5We4zxV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz0OHH2jKUpXh6w_ax4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyC6srb9tyDxAF9SEh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyncUx2piGzgJsAQNl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxpow7uGBcIUXURt8J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyE0qIAwYcJ0uKssOx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy5vpFBlD2rYxRs7Xd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"skepticism"}
]
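The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions plus the comment's id. A minimal sketch of how a per-comment coding could be pulled out of such a response — the `coding_for` helper is ours, not part of any tool, and the two-element array here is an abbreviated stand-in for the full response:

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment codings.
# The first id is the one whose coding matches the table above.
raw = """[
 {"id":"ytc_Ugxpow7uGBcIUXURt8J4AaABAg","responsibility":"distributed",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugy5vpFBlD2rYxRs7Xd4AaABAg","responsibility":"none",
  "reasoning":"consequentialist","policy":"none","emotion":"skepticism"}
]"""

def coding_for(response_text, comment_id):
    """Return the coding dict for one comment id, or None if absent."""
    for item in json.loads(response_text):
        if item["id"] == comment_id:
            # Drop the id so only the four coding dimensions remain.
            return {k: v for k, v in item.items() if k != "id"}
    return None

print(coding_for(raw, "ytc_Ugxpow7uGBcIUXURt8J4AaABAg"))
# {'responsibility': 'distributed', 'reasoning': 'consequentialist',
#  'policy': 'regulate', 'emotion': 'fear'}
```

Looking up the id from the coding result reproduces the distributed / consequentialist / regulate / fear row shown in the table.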