Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
That conversation about 20 minutes in reminds me of the original movie _Westworld_. A major part of the premise of the story was that the AI that ran the place and 'bots *was not fully understood* by the developers, because they deliberately created it to be self-learning and self-programming. Hence, even the operation of the 'bots themselves was not completely understood, even by the techs who repaired and maintained them. Now; I still contend that we have not yet come close to creating "true" generalized artificial intelligence. What the marketing people love to call "AI" is simply an advanced form of what I worked on back in the late '80s, at the time termed "expert fusion systems". Basically, sifting through vast databases to chunk together potential meaning out of, for example, voice radio transmissions. It wasn't remotely _consciousness_, and I don't think it is today, either. But these systems have a *vastly* larger database to work from these days. And as noted, it's *evolutionary*. Who is to say when "consciousness" may develop - especially considering we don't even have a good idea of what consciousness *_is_*. I'm not going to panic yet. OTOH, by the time I do, it will probably already be too late...
youtube AI Governance 2023-07-07T07:5… ♥ 1
Coding Result
| Dimension | Value |
| --- | --- |
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgweDj5wWxLesj5dmJ94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyxCW4q8VS6Z4TbPEl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxMlwKjBQjbkRdFjtN4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugy1Ozh_A2dYsOutPyd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzZ2kVbiFmnP4xJagh4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwBfZgfokfETMhIxot4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxPOZ8KM0WWgRhaKlV4AaABAg", "responsibility": "none", "reasoning": "contractualist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwACQfEzLz2ScV8h5J4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw2SWvBIlnmJUHsXk94AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugyh1MOKkIfKLaT5Qrl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
```
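A minimal sketch of how a raw batch response like this might be parsed and validated before the per-comment result tables above are produced. This is an assumption about the pipeline, not its actual code: the `parse_batch` helper and the allowed-value sets are hypothetical, inferred only from the values visible in this batch (the full codebook may define more categories).

```python
import json

# Allowed values per coding dimension. Hypothetical: inferred solely from
# the values appearing in this batch, not from the project's codebook.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"none", "industry_self"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Skip anything that is not an object with a comment id.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Keep the record only if every dimension holds an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id": "ytc_UgweDj5wWxLesj5dmJ94AaABAg", '
       '"responsibility": "developer", "reasoning": "consequentialist", '
       '"policy": "none", "emotion": "indifference"}]')
records = parse_batch(raw)
print(len(records))  # 1: the record passes validation
```

A validation pass like this guards against the common failure mode where the model emits malformed JSON or an off-schema label for a dimension; rejected records can then be re-queued for recoding rather than silently corrupting the results table.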