Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I might be wrong but it’s one option that as soon as we became scientifically intelligent, powerful & brazen enough to understand Human/Flora/Fauna genome & the mapping genetics templates for our own human genome, the quantum & nuclear age, the silicone revolution we killed GOD. Murdering our once beloved Deity. Severing in many cases our connection with the creator, now Darwinian evolution of species explained life on Earth going back eons?? Stephen Hawking (my hero) warned us about this danger ten or more years ago. I think the creators of this doomsday clock theory, Albert Einstein being one huge brain involved; would agree with Professor Hawking. We need wiser minds on this issue. Not the self interests of greedy corporations or the paranoia of militarily brass. Politicians have neither the intellect, wisdom, foresight or discipline to understand the potential catastrophic consequences of a militarised Artificial Intelligence… we are all living with the consequences of the Manhattan Project; once that genie was uncorked, understandably in 1939/45 & supposedly saved many lives in a potential battle of Japan in the long run how much misery have nuclear weapons wrought!! My country U.K. was a big part of the project & my own hypocrisy makes me glad we have an independent nuclear deterrent. I am forever grateful that nuclear war didn’t happen in the fear filled 1980’s when I was a kid bombarded with nuclear war warnings & drills. I wouldn’t wish that feeling of impending doom & hopelessness on the current & future generations of children. We say nuclear weapons save lives & work as deterrence, we were wrong but I understand the justification. I see no justification for AI if there is even a 0.1% chance a self aware machine can plot & carry out global human extinction sorry let’s put it back in the metaphorical bottle. In a Wiser brighter more stable future maybe!! Now ? No no no
YouTube · AI Governance · 2023-07-07T11:4…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyzW4uMrOij6gq46hR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSQurgPd4eR1JlgOZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyaJScccPQxaxCKuZx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgywfXJd4vaZxINlFxJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwMxynupjlnNgSyZcZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyLjRSUduRa_SGXrhV4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwA2LSnM73RhZxSzj94AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgywO94Ys9QiPXMsiIZ4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyUgsQ-p4eHJri4f1B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxzQ7CAXgd8r4dMG6Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
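The raw response above is a JSON array of per-comment codes, keyed by comment id. A minimal sketch of how such a batch could be parsed to recover the codes for one comment (the `code_for` helper name is hypothetical, not part of the tool; the two records are copied verbatim from the response above):

```python
import json

# Raw LLM batch response: a JSON array of per-comment coding records
# (truncated here to two records from the response above).
raw = '''[
  {"id": "ytc_UgyzW4uMrOij6gq46hR4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwA2LSnM73RhZxSzj94AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"}
]'''

def code_for(comment_id, raw_json):
    """Return the coding record for one comment id, or None if absent."""
    return next((r for r in json.loads(raw_json) if r["id"] == comment_id), None)

record = code_for("ytc_UgwA2LSnM73RhZxSzj94AaABAg", raw)
```

Looking up an id this way makes it easy to cross-check a single comment's displayed codes against the raw batch output.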