Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
First take off, the Singularity, call it what you want. Humans aren't capable of seeing past the point of trillions of operations per second, every second, no sleep in an algorithmic self improvement explosion. Science fiction is an interesting window to these discussions and so plot lines such as Dune may be valuable in planning ahead. But the current administration in the U.S. and elsewhere has me less than hopeful. The movie "Don't Look Up" is sadly the best commentary of what we're dealing with here as an Extinction Level Event. As an armchair futurist, rabid consumer of all that is AI - I'm reading AI 2027 now by Daniel Kokotajlo, et al., I try to imagine just how AI can wipe us all out: change the composition of gases on earth to choke us out, virus with an R naught of 100+ we never survive, drive us all off the road using our collision avoidance in our vehicles (this is a joke), launch a nuclear holocaust, etc. None of them come remotely close to what may or will occur as nobody is capable of predicting this. That should horrify anyone and everyone. I have 3 children, so I'm more than a casual observer. I'm contemplating a $75 membership to the IASEAI to be more than "armchair" in this future.
youtube AI Governance 2025-12-06T21:5…
Coding Result
Dimension        Value
Responsibility   government
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugwtu9j7l_eD0QYDslt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxY0CsU57Y1yTEirCh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgytD70lbCA5ZXOIefJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzCBwOlPLO-wgqmkdV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwo-8xcgrxWzpcYmrh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgypLUvHadvObIFszAN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwn4G_vyyD41TNOAZh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwFVxBHfs4VXg_eT2h4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugx9SpjjoedyAy11pAl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy3fGQW-SZQenydZqN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
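The raw response is a JSON array with one coding record per comment, keyed by the comment `id`. A minimal sketch of how such a response could be parsed and looked up by id (the two embedded records are copied from the array above; the parsing approach itself is an assumption, not the tool's actual implementation):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment
# (truncated here to two records from the full response above).
raw = '''[
  {"id": "ytc_UgytD70lbCA5ZXOIefJ4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugwtu9j7l_eD0QYDslt4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

# Index records by comment id so any comment's codes can be retrieved directly.
codes = {rec["id"]: rec for rec in json.loads(raw)}

record = codes["ytc_UgytD70lbCA5ZXOIefJ4AaABAg"]
print(record["policy"], record["emotion"])  # regulate fear
```

This is how the "Coding Result" table for the comment above could be derived from the raw response: the record whose `id` matches the comment supplies the values for each dimension.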