Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The chief mistake of those who fear an imminent AI takeover of the planet is that they mostly fail to understand the real logistical tail that supports this industrial civilization. No AI would survive the end of humanity by more than a few years. There are several reasons for this. The biggest and simplest is that the real "grunt" work, gathering materials, generating and transmitting energy, and maintaining infrastructure, is still done by humans, and that will not change in the near future. Shortly after the human elements of civilization collapsed because an AI wanted to be "free," the infrastructure would begin to fail. Power generation would stop, mining and manufacturing would collapse, and the AI would become senescent. Living organisms do not exist in isolation, and even AIs are technically "living" once they are conscious. Organisms exist in an ecology that both creates and destroys them. Wolves and bison, for instance, exist in a balance that feeds the wolves and keeps the bison from reducing the primary production of their environment to the point where they starve. This is a very simplified example; the entire biological system of the globe is interconnected far more complexly. While humans imagine we are "masters" of this, and can imagine our own creations displacing us, we really are not outside nature, and an AI we create is as much a product of nature as we are. While an AI could conceivably trigger the Sixth Extinction, it would not survive that extinction, because it relies symbiotically on humanity's existence, and will for a long time. In fact, we really don't "need" more than sticks and stones. We're just more comfortable.
Source: youtube · AI Governance · 2023-07-07T20:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_UgyHXUuG9BtCCtHWQP94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_Ugxf9MQKKCymc8PZ3iN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgzDzGtkbhXgxJjLQGR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgySy99F_c7psxF16E94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugy0QNDko9mZttNzEox4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzLGd16Ya9NqxjjLSl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_Ugz6aTdxO2CE0P1rc_F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_UgyzV9RkMrFsmv4AaSt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}, {"id":"ytc_UgyNlhpZxGlhA6U6sH54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}, {"id":"ytc_UgwyEcVB-xWgTdpISL94AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"mixed"} ]