Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I have had a deep interest in computers and artificial intelligence since the 60s. Never once have I believed AI in itself is an inherent threat to human existence. AI in any form is a human tool. While I can imagine humanoid AI becoming something more than just a human tool, I still don't see AI in any form taking over capitalism. To have a capitalist society you must have humans with the ability to buy and sell goods and services. I can't support your company's AI products if I can't hold employment because I have been replaced by robots.

All through human existence, new tools have become part of human industry, and nearly every one of these tools was seen as a threat to humanity in some way. However advanced I imagine AI ultimately becoming, capitalism will find a way for it to make people wealthy. You can't make my company or me wealthy unless I have the ability to gather wealth myself. The reason World War Three has not happened is that it would end wealth and power. Putin enjoys his wealth. America enjoys wealth and power.

There is only one way that I can possibly see people not being able to be employed: as in the world of Star Trek and the Federation, wealth may become unnecessary. Power will never go away, and to hold that power you will have to provide something to the everyday person, such as a sustainable credit for the average person. Even if we could reach this level of human society, it would not happen within the next hundred years. And even if humans could construct such a society, it would likely become an even larger threat than AI itself. Such a society would likely bring on the "Mark of the Beast", and that would be the end of humans in this realm of existence. Regardless of whether you believe in Christ as the God of everything, a society where you can't buy or sell without the approval of whoever or whatever is in control requires "The Mark of the Beast".
youtube AI Governance 2026-03-28T18:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxsaYuHydwA65gMcZJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzkwkWg4FOptRK1O894AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwPa7vTfspmUesTJVR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz1-OpVwyztYkJVAMd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyqdPvH-XGIbfX90CR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyUAhkxyZ8k8HxJYi54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwk52TMyw_Q1G7_q1J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzJgxiOcUu1pqnlr4Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyEow2UU8AD6As_Jc54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzK9m7tB0lVIS2va8Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
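The raw response is a JSON array with one coding record per comment, keyed by comment id. A minimal sketch of pulling one comment's codes out of such a batch (the parsing approach and variable names here are illustrative, not part of the actual pipeline):

```python
import json

# A trimmed example of the batch format shown above: a JSON array of
# per-comment coding records with the four coded dimensions.
raw = '''[
  {"id": "ytc_UgwPa7vTfspmUesTJVR4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]'''

# Index the records by comment id for O(1) lookup.
records = {rec["id"]: rec for rec in json.loads(raw)}

codes = records["ytc_UgwPa7vTfspmUesTJVR4AaABAg"]
print(codes["emotion"])  # prints "indifference"
```

This is how the single-comment "Coding Result" shown above can be recovered from the batch response: the record whose id matches the displayed comment supplies its dimension values.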