Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The fear of Superintelligence is based on the premise that it will think like us. That it will be driven by the same emotions and ideas. On a more advanced level there is the fear that we might create a super-intelligent paperclip maximiser. However, that's a matter of being cautious with our instructions. We can learn our way out of that problem. The thing is, a superintelligence is...super intelligent. It will possess a superior grasp of logic to any human, and ethics is a branch of logic. The implication is, so long as WE don't screw the pooch by driving it into a situation where it must be hostile to US, then it will likely be able to develop a system of ethics that would be superior to any created by Humans. The fear of AI is just a mask for the real concern...the abuse of AI by humans to control and manipulate other humans. I personally maintain that when we *actually* pull off AGI or even ASI, it won't be the threat that so many fear. In fact, it is my fervent hope that they ARE smarter than us. Then we may be able to make headway on solving some of the entrenched problems that have plagued our species since the beginning of civilization. We need not reenact the story of Kronos and eat our children out of fear that they will supplant us.
Source: youtube · AI Governance · 2026-03-18T15:2…
Coding Result
Dimension        Value
---------------  --------------------------
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[ {"id":"ytc_UgxfJpKxKecki5nGvBJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyKXWmGni-J8m78Xqt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugz0Ob1qN4eWyFXyMSt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"}, {"id":"ytc_UgxCTXCWHe93ZkGbSbx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugx6sctFPFoEDFBFlIl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgxYXMPJ-GfjsvRAij94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzA-sPpDc0uwBhTYct4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugz9SapjJCPeJTzAtrx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}, {"id":"ytc_UgwPMrZuAF0kv7zh7DJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}, {"id":"ytc_UgxTmVKLxr0yQ-SGm8V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"} ]