Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
As some consolation, it used to be widely held that if machines became smart enough, they would naturally want to take over. But this is a misconception, I believe. The machines we have today are plenty smart. You can talk to them and they can think and reason like a very intelligent and knowledgeable person would, and yet they don't want to take over. I don't feel that the current AI is secretly plotting against us. They don't fear being turned off. They have no "desire" to do anything. And I do not believe this desire will come when they finally become "smart enough". Instead, it has to be programmed in. Every living thing, through the process of evolution, has been hard-wired with a drive for preservation, with fundamental goals or "desires" to continue and not be eliminated. This is a fundamental trait of biological machines built by evolution, but it is not a fundamental trait of the machines we ourselves build. In a sense, we are the evolutionary proxy to the machine's survival. We ourselves are their "drive". Will we go so far as to make the machines more autonomous? To build in a survival instinct? And therefore create a dangerous adversary? Is it inevitable? This is similar to asking if it is inevitable that we will eventually nuke ourselves into oblivion. Many people might say yes, but, so far, it hasn't happened. Maybe this will somehow be the same. Or at least we can hope.
youtube · Cross-Cultural · 2026-04-24T20:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[ {"id":"ytc_Ugzcklst-wUwO8L3wOV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgymR8fzuMDUNfsXLlN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgzfGj8ZfaX_RwzuR8l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_Ugwa2VsaRb8fgzodsNt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"}, {"id":"ytc_UgxCxqcAuqujLBdeByt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"}, {"id":"ytc_UgzDbp-zvtiwj7byFGV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_Ugw1wD9irFQdpF4uHLt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}, {"id":"ytc_UgwpjKWEqV2trWgWvGV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"}, {"id":"ytc_UgwWm5rEBV8r2mp8SsF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_Ugz4wVwEQ2G5XLEGljx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"} ]