Raw LLM Responses

Inspect the exact model output behind each coded comment.

Comment
I have a different extinction analogy for AI. People used to think that modern humans pushed Neanderthals into extinction. But now we know that Neanderthals interbred with humans and simply got absorbed by us. Similarly, it seems likely that mitochondria used to be a free-living organism, but now they are inherent parts of every eukaryotic cell. This could be how AI drives us into "extinction." We have people working on human-computer interfaces. Imagine if that interface got quite good, and allowed humans and AIs to combine. The whole could be greater than the sum of its parts. The Combines could become the dominant species, the apex predators, the ones in control of ordinary humans and ordinary AIs. Perhaps Combines would eventually outcompete both bio-only and silicon-only, and be the only ones left. And I wonder if such a transition would be held back even by Asimov's Three Laws of Robotics, or something like that. Because NOT making Combines could easily be construed as "allowing a human to come to harm." Once Combines exist in sufficient numbers, it could easily be considered logical that Combines are a superior type of human and therefore ought to be given preference in any resource scarcity. Eventually ordinary humans might be considered by Combines as we presently consider dogs or great apes: something that MIGHT have a primitive form of sentience. But my point is: would turning into Combines count as humanity going extinct???
Source: youtube · AI Governance · 2025-07-01T18:2…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytc_UgxwV-Roy4UfPaAoSJt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"}, {"id":"ytc_Ugyd6UAWDkD57EYapxV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}, {"id":"ytc_UgwyghxzHMBEc7LbT8d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_Ugy3AUJrcWBiKthyx094AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgwRmrYcQisRroTiEHl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgwRVvAED1BpjnJlxtl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgzG7lTuFEJAKKqOFeJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_Ugw4CzG6nTe0v-vVUSt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgxbTlJCufdWrGzJrQl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgylEGXLUyzFhD6hImh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"} ]