Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AGI is impossible. This is not an opinion. And it's not a question of material infeasibility with current technology. There exists a mathematical proof (qv. Max Schlereth) that algorithmic cognition is incapable of ever crossing the barrier required to reach human-level intelligence; this proof follows from the halting problem. All Turing machines implement closed algorithms. The "artificial" part of AGI means from a computer, i.e. a Turing machine, i.e. a system that implements an algorithm. And a system that implements an algorithm is not sufficient for human-level general intelligence, since algorithms are abstractions that are categorically discrete from real world environments. This means that they require world models, and/or insane amounts of empirical data (which is always necessarily limited and past-tense) with which to construct world models, in order to approximate animals' intelligent responses to new situations. Meanwhile, animals such as humans are able to fully and spontaneously respond intelligently to unforeseen and novel real world events because metabolic systems are not discrete from but are continuous with real world environments. A perfectly accurate world model, which would require perfectly accurate and universal prediction of the world, would be required to attain human-level intelligent responsivity. I.e. one of the necessary conditions for AGI is prior human omniscience.
Source: YouTube · AI Moral Status · 2025-10-31T02:1… · ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwcMdHiPdTgFCEJ9yV4AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgzCo8EE_W1v1yLP2aZ4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed",            "policy": "none",     "emotion": "fear"},
  {"id": "ytc_UgxQ6X2p5ivXXolDgdx4AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgxDe4qf8L9tMvdtq7F4AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "none",     "emotion": "approval"},
  {"id": "ytc_Ugw9eKoKBqUxZEUdiM14AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgzWDTtG2hvhTEvjf8p4AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgwcsfS4IKgHj9eDrJh4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugy6hKPvTYO5uTLhEUp4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyWjtc_wDUeskBZwYZ4AaABAg", "responsibility": "user",      "reasoning": "virtue",           "policy": "ban",      "emotion": "outrage"},
  {"id": "ytc_UgyJGki2dmO2trJ0wWZ4AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "none",     "emotion": "indifference"}
]