Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
How would it be useful, from the perspective of an artificial superintelligence, to do such a thing? We, fragile, limited, and slow biological beings, will eventually appear completely useless to it. Why waste resources sending them to colonize other worlds when the ASI can do it itself much more efficiently with its robot armies? It is anthropomorphism to imagine that superintelligence will have any desire to preserve biological life. Unless we are able to implant such goals into these systems (which we are absolutely incapable of doing—this is the alignment problem, still insoluble to this day), the future superintelligence will act without concern for what happens to biological life. This is why all these experts are worried about this existential risk.
youtube AI Governance 2025-08-02T23:2… ♥ 2
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgyK2hhZrQ2ql3FhJsN4AaABAg.ALKMXhUBO5KALLxnph3VpK", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugwzpn-jVtylqsoa5qV4AaABAg.ALKIvnU39j9ALKKw4NFO3y", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugwzpn-jVtylqsoa5qV4AaABAg.ALKIvnU39j9ALKNPayW6Ec", "responsibility": "none", "reasoning": "mixed", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytr_UgwIYUnePCIQjdVCpDZ4AaABAg.ALK8JHI8rlJALMryZLsWpN", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgwIYUnePCIQjdVCpDZ4AaABAg.ALK8JHI8rlJALNZUPOcUZ4", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwIYUnePCIQjdVCpDZ4AaABAg.ALK8JHI8rlJALNt3QGfmEm", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgwIYUnePCIQjdVCpDZ4AaABAg.ALK8JHI8rlJALO2ild6oOl", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_Ugy5Hm4fEomuU6ka2bh4AaABAg.ALK6anP6VfoALKERqBobld", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgwYprA4Q2XMnF1eMo54AaABAg.ALK60U2KXOnALLFF6RERvn", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgxotUzsDNFALOvVsxN4AaABAg.ALK2q2KGiqMARksq02IW7i", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
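The per-comment coding result above is extracted from a raw batch response like this one. A minimal sketch of how such a response might be parsed and validated, assuming the label sets visible in this dump form the full codebook (the schema and function name are hypothetical, not part of the actual pipeline):

```python
import json

# Allowed values per coding dimension, inferred from the labels that
# appear in this dump (an assumption; the real codebook may define more).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "industry_self", "liability"},
    "emotion": {"approval", "fear", "indifference", "resignation", "outrage"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response into coded records, keeping only those
    whose labels fall inside the (assumed) schema."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

# Usage with a shortened example record:
raw = ('[{"id": "ytr_x", "responsibility": "ai_itself", '
       '"reasoning": "consequentialist", "policy": "none", '
       '"emotion": "resignation"}]')
coded = parse_raw_response(raw)
print(coded[0]["emotion"])
```

Validating against a fixed label set catches the common failure mode where the model invents a category outside the codebook; such records would be dropped rather than silently stored.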