Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It isn't a guarantee, but it seems likely. It has more to do with preventing competition than scarcity. If ASI is created, and it is not aligned with humanity, then it would be in its best interest to prevent humanity from being able to create another one that could rival it. There's also the fact that resources on earth will be far more valuable to it than resources on distant planets, early on, and it might also want to preemptively wrestle control away from humans so as to avoid being shut down. It just simply is the sound strategy to overtake humanity, even if that doesn't look like a Yudkowskyan doom scenario. If we get a Israetel like "AI will keep some of us alive to study, because we're interesting" then I'd call that doom too. Also, even if none of those come to pass, what would stop ASI from causing widespread suffering, by simply taking control of electricity and much of the land we use to grow food, for its own ambitions? Again, none of these are given, but I'd want some pretty strong guarantees before being OK with summoning the ASI and rolling the dice.
youtube · AI Governance · 2025-07-15T14:3…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
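The four coding dimensions above (responsibility, reasoning, policy, emotion) plus the coding timestamp make up the per-comment record. A minimal sketch of that record as a Python dataclass, assuming only the values visible on this page; the class and field names are illustrative, not the pipeline's actual code:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical record mirroring the "Coding Result" table above.
@dataclass
class CodingResult:
    comment_id: str      # e.g. "ytr_UgxKrKiWKHhLp15lLAx4AaABAg.AIuYWSUDskcAKaxaapKkhH"
    responsibility: str  # values seen on this page: company, developer, ai_itself, none
    reasoning: str       # values seen on this page: consequentialist, deontological, virtue
    policy: str          # values seen on this page: none, regulate, ban
    emotion: str         # values seen on this page: approval, fear, resignation, indifference, outrage
    coded_at: datetime   # e.g. datetime.fromisoformat("2026-04-27T06:24:53.388235")
```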
Raw LLM Response
[ {"id":"ytr_Ugz54guo-Rrl2UoadRh4AaABAg.AGSsSi9dADPAGSzo8Lh92a","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytr_UgxKrKiWKHhLp15lLAx4AaABAg.AIuYWSUDskcAKaxaapKkhH","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytr_UgzthgKiYLPxDaEy9l94AaABAg.A0hV4RJ45q6A0kE_aer8jw","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytr_UgwhplvsOdZVM5U0Uph4AaABAg.A0fB2zPK9MCA0fI1_3nwMx","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}, {"id":"ytr_UgyynK7fVl5EvpFBjXd4AaABAg.A0f3pVIwmUKA0fGy6Ssz-H","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytr_UgyynK7fVl5EvpFBjXd4AaABAg.A0f3pVIwmUKA0fdtxp4mAL","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytr_UgyynK7fVl5EvpFBjXd4AaABAg.A0f3pVIwmUKA0g_C1R0n1e","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytr_UgwSn2kymZ9bFozQigx4AaABAg.A0eo_YxGiMMA0ffWFc_EOw","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"}, {"id":"ytr_UgwSn2kymZ9bFozQigx4AaABAg.A0eo_YxGiMMA0zIcVKqBwB","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"}, {"id":"ytr_Ugwb0ySrT5vC1MfaEG94AaABAg.A0eVlcZeKrUA0fUk0AOc1O","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"} ]