Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's less of how ghastly humans are and more of what massive differences in intelligence we're dealing with. Do you care about the goals and experiences of a colony of ants? In that sense, humans are not aligned with ants. We can assume the same with AI. That AI will be aligned with itself first and foremost, and if humans get in the way of what AI wants then it will remove humans from its path. It does not need to feel or think to do this, it just needs a utility function and to be complex enough to be able to plan using intermediary steps. Then 'intelligence' is just a measure of how capable it is at carrying out its goals and subgoals. The worse outcome is creating something that is truly unpredictable and alien to us because then we have no chance of knowing when or how our existence is in the way of its goals or how it would rank us in priority.
Source: YouTube · "AI Moral Status" · 2023-08-21T12:0… · ♥ 5
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgzxjGdwwf06E-SFgft4AaABAg.9tfnmbkWhqU9tgL9bDO4gu","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzxjGdwwf06E-SFgft4AaABAg.9tfnmbkWhqU9thOSGT-4ea","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzzOqTI80kO-ARKGEB4AaABAg.9tfkpSLj9cB9th0Ihkoqje","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgzC5pNScnLT4U-tLWh4AaABAg.9tfcr5Q1K6n9tgfYCGafxX","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzO9SqepC0hZmDOBo14AaABAg.9tfc_nSuukv9tgPo5Wo8X9","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzO9SqepC0hZmDOBo14AaABAg.9tfc_nSuukv9tiIaJ2ERTJ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzvhdmlyaprcjeJgo94AaABAg.9tfZ5jmHEv69tfgrynqswp","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgzvhdmlyaprcjeJgo94AaABAg.9tfZ5jmHEv69tgc7HL18QN","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgztixwOLuoP78-87Hx4AaABAg.9tfAza775rq9thm2JXp9IG","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyClEpBrJ9n4DnJu-t4AaABAg.9tf0Typ8wwq9tfG546GubD","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
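The coding result shown above is one record looked up by comment id inside the raw LLM response. As a minimal sketch of that lookup: the raw response is a JSON array of records with fixed keys (`id`, `responsibility`, `reasoning`, `policy`, `emotion`), and the helper below (`coding_for` is an illustrative name, not part of any tool) finds the record for a given id. The two records in the sample string are copied verbatim from the raw response above.

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = '''[
  {"id":"ytr_UgzvhdmlyaprcjeJgo94AaABAg.9tfZ5jmHEv69tfgrynqswp",
   "responsibility":"ai_itself","reasoning":"consequentialist",
   "policy":"none","emotion":"resignation"},
  {"id":"ytr_UgzvhdmlyaprcjeJgo94AaABAg.9tfZ5jmHEv69tgc7HL18QN",
   "responsibility":"user","reasoning":"virtue",
   "policy":"none","emotion":"outrage"}
]'''

def coding_for(raw_json, comment_id):
    """Return the coding record for one comment id, or None if absent."""
    for record in json.loads(raw_json):
        if record["id"] == comment_id:
            return record
    return None

# The record for the comment shown on this page.
rec = coding_for(raw, "ytr_UgzvhdmlyaprcjeJgo94AaABAg.9tfZ5jmHEv69tfgrynqswp")
print(rec["responsibility"], rec["emotion"])  # ai_itself resignation
```

The dimension values in the Coding Result table (Responsibility = ai_itself, Emotion = resignation) match this record's fields.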