Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"Even when we say we want peace and stability" Who is this mythical peace loving altruistic "we"? Humans don't say one thing in agreement. Incentives, different for each individual, combine to create human made catastrophe and crisis. How will AI differentiate between what some hypothetical doom craving Western corporatist politician wants and what a subsistence farmer in Malawi wants? They are vastly different things. For example Mr Yudkowski here wants to be more famous than Altman, and sell books, to soothe his god complex infused ego. Even if all the doomerism comes to pass, it would take only a few thousand humans or less to allow the species to survive. Like the ones who have never owned an iPhone - not every human is a carbon copy internet trawling Westerner. If that happened it would ironically solve all our other problems like over population, resource scarcity and climate change. And even if none survive, global level extinctions have happened on this planet before. Time will keep ticking by, and we'll all be dead so we won't even know.
youtube AI Governance 2025-11-18T22:0…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgyjG32YOUy-BJ4NTgl4AaABAg.AQ9BSJh1pP7AQf-PzhlAkT","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyjG32YOUy-BJ4NTgl4AaABAg.AQ9BSJh1pP7AVV7tIsEivA","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgyZZ9jHk4bbQfEP0Dx4AaABAg.AQ4UwEPAzCsAQ6hbqUr0k4","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugy4HfiN8Djm14kV0pR4AaABAg.AQ-hela8k7eAQgL20d3719","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugy4HfiN8Djm14kV0pR4AaABAg.AQ-hela8k7eATyhki_unaX","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyXF8A8_gVuWh8jAWF4AaABAg.APrBnahp569AQ-AheTQaPD","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytr_UgwH9PmWEIV3zB8Zh2F4AaABAg.APa2ECqMBskATNIK6TpVv2","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugz8bDUbH-UTVcgh8tp4AaABAg.APUwAuDaDfvAPgCAuSchC2","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugz8bDUbH-UTVcgh8tp4AaABAg.APUwAuDaDfvAPgYsl9AY7V","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugz8bDUbH-UTVcgh8tp4AaABAg.APUwAuDaDfvAPguw1pIMuR","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
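The raw response is a JSON array of per-comment records, one object per coded comment. A minimal sketch of parsing such a batch and validating each record against the dimension vocabulary (the allowed values below are inferred from the responses shown on this page, not from the tool's authoritative codebook):

```python
import json

# Allowed values per dimension, inferred from the records above
# (an assumption, not the official coding scheme).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only records whose
    value for every dimension is in the allowed vocabulary."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytr_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(len(validate_records(raw)))  # → 1
```

Records that fail validation (e.g. a hallucinated label outside the vocabulary) are silently dropped here; a real pipeline might instead flag them for re-coding.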