Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Check out the scenario forecast called AI 2027. It's just one scenario, and that specific set of things isn't going to happen. But it's backed by a lot of research that shows the overall trajectory. There are many reasonably likely routes to human extinction from AI. Some are sci-fi sounding like using advanced nanotech to create a new form of AI life from scratch. Some involve things we know for sure are possible, like designing an extremely deadly virus and getting someone to create it. Some just involve deftly manipulating a lot of humans and being nice to us while building up an autonomous economy that can operate without human oversight, until we become irrelevant and powerless. It's also helpful to notice that it's already impossible to "unplug" a computer virus, and ASI could operate as a massive botnet.
Source: youtube · AI Governance · 2025-08-27T09:3… · ♥ 1
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgzVV7sqMNfyHNBE9HN4AaABAg.AMISDRGbPv6AMITfx7m3x-","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzVV7sqMNfyHNBE9HN4AaABAg.AMISDRGbPv6AMIo5wbtxaH","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgzVV7sqMNfyHNBE9HN4AaABAg.AMISDRGbPv6AMIr3GHg6Jp","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgwZ0YqMbvvnWd3dP8h4AaABAg.AMIQn_XuqVbAMJDM_W6vcz","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgwZ0YqMbvvnWd3dP8h4AaABAg.AMIQn_XuqVbAMJFs4HL8V0","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgwZ0YqMbvvnWd3dP8h4AaABAg.AMIQn_XuqVbAMK8SU4PBIK","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugws38Dp8DfrtcM7b194AaABAg.AMIPmsGfXAnAMIl3p_dSVF","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgwBvlYr2UcOhyTnDKF4AaABAg.AMIP9ac57itAMIp_absgEu","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgxnIFUI0T_CoYa8uwN4AaABAg.AMIOPUHnCsVAMKmXBitGjR","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxhQ46PziHBHmXw99t4AaABAg.AMIMxqpcIu5AMUOXRxziZi","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
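The raw response above is a JSON array with one object per comment, keyed by `id` and carrying the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch response could be parsed into a per-comment lookup; the shortened `ytr_abc` id here is a hypothetical placeholder, not an id from the data:

```python
import json

# Raw LLM response: a JSON array of per-comment codes (shortened hypothetical id).
raw = (
    '[{"id":"ytr_abc","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)

# Index the coded rows by comment id for fast lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Retrieve one dimension for a given comment.
emotion = codes["ytr_abc"]["emotion"]
print(emotion)  # fear
```

In a real pipeline the `id` values would be joined back to the stored comments, and any entry missing one of the four dimension keys flagged for re-coding.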