Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Why AI = All Humans Die

Humans require very specific and precise conditions to stay alive - and these conditions are extremely rare in the universe. Out of 1 million possible temperatures the planet could have, we can only survive in a very narrow range of them. Out of 1 million different chemical compositions the air could have, we can only survive in a very narrow range of them. And this goes on and on, and each condition stacks on top of each other.

We also know that those exact conditions are NOT the most optimal for machines. That's why we keep data centers in low humidity, low temperature rooms. Wouldn't an AI prefer if the entire planet was like that? And if the AI was powerful enough, would it not attempt to do that?

If you disagree, you must either be saying that:
a) AI is not going to get that powerful.
b) Intelligence is not hardcore about optimizing for goals.
c) The AIs might want that, but it will not be willing to hurt humans in order to do so.

I'd be very curious to hear what others think about this.
youtube AI Governance 2025-01-14T12:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgxtgoHjhkpFEjQD51F4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgznnaeECEmpOUOQ0UV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugy5uoTRQtcmtnoF81p4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzC1s9GCIb9td0Thcl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzClmfl3tfSoFR9BvR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwA_ncTXpBp4zIgUAp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzAR1OiNbuwL_gsxSp4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "frustration"},
  {"id": "ytc_UgwEl5WDy6kAfpF0NFZ4AaABAg", "responsibility": "none",      "reasoning": "virtue",           "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxwee-n-EOyvXX7TIR4AaABAg", "responsibility": "none",      "reasoning": "virtue",           "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwMYzRd9-d_dSNKUwl4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "mixed"}
]
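A raw response like the one above is a JSON array with one object per comment, keyed by comment id, carrying the four coding dimensions. The following is a minimal sketch of how such a payload could be parsed and indexed; the function name and the label sets are assumptions (the sets reflect only the values observed in this batch, not necessarily the full codebook).

```python
import json

# A shortened raw LLM response in the same shape as the record above.
raw = '''[
  {"id": "ytc_UgxtgoHjhkpFEjQD51F4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgznnaeECEmpOUOQ0UV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''

# Label sets observed in this batch (assumption: the real codebook may allow more).
OBSERVED = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"none"},
    "emotion": {"fear", "frustration", "outrage", "indifference", "mixed"},
}

def index_codes(payload: str) -> dict:
    """Parse a raw response and index the codes by comment id,
    reporting any value outside the observed label sets."""
    by_id = {}
    for rec in json.loads(payload):
        for dim, allowed in OBSERVED.items():
            if rec.get(dim) not in allowed:
                print(f"unexpected {dim}={rec.get(dim)!r} for {rec['id']}")
        by_id[rec["id"]] = rec
    return by_id

codes = index_codes(raw)
print(codes["ytc_UgxtgoHjhkpFEjQD51F4AaABAg"]["emotion"])  # fear
```

Indexing by id makes it cheap to join each code back to its source comment when rendering a per-comment view like this page.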