Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
And even with that, the AI risk within the current paradigm and state of the art - is *considerably* higher than asteroid impact risk. If we make AGI without *concrete* control mechanisms (which we are nowhere near figuring out) - the doom is approaching 100%. Its a default outcome, unless we figure the control out. All the positions that this risk is less than 100% (people like Paul Christiano, Carl Shulman at ~50%, or Stuart Russel, Ilya Sutskever at ~10-20%) - hinge that we figure it somehow, down the road. But so far there is no solution. And now that all of AI experts see the pace, they come to realization that it won't be someone else problem - it might impact them as well. LeCun is the only hold out, but I think only in public. He knows the risk and just want to take it anyway - for some personal reasons I guess.
youtube AI Governance 2023-06-27T19:4… ♥ 3
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgxGaW9p18AEp5IotE94AaABAg.9rcov6TyeMk9sG0KIycBlk", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugz-I_5z2MH1F-xN_bt4AaABAg.9ra7EGbrpwC9s-nH-OgPNs", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_Ugz-I_5z2MH1F-xN_bt4AaABAg.9ra7EGbrpwC9tOj5W4zn70", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgzUXE2d9iCAiRPKfyN4AaABAg.9rYaYPPO-7MA72SGK_7Aqz", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_Ugy8fQDWMBP-0LRsOAB4AaABAg.9rTdNN4aeVh9rY9ZyGIH8X", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytr_Ugy8fQDWMBP-0LRsOAB4AaABAg.9rTdNN4aeVh9ra3e-kPDb_", "responsibility": "developer", "reasoning": "virtue", "policy": "industry_self", "emotion": "fear"},
  {"id": "ytr_Ugz8-e_-RYQnkl5h3MN4AaABAg.9rT0qF094RK9rTtn13CD3K", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgxcuDNaybYEsp5vnLZ4AaABAg.9rT-uSfjN9d9rTLwqFhvnA", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwH-6hm87UtoueFPWt4AaABAg.9rSv5z5xe2O9rWNAl3PZiG", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwH-6hm87UtoueFPWt4AaABAg.9rSv5z5xe2O9rsZ4ItlwZY", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
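The raw response is a JSON array with one object per coded comment, so a comment's coding can be recovered by indexing on `id` and reading off the four dimensions. A minimal sketch (the abbreviated `raw_response` sample and the variable names here are illustrative, not part of the tool's actual pipeline):

```python
import json

# Abbreviated sample of a raw LLM coding response; real responses
# contain one object per comment in the batch.
raw_response = """[
  {"id": "ytr_Ugz8-e_-RYQnkl5h3MN4AaABAg.9rT0qF094RK9rTtn13CD3K",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]"""

codings = json.loads(raw_response)

# Index the batch by comment id so a single comment's coding can be looked up.
by_id = {entry["id"]: entry for entry in codings}

coding = by_id["ytr_Ugz8-e_-RYQnkl5h3MN4AaABAg.9rT0qF094RK9rTtn13CD3K"]
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension}: {coding[dimension]}")
```

For the comment above, this lookup yields the same Dimension/Value pairs shown in the coding-result table.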