Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@Alexander_Kale "People also said other things were an extinction risk and we are still here" is a good heuristic if you don't know anything about the technology. Obviously it's possible in principle for humanity to go extinct, and the invention of the atomic bomb brought about an era where we could wipe out most of human civilization pretty quickly. But nukes don't have their own goals. They don't make plans. They don't try to prevent themselves from being shut off. I highly recommend you look into the actual arguments and evidence here, rather than dismissing it up front. The fact that most leading experts are concerned should at least serve as a signal that this is worth looking into. The AI Safety Info wiki is a great place to start for beginners. Lastly, we are still alive. If the development of AGI was made illegal around the world today, superintelligence would never be created. Counterintuitively, frontier AI development _is_ actually possible to regulate and track and enforce. We can get there, because money is no good if no one can spend it. We have to try, or else we are abdicating our responsibility for the lives of our loved ones.
youtube AI Governance 2025-08-28T19:4…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgxruEjafI6CsLocy7Z4AaABAg.AMLJsptIyK_AORfGra1UjP","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytr_UgzHLQ93nAOv4VA3x5R4AaABAg.AMLIXnMbnvaAMNoC_eiByr","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgzHLQ93nAOv4VA3x5R4AaABAg.AMLIXnMbnvaAMX1JpVwLz_","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgyjxkmB2dSeputoN_R4AaABAg.AML6tJ0E45AAMNYemhIaET","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytr_UgyjxkmB2dSeputoN_R4AaABAg.AML6tJ0E45AAMNb0QQaQIc","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgyjxkmB2dSeputoN_R4AaABAg.AML6tJ0E45AAMOVIlYWH28","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzSzEnMjXwN1Cr9e714AaABAg.AMKtUOKpjC-AMLpi6xesXh","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgzSzEnMjXwN1Cr9e714AaABAg.AMKtUOKpjC-AMNufewgB8h","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UgzXVBSVTIxe4C3b8eF4AaABAg.AMKtOYpkxW-AMLjKwehExN","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugxv_UEXyhKZ7R9Xii94AaABAg.AMKsrtpfL1uAMN_3a5mXTl","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
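The raw response is a JSON array of coding records, each keyed by comment id and carrying the four coded dimensions. A minimal sketch of turning it into a lookup table, assuming Python; the helper name `parse_codings` is hypothetical, and the sketch assumes every record always carries all five keys:

```python
import json

def parse_codings(raw: str) -> dict:
    """Index coding records by comment id, keeping the four coded dimensions."""
    dimensions = ("responsibility", "reasoning", "policy", "emotion")
    records = json.loads(raw)
    return {rec["id"]: {d: rec[d] for d in dimensions} for rec in records}

# One record from the response above, used as a worked example.
raw = '''[{"id": "ytr_UgzHLQ93nAOv4VA3x5R4AaABAg.AMLIXnMbnvaAMNoC_eiByr",
           "responsibility": "ai_itself", "reasoning": "consequentialist",
           "policy": "unclear", "emotion": "fear"}]'''

codings = parse_codings(raw)
coded = codings["ytr_UgzHLQ93nAOv4VA3x5R4AaABAg.AMLIXnMbnvaAMNoC_eiByr"]
print(coded["emotion"])  # → fear
```

Indexing by id makes it straightforward to join the model's codings back to the original comments, as the "Coding Result" table above does for one comment.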