Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Steven, while I appreciate you platforming Stuart Russell and his warnings about AGI existential risks, there's a glaring irony here. You're both "winners" in the current system — you with your multi-million-pound podcast empire built on viral clips and hustle-culture branding, him with prestige, grants, and influence inside the very academic-tech complex racing toward superintelligence. The people who profit most from lecturing the masses about impending doom are often the ones with the strongest incentives to keep the status quo intact. Real solutions (pausing the capability race, democratising development, fundamentally rethinking profit-driven AI) would threaten your relevance, your speaking fees, your views, and the entire attention economy you both thrive in. Talking about human extinction while monetising the fear of it feels less like leadership and more like dinosaurs giving TED talks about the asteroid — alarming, eloquent, but ultimately invested in a world that’s already ending. If we're serious about survival, maybe the first roles AI should replace aren't truck drivers... but professional doom narrators who get rich warning us without risking their place in the game.
youtube AI Governance 2025-12-24T18:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       virtue
Policy          regulate
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugw0l_heyAXQ4x4FK-h4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzzdAWLM_iNJQNz6Qd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyH_dg1fGqLFDFVQTl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwyctSnsREPJuVM0N94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzeI9Ir6B8I91SBAc54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_Ugymf4vN6oRrYuoUotx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyCM-AiwL-MnJNyXoR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzS8rM0MT_xe2t1KJV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwiJrC4XFYYkZopzmB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyPzcOARjovWMmxcox4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
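Because the raw LLM response is machine-generated JSON, it is worth validating each row before accepting the coded dimensions. The sketch below checks that every row has an id and uses only values observed in the output above; this is an assumption, since the tool's actual codebook may permit additional values, and the function name `validate_rows` is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the raw response above.
# ASSUMPTION: the real codebook may define more values than these.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"virtue", "consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "approval", "mixed", "resignation", "indifference"},
}

def validate_rows(raw: str) -> list:
    """Parse a raw LLM response and return a list of validation errors."""
    errors = []
    rows = json.loads(raw)
    for i, row in enumerate(rows):
        if "id" not in row:
            errors.append("row %d: missing id" % i)
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                errors.append("row %d: unexpected %s=%r" % (i, dim, value))
    return errors

good = '[{"id":"ytc_x","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"}]'
bad = '[{"responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"outrage"}]'
print(validate_rows(good))  # → []
print(validate_rows(bad))   # → two errors: missing id, unexpected responsibility
```

A check like this catches the common failure modes of JSON-mode LLM output (missing keys, values outside the coding scheme) before the row is stored as a coding result.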