Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@28:21 ERIC SCHMIDT: "By the way, I can assure you that shareholder value is destroyed when all humans are dead." This statement by Google's former CEO is so infuriating to me, because it completely ignores the fact that corporations (especially mega corporations like Google and Meta) are so blinded by short term growth in shareholder value, that they are essentially incapable of thinking long term about anything. Not about shareholder value -- and certainly not about public safety. This is embodied in the shortsighted "move fast and break things" motto of Mark Zuckerberg in 2012. Guess what? Today (March 27, 2026) Meta's shareholder value is crashing, because of the two cases decided this week where juries found that Meta prioritized short term profit over the safety of their users -- including allowing child exploitation via Facebook. Schmidt's response is completely tone deaf to that reality -- which is a direct result of the perverse growth-at-all-cost incentive of corporations and the billionaire CEOs (like Schmidt) running them. If Meta and Google cannot be trusted to protect users from being harmed by their regular products because of their shortsighted focus on profit - as two juries have now found -- how can we trust them to safely develop AI?? If we let them "move fast break things" with AI -- it may be humanity that gets "broken" . . . and then there'll be no one left to file a lawsuit after the fact.
youtube AI Governance 2026-03-27T21:0… ♥ 1
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugx4inJEc7QbwZWMEkV4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy6-TqnTkgFBQCQA914AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxYywa2HJpFLc4UQkt4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyFcwT72IdF4PnMQfx4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwAeorGgJ67NXPviat4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxoKajllodWwHLZekp4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_Ugz3_ko7IldZSdR6eqF4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy_iMPJLiyWm28hHOh4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyNAf5oaNdBufYuev94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyiiarIxlRWm_bGixV4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
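The raw response above is a JSON array with one object per comment, keyed by a comment `id` plus the four coded dimensions. A minimal Python sketch of how such a response could be parsed into a lookup table follows; the function name `parse_coding_response` is illustrative, not part of the tool, and only the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response itself.

```python
import json

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment
    records) into a mapping: comment id -> its coded dimensions."""
    records = json.loads(raw)
    # Drop the "id" key from each record; everything else is a dimension.
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

# Example with the first record from the response above.
raw = ('[{"id":"ytc_Ugx4inJEc7QbwZWMEkV4AaABAg",'
       '"responsibility":"company","reasoning":"deontological",'
       '"policy":"regulate","emotion":"outrage"}]')
codes = parse_coding_response(raw)
print(codes["ytc_Ugx4inJEc7QbwZWMEkV4AaABAg"]["emotion"])  # outrage
```

Keying by comment id makes it easy to join the model's codes back onto the original comments, e.g. to render the per-comment "Coding Result" view shown above.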