Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
A few issues:
1. It ignores the fact that Sam Altman is a profound liar; he has been bullshitting about the power of his company for a decade, and we are much, much farther away from AGI, or even Agentic AI than he claims.
2. We all see this coming, and the companies who have become early adopters of AI have suffered significant losses and productivity issues. The idea that AI will manage AI assumes that the AI was built by a brilliant, honest and benevolent person, because no grifter can successfully engineer a productive system. We might want AI managing AI, but we can’t figure out how to get people to effectively manage anything.
3. There are countless people who can and will profit off anyone attempting to destroy the economy. If you think Wal-Mart is going to sit back and let Microsoft destroy their customer base, then you severely underestimate their psychopathic Ozark ruthless rage. They’re just one of the many companies who will do wild things to undermine these tech ghouls, not to mention the People’s Republic of China, who could launch a nuclear weapon at Taiwan if they thought America might gain an insurmountable advantage over them using TSMC chip technology.
This is a hell of a video, but it requires a lot of leaps in order to make sense. Again, never underestimate the potential psychopathy of the average human. If people think they’re in a hopeless situation, they won’t peacefully protest and raise awareness. That, and as good as humans are at building things, we are infinitely better and smashing things 😂
youtube Viral AI Reaction 2025-12-21T23:4… ♥ 1
Coding Result
Dimension      | Value
Responsibility | company
Reasoning      | consequentialist
Policy         | liability
Emotion        | outrage
Coded at       | 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugz_qWfn7B-MLyFURgJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgylKJ3n7VfH-bcw7e94AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxfk8YzPIcggOyWS8l4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwPfsAREwx8SWjtJVl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz6usvCw4L_IgE3fEF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwD9o47tvL31Dzb6xd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzfVOgrhDBuclTawyx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzyMhcbMH-wf5iB4jh4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyIaXjJpJHamuxmtMF4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzgOuPEUtrEz5wD1ep4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
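A raw LLM response like the one above is only usable downstream if every record carries all four coding dimensions with values from the codebook. The sketch below shows one way to check that, as a minimal Python validator. The allowed value sets are assumptions inferred solely from the values visible on this page, not an authoritative codebook, and `validate_records` is a hypothetical helper name.

```python
import json

# Assumed codebook: value sets inferred from the values visible in this
# audit page only; the real coding scheme may include more categories.
ALLOWED = {
    "responsibility": {"company", "government", "ai_itself", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "virtue", "mixed"},
    "policy": {"liability", "regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval", "mixed"},
}

def validate_records(raw: str) -> list[str]:
    """Parse a raw LLM response (JSON array) and return a list of problems.

    An empty list means every record has an id and all four coding
    dimensions with values from the assumed codebook above.
    """
    problems = []
    for rec in json.loads(raw):
        rid = rec.get("id", "<missing id>")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value is None:
                problems.append(f"{rid}: missing dimension '{dim}'")
            elif value not in allowed:
                problems.append(f"{rid}: unexpected {dim} value '{value}'")
    return problems

raw = ('[{"id":"ytc_Ugz_qWfn7B-MLyFURgJ4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
print(validate_records(raw))  # → []
```

Running this over the full array before writing rows into the coding-result table would surface malformed model output (a missing dimension, or an off-codebook label) instead of silently storing it.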