Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Dr. Yampolskiy brings up Dune but skips the juicy part. Dune isn’t just desert politics in space. It’s set in our own universe after humans built a fully automated world run by AI and computers… until the machines took over. That ended in a brutal war we barely won. After the ban on machines, people turned to spice, a drug that let their minds work almost like computers. The story of Dune kinda can be seen as a warning of what happens when we lean too hard on tech and what oligarchs will do to keep power. Some versions of Dune's backstroy say AI turned dangerous because humans themselves gave it the drive to survive. A simple experiment spiraled out of effin control. Maybe Dune is less fantasy and more parable about our own reality. And this is why sci-fi matters! Orwell’s 1984 played with ideas of telescreens and mass propaganda. Star Trek gave us flip phones. The Diamond Age gave us e-readers. Clarke gave us HAL 9000. Asimov gave us Multivac. Sci-fi holds up a mirror showing us the social impact and consequences of tech before we live them. And the robots our scientists and engineers are building realllllly don't need to like humans. That’s just to make us feel comfortable, feel less alien and more acceptable and fun.
Source: youtube | AI Governance | 2025-09-05T14:0…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       deontological
Policy          ban
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxE9ChmCVCR7RIbItN4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgxWGdFmhoz6SCLASMx4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyMiweQ6vzFHGXxD3Z4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "ban",       "emotion": "outrage"},
  {"id": "ytc_UgwYGWoTRD95x74qJW14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgxuZwB1avMpfkJUD_54AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgxUoEh98snuKDkbr_h4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgwT2-16VdP8sLn0na94AaABAg", "responsibility": "ai_itself", "reasoning": "deontological",    "policy": "ban",       "emotion": "fear"},
  {"id": "ytc_UgxrHv1I9E3pGyRACl54AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none",      "emotion": "mixed"},
  {"id": "ytc_UgwyCqqjAGn-EcQDDyl4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxA2ZSd7emkPKsh3Up4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"}
]
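The raw response above is a JSON array of per-comment codes across four dimensions. A minimal sketch of how such output could be parsed and validated before ingestion (the `SCHEMA` vocabularies below are inferred only from the labels visible in this one response; the actual codebook may define more):

```python
import json

# Label vocabularies inferred from this response (hypothetical; the real
# codebook may allow additional values per dimension).
SCHEMA = {
    "responsibility": {"user", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability", "ban", "regulate"},
    "emotion": {"approval", "outrage", "fear", "resignation", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgwT2-16VdP8sLn0na94AaABAg","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"ban","emotion":"fear"}]')
codes = validate_codes(raw)
print(codes[0]["policy"])  # → ban
```

Failing loudly on an out-of-vocabulary label is a deliberate choice here: LLM coders occasionally emit near-miss labels (e.g. a misspelling or an invented category), and silently storing them would corrupt downstream tallies.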