Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Every time I see TYT do a science/tech story, I cringe, because almost to a man they treat the subject matter like 90 year olds who think cars are new-fangled technology. TYT needs a dedicated science/tech member, I've said this for a long time, and I'm willing to do the work for free so that Cenk/Ana/etc. don't look like a bunch of morons when they do these stories. When Musk and Hawking advocate not developing AI for combat purposes, they're not saying that because they think it'll spawn a Skynet-esque robot army dedicated to our destruction; they're saying it because it would just further desensitize the 1st world to the effects of war.  It may be controversial to say this, but you kinda have to put people's lives in danger on the battlefield, even if you don't necessarily have to, because the application of force should always be risky, so you don't use the force capriciously.  It's bad enough with drones, because then the operator is detached from the risk of the actions, and psychologically this makes him or her care less about the targets and who they might be, and it makes the chain of command care less about who the targets are because there's no risk to their personnel, so they can toss missiles around willy-nilly because at worst, all you're doing is killing a bunch of civilians who are powerless to stop you militarily or politically.  Using autonomous combat systems would remove even that little bit of humanity left in the system, so all you'd have to do is toss some of them out into the field and walk away, never really having to know the outcome, aside from a statistical analysis of the machine's performance. 
Musk and Hawking are advocating moving away from this possibility for a couple reasons: one, they don't want AI and war to become synonymous, they don't want the resources that go into developing AI to be wasted on something like war, because there's plenty of other problems they'd be much better at solving, and because if autonomous combat machines become economically viable (which they're certain to become if nothing changes), then war will literally become a video game, declared and fought against those who can't produce or afford disposable soldiers, and have to field a fighting force out of its own sons and daughters.  War will then become, not a option of last resort, but the first option considered against any nation other than the 1st world powers, dropped at a moment's notice in order to enforce the will of the haves against the have nots. I'll say this again, TYT DESPERATELY NEEDS SCIENCE/TECH ADVISORS, because Cenk and Ana get so wrapped up in the whole Terminator rigamarole to focus on the deeper implications and subtler nuances. They're supposed to be the hip and edgy news channel, but they turn into old fogeys when it comes to stories like this.
youtube 2015-07-30T09:2… ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugiq7KJ6T100kXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugh365DWKmrW13gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugiq02-FnzwitXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgjPJM6JnogjQ3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UggFv-a3g2noD3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UggsIQHlAlQBJHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugg_2NbNeYN8ZXgCoAEC","responsibility":"none","reasoning":"resignation","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugiz180S0BWrMXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugjz03jBITPdiXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugg1h-_yIXiDuXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
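The raw response above is a JSON array of per-comment records, each carrying an `id` plus the four coding dimensions shown in the result table. A minimal sketch of how such a response could be parsed and validated is below; the helper name and the validation logic are assumptions for illustration, not part of the actual coding pipeline.

```python
import json

# Dimension names taken from the coding result table above.
EXPECTED_DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(raw: str) -> dict:
    """Hypothetical helper: turn a raw JSON response into
    {comment_id: {dimension: value}}, checking that every record
    carries all four expected dimensions."""
    coded = {}
    for record in json.loads(raw):
        comment_id = record["id"]
        dims = {k: v for k, v in record.items() if k != "id"}
        missing = EXPECTED_DIMENSIONS - dims.keys()
        if missing:
            raise ValueError(f"{comment_id} missing dimensions: {missing}")
        coded[comment_id] = dims
    return coded

# Example using the first record from the response above.
raw = ('[{"id":"ytc_Ugiq7KJ6T100kXgCoAEC","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"mixed"}]')
coded = parse_coding_response(raw)
print(coded["ytc_Ugiq7KJ6T100kXgCoAEC"]["responsibility"])  # prints "none"
```

Indexing by comment id makes it straightforward to join the coded dimensions back to the original comment text, as this inspection view does.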