Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Actually, for the last 15-20 years or so the major countries have been developing missiles that DO make decisions. Various missiles have a mode where you launch one into an area, giving it only GPS coordinates, and the missile then looks for anything threatening and takes it out. The "problem" some have is that our autonomous target detection systems are not perfect. In fact, one test a decade ago involved a missile launched at an area containing a target ship (this same missile has anti-vehicle and anti-ground capabilities). Some freighter had ignored the "DO NOT BE HERE!" warnings broadcast on the radio. The missile identified the freighter as a cruiser and slammed into the cockpit. Luckily, the warhead was a dummy test warhead, because they wanted to see where it would choose to hit, so there were only minor injuries.

In essence, though, because these systems are less than perfect, there are people who consider them as indiscriminate as land mines and the like.

Regardless of all of this, it is impossible to stem the tide of drone research. The various militaries have already had their first taste of the advantages these devices can give them. Drones have the additional advantage of leaving no evidence that they are being developed. At least with nukes, everybody knows when someone tests them. And you've seen how effective we are at banning nukes.

In the end, though, autonomous systems CAN actually be better than humans, or at worst as good as humans. There is a paper I am writing on this topic, where the idea is that something we could implement might be thought of as a "lethal reverse Turing test". Let's say you have a robotic tank; it sees something it identifies as a threat and shoots. Turns out it was a little girl with a broom. Luckily, with the drones, instead of claiming racism and such, you can get its logs, so you can see EXACTLY why it made the decision it made. Now, you take that data and you set up a simulator where a human pilot can receive the same i
reddit AI Governance 1428581436.0 ♥ 124
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_f1enc3n", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",    "emotion": "approval"},
  {"id": "rdc_cq6g4qz", "responsibility": "ai_itself", "reasoning": "unclear",          "policy": "unclear", "emotion": "mixed"},
  {"id": "rdc_cq6fvcy", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",    "emotion": "indifference"},
  {"id": "rdc_cq6o7cd", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "rdc_cq6g17e", "responsibility": "developer", "reasoning": "deontological",    "policy": "unclear", "emotion": "fear"}
]
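The raw response is a JSON array with one coding object per item, each carrying the same four dimensions shown in the table above. A minimal sketch of how such output might be tallied per dimension (how the pipeline actually consolidates runs into a single code is not shown here and may differ; `tally` is a hypothetical helper):

```python
import json
from collections import Counter

# The five per-item codes exactly as they appear in the raw LLM response above.
raw = '''[
  {"id": "rdc_f1enc3n", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",    "emotion": "approval"},
  {"id": "rdc_cq6g4qz", "responsibility": "ai_itself", "reasoning": "unclear",          "policy": "unclear", "emotion": "mixed"},
  {"id": "rdc_cq6fvcy", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",    "emotion": "indifference"},
  {"id": "rdc_cq6o7cd", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "rdc_cq6g17e", "responsibility": "developer", "reasoning": "deontological",    "policy": "unclear", "emotion": "fear"}
]'''

codes = json.loads(raw)

def tally(codes, dimension):
    """Count how often each label appears for one coding dimension."""
    return Counter(c[dimension] for c in codes)

for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(dim, dict(tally(codes, dim)))
```

Note that the consolidated values in the table above do not all match a simple plurality of these five codes (e.g. "policy" is unclear in three of five runs but coded "none"), so whatever aggregation rule the pipeline applies, it is not plain majority vote.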