Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I mean realistically... why not? The aversion to them is entirely based on works of fiction. If we are going to have weapons, and it seems like we will continue to live in a world where we will, why not remove the human element as much as possible? Humans are great at a lot of things but we are entirely prone to errors and clouds of judgement from emotions. Humans will do a lot of stupid things. Autonomous weapons are only as weak as we make them. I would rather put a weapon in the hands of something objectively running purely on data and sensors than in the shaky hands of a mind clouded by fear or rage. I mean the aversion seems entirely psychological no? Are we not already relying on computers making decisions from everything like weapon systems targeting, flight itself, etc? We already fully rely on weapons controlled by computers making thousands of decisions in the background. How is it any different to go all the way when we are already trusting computers to make like 80% of decisions for us? The human element is simply an illusion of control.
Source: reddit · Viral AI Reaction · 1776891754.0 · ♥ 4
Coding Result
Dimension       Value
Responsibility  none
Reasoning       utilitarian
Policy          none
Emotion         approval
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_ohvwc3g", "responsibility": "none",      "reasoning": "unclear",         "policy": "none",     "emotion": "indifference"},
  {"id": "rdc_ohpbljd", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"},
  {"id": "rdc_ohpi3ky", "responsibility": "none",      "reasoning": "deontological",    "policy": "none",     "emotion": "fear"},
  {"id": "rdc_ohprhgl", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "rdc_ohsk2ob", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"}
]
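A raw response like the one above can be parsed into per-comment coding records before display. The sketch below is a minimal, hypothetical illustration: the dimension names (responsibility, reasoning, policy, emotion) come from the coding schema shown here, but the helper name `parse_codings` and the validation rule (drop records missing any dimension) are assumptions, not the tool's actual implementation.

```python
import json

# Abbreviated raw LLM response in the same shape as the one shown above.
raw_response = """[
  {"id": "rdc_ohvwc3g", "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"},
  {"id": "rdc_ohsk2ob", "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"}
]"""

# Dimensions of the coding schema (from the results table above).
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(text):
    """Parse model output, keeping only records that carry every dimension."""
    records = json.loads(text)
    return {
        r["id"]: {d: r[d] for d in DIMENSIONS}
        for r in records
        if all(d in r for d in DIMENSIONS)
    }

codings = parse_codings(raw_response)
print(codings["rdc_ohsk2ob"]["emotion"])  # approval
```

Keying the result by comment `id` makes it cheap to look up the coding for any single comment when rendering a page like this one.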