Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We're not worried about the machines. We are worried about what the machines will be used for, and irresponsible use of them. This is one step away from the possibility of autonomous weapons. What's to stop a tyrant from sending in machines to terrorize those who dissent? Machines aren't like humans; there is no fear or second-guessing, there is only action and planning and the goal. And that's not mentioning the possibility of extreme accuracy. AP rounds would be the norm on these machines because they would be able to hit vital organs/major blood vessels over 99% of the time; cover would be useless when combined with target prediction and IR cameras. A hiding family can be shot without human morality or emotion in the equation.
Source: reddit · Topic: AI Moral Status · Timestamp: 1574796462 · ♥ 180
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  user
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_f8t5v5i", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "rdc_f8t9556", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "rdc_f8tgk2e", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_f8t5xlk", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_f8sw29c", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
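The raw response is a JSON array with one coding record per comment, each carrying the four coded dimensions plus an id. A minimal sketch of extracting the record for a single comment id (the record contents are taken from the response above; the lookup approach itself is an illustration, not part of the original pipeline):

```python
import json

# Raw LLM response: a JSON array, one coding record per comment
# (truncated here to the first two records shown above).
raw = (
    '[{"id":"rdc_f8t5v5i","responsibility":"user",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"},'
    '{"id":"rdc_f8t9556","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]'
)

records = json.loads(raw)

# Index by record id so one comment's coding can be looked up directly.
by_id = {r["id"]: r for r in records}

coding = by_id["rdc_f8t5v5i"]
print(coding["policy"])   # the coded policy stance for that comment
print(coding["emotion"])  # the coded emotion for that comment
```

Looking the record up by id rather than by list position guards against the model returning records in a different order than the comments were submitted.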