Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It’s likely already been developed by Palantir. The IDF has been working on a system called [Lavender](https://www.972mag.com/lavender-ai-israeli-army-gaza/) which they deployed in Gaza. It’s frightening. It brings together various strands of their ops in real time, for example digital tracking with live satellite imaging, and aligns that with on-the-ground movements in real time. Every human target gets a ‘score’ - when it gets high enough, there’s a missile with your name on it. If civilians knew just how insidious this technology is - because of course the enemy will soon have similar systems - they’d be calling for it to be banned. Once robotics comes into the picture, we are all toast. There is no escaping from such systems once you’re in them. This is where the sudden rush for wanting all our data becomes scary - and soon we’re all living in a Black Mirror episode.
reddit AI Moral Status 1772358558.0 ♥ 24
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          ban
Emotion         fear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_o80okli", "responsibility": "unclear",   "reasoning": "unclear",          "policy": "unclear",  "emotion": "unclear"},
  {"id": "rdc_o8120wk", "responsibility": "company",   "reasoning": "consequentialist", "policy": "ban",      "emotion": "fear"},
  {"id": "rdc_o8168xb", "responsibility": "company",   "reasoning": "unclear",          "policy": "unclear",  "emotion": "indifference"},
  {"id": "rdc_o80rh1p", "responsibility": "unclear",   "reasoning": "unclear",          "policy": "unclear",  "emotion": "mixed"},
  {"id": "rdc_o80ytmz", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
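The raw response is a JSON array with one object per coded comment. A minimal sketch of how the coding result above could be recovered from it (the `id` values and dimension field names are taken from the response shown; the lookup helper is illustrative, not part of the original pipeline):

```python
import json

# Raw LLM response, abridged to two of the records shown above.
raw = (
    '[{"id":"rdc_o80okli","responsibility":"unclear","reasoning":"unclear",'
    '"policy":"unclear","emotion":"unclear"},'
    '{"id":"rdc_o8120wk","responsibility":"company","reasoning":"consequentialist",'
    '"policy":"ban","emotion":"fear"}]'
)

# Parse the array and index the records by comment id for lookup.
records = json.loads(raw)
coded = {r["id"]: r for r in records}

# The comment displayed above corresponds to id rdc_o8120wk.
result = coded["rdc_o8120wk"]
print(result["policy"], result["emotion"])  # ban fear
```

Each record carries the four coding dimensions (responsibility, reasoning, policy, emotion), so the table for any single comment is just its record minus the `id` field.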