Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Semi-autonomous weapons have been around since the earliest set-and-forget traps. By the 1960s, there were mines designed to detonate only when a vehicle of sufficient weight passed over them. Torpedoes in WW2 could identify and track targets by sonar. In the 1940s they were experimenting with pigeon-guided missiles. Infrared- and radar-guided missiles can select and track targets independently. By the 1980s there were cruise missiles that could self-navigate and choose between pre-selected targets. Self-targeting AI drones have existed since the 1990s. These kinds of weapons have always existed; they have just become more sophisticated. A Waymo autonomous vehicle hit and killed a pedestrian by mistake. There was an investigation and no one was held criminally liable. It was ruled an accident because the vehicle didn't recognize a person carrying a bicycle. Autonomous weapons are not going away, but there should and will be regulations to govern their use and how they detect and engage targets. Millions of civilians have been inadvertently killed during wars by other humans. Collateral damage is nothing new. Treaties are in place to prevent the deliberate targeting of civilians, but large-scale collateral casualties are considered acceptable. Hopefully, autonomous weapons, if properly designed, will be far more accurate in their targeting and will actually reduce civilian casualties. Missiles we have today can target a single individual, where we used to blow up an entire building and everyone in it to kill that person. What is really scary isn't the idea of autonomous weapons, but the fact that we, as humans, seem to have an unquenchable desire to kill each other.
youtube 2026-03-12T08:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxBYCPoY_PbhfwnAKl4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwnPH2A4jSWAE9LGQF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxLoo6dnvWzEDrh5hd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwQsqxcVFEnzIVkWEN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy-AGVatNJ0_437ti14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwhDf0FJX9s1aOfpUF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxkW3BROdaW27-RKUh4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxwxvqHvMrOdcyo3El4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy2nbChYv4spMxmcql4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzJU0NJewlDpNpC00l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
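The raw response above is a JSON array of one coding record per comment, keyed by comment id, with the four dimensions shown in the result table. A minimal sketch of how such a response could be parsed and sanity-checked, assuming the dimension vocabularies are exactly the labels visible on this page (the real schema may allow more); the names `ALLOWED` and `parse_codes` are hypothetical, not part of the actual pipeline:

```python
import json

# Allowed labels per dimension, inferred from the values visible in
# the raw responses on this page (assumption: the real codebook may
# define additional labels).
ALLOWED = {
    "responsibility": {"none", "government", "company", "distributed"},
    "reasoning": {"unclear", "mixed", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "approval",
                "resignation", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes},
    dropping any record with a missing or out-of-vocabulary value."""
    out = {}
    for rec in json.loads(raw):
        codes = {dim: rec.get(dim) for dim in ALLOWED}
        if all(codes[d] in ALLOWED[d] for d in ALLOWED):
            out[rec["id"]] = codes
    return out

# Two records copied verbatim from the raw response above.
raw = """[
 {"id":"ytc_UgxBYCPoY_PbhfwnAKl4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgxwxvqHvMrOdcyo3El4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]"""
codes = parse_codes(raw)
```

Validating against a fixed vocabulary before use catches the common failure mode where the model emits a label outside the codebook; such records are simply skipped here rather than guessed at.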