The military personnel's call for a ban on autonomous weapons


Fully autonomous weapons are weapon systems that can identify and fire on targets without a human controlling them. Unlike armed drones, which remain under human control, these machines would decide whether or not to kill without human intervention. That decision would not draw on the skills, knowledge, intelligence, training, experience, humanity, morality, situational awareness, and understanding of the laws of war and international humanitarian law that men and women in uniform bring to such decisions in battle. A machine would determine whether a target was a combatant based purely on programming, likely developed in a sterile laboratory years before the decision to kill was made. Delegating life-and-death decisions to a machine is morally, ethically, and legally flawed.

No country has yet fielded fully autonomous weapons, but they are under development in a number of countries. Now is the time to put a stop to their development and eventual deployment.

Some argue that these weapons are necessary and inevitable. Among the arguments is that they would improve the survivability of servicemen and servicewomen. That might be the case if the enemy did not have similar weapons, but if one side has them, soon the other will too.

We are told that machines do not have human frailties: they do not tire, they do not get angry, they are not affected by weather or darkness to the extent that people are, and they know no fear, and that this makes them superior to a soldier. Machines do not have these frailties, but neither are they responsible or accountable for their decisions; they could and would attack with impunity. We believe weapons with these characteristics should be banned in accordance with existing international humanitarian law.

Technological advances in robotics are already assisting soldiers in areas such as the detection of explosive devices, search and rescue, and some engineering tasks. However, many in uniform, both serving and retired, are seriously concerned about the prospect of assigning decisions on whether, what, and when to kill to machines. Autonomous weapons are not accountable for their actions, and there is serious doubt, particularly in asymmetric warfare, that machines can reliably discriminate between targets that may lawfully be engaged and those that may not.

As Soldiers, Sailors, Airmen, and Airwomen, both serving and retired, we join the call for a ban on the development, deployment, and use of weapon systems in which the decision to apply violent force is made autonomously.

- John MacBride, LCol (Retd) 

