The widespread deployment of highly autonomous weapons—systems capable of operating with minimal human oversight—is likely to transform the future battlefield, accelerating the pace of fighting and delegating many critical battle decisions to machines.
Technologies now in development could endow machines with the capacity to search for, identify, and kill humans on the battlefield or to hunt for and destroy an adversary’s nuclear deterrent systems, possibly igniting a nuclear exchange.
We agree with a growing number of governmental and nongovernmental experts that the unregulated deployment of lethal autonomous weapons systems (LAWS) could lead to violations of the Law of War and international humanitarian law and increase the risk of uncontrolled escalation in a major-power crisis.
We call upon responsible states to promptly pursue multilateral negotiations on a legally binding instrument to ensure meaningful human control over weapons of war and over decisions to use lethal force.
For four years, signatory states to the Convention on Certain Conventional Weapons (CCW)—a treaty signed in 1980 with the aim of eliminating munitions deemed excessively cruel or injurious—have sought to assess the potential dangers posed by autonomous weapons and to consider whether new measures were needed to control them. This investigative task was entrusted to a Group of Governmental Experts (GGE), which most recently met Aug. 28-30 in Geneva.
A significant number of governments have concluded that the use of fully autonomous weapons can never be reconciled with international humanitarian law and have advocated the adoption of a legally binding ban on such munitions; others have called for a nonbinding measure incorporating some basic principles on LAWS, such as the necessity of ultimate human control; while a small minority, including two of the world’s major weapons producers, Russia and the United States, argue against any new measures regulating LAWS.
At its most recent meeting, the GGE agreed by consensus that humans should always retain ultimate control over weapons systems, but it failed to agree on a path forward other than to continue further expert-level discussions in 2019.
Given the rapid progress in autonomous weapons research and development, and with many such systems moving toward deployment, it is past time for responsible governments to act.
Current policies and practices are clearly insufficient to address the dangers posed by LAWS. The U.S. government’s guidelines, outlined in a 2012 Department of Defense directive, say such systems should allow for “appropriate levels of human judgment” over the use of lethal force, leaving open the question of what constitutes “appropriate.”
The Group of Governmental Experts, which began its deliberations in 2016, has had ample time to investigate the dangers posed by autonomous weapons. Although important technical issues regarding definitions relating to LAWS remain, we believe that the time for discussion is over and that the dangers of deploying lethal autonomous weapons have been sufficiently demonstrated to warrant the initiation of formal negotiations on meaningful control mechanisms.
The appropriate place for these to begin is at the next meeting of the CCW’s High Contracting Parties, set for Nov. 21-23 in Geneva.
We fully recognize that member states differ on what sort of limits, if any, to place on lethal autonomous weapons. But as the U.S. has argued in another negotiating forum, the Conference on Disarmament (which also operates by consensus), negotiations do not presume any particular outcome but allow for careful consideration of competing proposals.
We therefore urge the United States to act more responsibly, and we call upon all governments represented at the CCW to support the initiation of negotiations on autonomous weapons at their meeting in November and to help craft an outcome ensuring continued human control over weapons of war and over the decision to employ lethal force.