Today in Geneva the CCW (Convention on Certain Conventional Weapons) Meeting of Experts on Lethal Autonomous Weapons Systems got underway. The primary purpose of the meeting is to develop recommendations regarding potential controls on lethal autonomous weapons systems (LAWS) to be considered by the Fifth Review Conference, a regularly scheduled meeting of parties to the CCW that will occur in December 2016.
Lethal autonomous weapons systems are, in effect, robots programmed to kill without direct human guidance. Just as self-driving cars have advanced to the point where no human interaction is required beyond identifying a destination, weapons systems are now at the point where a mission, including the use of weapons, can be programmed by humans and then executed by the machine without further human interaction. Clearly there are many technological, political, legal, and ethical questions to be considered in the face of such an important development.
There is no shortage of scholars, NGOs, IGOs, and governments interested in weighing in on lethal autonomous weapons and their implications. A range of papers and presentations prepared for this week's Meeting of Experts is available on the UNOG (United Nations Office at Geneva) website here. The International Committee of the Red Cross (ICRC), a key contributor to international humanitarian law, has weighed in with an argument for preserving human responsibility in decisions to kill. The Campaign to Stop Killer Robots, a project that brings together a number of NGOs including Human Rights Watch and the Pugwash Conferences on Science & World Affairs, has articulated a similar position.
There are many reasons to hope that the Fifth Review Conference in December will be able to make progress toward a ban on fully autonomous lethal weapons systems.