Recaps of the UN CCW meetings April 9 – 13

From April 9-13, 2018, the Convention on Certain Conventional Weapons (CCW) Group of Governmental Experts on lethal autonomous weapons systems is meeting at the United Nations (UN) in Geneva. After meeting in November 2017, the group agreed to meet for a total of 10 days in 2018; this is the first of two sessions. The following YouTube videos and written reports by Reaching Critical Will recount highlights from each day. Statements from individual countries about their stances on lethal autonomous weapons, delivered at the meetings this week, can be found here.

An opening statement from Mary Wareham, coordinator of the Campaign to Stop Killer Robots: CCW Report, Vol. 6, No. 1: Time is running out

Day 1 

Video recap:

CCW Report, Vol. 6, No. 2: One coin, two sides

The Group of Governmental Experts (GGE) on Lethal Autonomous Weapons (LAWs) had a strong start to its resumption of discussions on Monday. Since the GGE last met in November 2017 a number of working papers have been submitted to inform discussions this week.

As is often said in the context of developing new agreements, however, the devil is in the detail, and some details expressed today indicate potentially significant differences in perspective that will need to be reconciled. These touch on outcomes for the GGE as well as on understandings of key concepts, including whether or not shared understandings need to be reached at all. Indeed, this is an issue area where it is easy to go down technical rabbit holes and lose sight of the way forward, or to overcomplicate the discussion. Some of these divisions mirror dynamics in other disarmament and arms control fora, pitting states with larger or more sophisticated military capabilities against those that lack them and are most impacted by arms and conflict. As Human Rights Watch stated today, it is noticeable that some of those preaching caution are the ones rushing to develop increasingly autonomous technology.

Developments over the last few years have shown the strength and power that “smaller” states possess when they work together with common vision, and this can also be true for moving forward to meaningfully and effectively prevent the development of autonomous weapon systems. It was also pointed out today that calling for meaningful human control and calling for a prohibition are really two sides of the same coin: a unifying reminder.

Ambassador Amandeep Singh Gill of India, who chairs the Group, opened by reminding delegates of the agreed outcomes from the previous session, including the applicability of international humanitarian law (IHL) to the issue of autonomous weapons, among other points.


Day 2 

Video recap:

CCW Report, Vol. 6, No. 3: Still in pursuit of the unicorn

In the course of the thematic debate about how to characterise, or define, autonomous weapon systems, quite a lot of ideas emerged. A small group of states maintains that autonomous weapon systems do not exist, while among most others there is a general sense of wanting to curtail their further and future development.

Pockets of agreement are developing around several significant angles of this topic: Is a working definition useful or needed? How should one be developed? What should it include?

The Chair, Ambassador Gill, has helpfully produced resources to guide the discussion, including a compilation of existing definitions, and he often draws out similarities between governments’ statements when providing a summary. He has also introduced a framework for organising suggestions and approaches that is useful, but it does not always align neatly with how states organise and present the same information, creating a bit of a “round peg in a square hole” effect.

One new development since November is that Switzerland, Ireland, Austria, Sweden, and Canada have started to view the term “lethal” as unhelpful and possibly problematic. Estonia would like to consider this further, and Germany indicated that it does not want to limit a definition to only those autonomous weapon systems that are lethal.


Day 3

Video recap:

CCW Report, Vol. 6, No. 4: Lethal [autonomous] weapon

A widening group of states is questioning whether “lethal” is an adjective or qualifier that should continue to precede the term “autonomous weapon system”. On Wednesday, Poland joined others that made this point earlier in the week. One reason for removing the word is, as Switzerland noted, that keeping it overlooks autonomous weapons that do not necessarily inflict death but do cause physical injury. The International Committee of the Red Cross (ICRC) noted that it is not lethality but the use of force that triggers legal obligations under international humanitarian law (IHL) and international human rights law (IHRL).

Wednesday took a hard focus on the “human element” in the use of lethal force. All governments are united in recognising the importance of human control, but views differ, sometimes significantly, on the point in a system’s life cycle at which it is needed and at what threshold. The GGE chair distributed visual aids to map out trends from Wednesday’s discussion, including a diagram outlining a spectrum of autonomy and a list of the words states use most frequently to describe thresholds of human control, such as “meaningful” or “sufficient”. This enabled a more detailed discussion, including examples of real and hypothetical situations that developed in a natural way; a few states, however, felt that this was distracting and made things overly complicated. China urged re-focusing discussion away from these questions and, somewhat ironically for the purposes of this editorial, back to the question of lethality.

Days 4 & 5

Video recap of Day 4:

Video recap of Day 5:

CCW Report, Vol. 6, No. 5: Choices

Discussions at the Group of Governmental Experts (GGE) meeting this week made it very clear that the concept of human control is firmly at the heart of the debate over what to do about killer robots. No state supports a weapon that operates entirely without it, particularly in decisions about taking human life. The clear take-away message is that this is the right thing to do, for ethical, moral, and legal reasons that have been enumerated by so many voices in government, academia, the military, science, and elsewhere, and that were repeated at this GGE meeting.

The international community has made some progress in articulating norms for behaviour in cyberspace, but it is widely agreed that the gap between law and digital capability is growing rapidly, and it may never close. There is a very real risk that the same will happen with autonomous weapons if decisive action is not taken quickly. To stay ahead of the curve, the opportunity of this GGE and its next meeting must be maximised in order to set the course for a swift policy response at the end of this year.

We could, of course, meet again in five or ten years to discuss how to react to the use of autonomous weapon systems, their proliferation, and their inevitable “misuse”, as we do with virtually all other weapon types. Wouldn’t it be a welcome change, however, to get it right this time and not have to have that conversation? This is a unique opportunity to learn from past mistakes and do better in future.