Recaps of the UN CCW meetings August 27-31, 2018

From August 27-31, 2018 in Geneva, the Convention on Conventional Weapons (CCW) Group of Governmental Experts on lethal autonomous weapons systems met at the United Nations (UN). After meeting in November 2017, the group agreed to meet again for a total of 10 days in 2018, and this was the second session of two. The following written reports by Reaching Critical Will recount highlights from each day. Statements from individual countries about their stances on lethal autonomous weapons, delivered at the meetings this week, can be found here.


Opening remarks

CCW Report, Vol. 6, No. 6: Address killer robot concerns by creating new law

As states gather for the sixth time since 2014 to discuss autonomous weapons at the United Nations, support is now rapidly growing for creating new international law to respond to the many serious challenges raised by lethal autonomous weapons systems, also known as fully autonomous weapons or “killer robots.” The Campaign to Stop Killer Robots believes it is time for states to start negotiating new international law to draw a normative line on autonomy in weapons systems.

There now appears to be convergence on the urgent need for a new ban treaty to retain meaningful human control over the critical functions of weapons systems and the use of force. States should take note of the many significant developments in the five months since the last CCW meeting.

During this CCW meeting, states should express their support for a negotiating mandate so that negotiations can begin in 2019 with the objective of adopting a new CCW protocol on lethal autonomous weapons systems by 2020. States must be explicit in stating that meaningful human control is required over individual attacks and that weapon systems that operate without such human control should be prohibited.

Day 1 

CCW Report, Vol. 6, No. 7: A simple premise: programmes should not end lives

The UN group of governmental experts on “lethal” autonomous weapon systems (AWS) resumed its work yesterday, opening with a panel discussion on potential military applications of related technologies. The objective of this week’s meeting is to recommend to the CCW high contracting parties what action to take on AWS next year. The majority of states participating in this process have indicated a preference for the negotiation of new international law, with at least 26 countries calling for a ban. In any case, Monday’s opening panel at least served to amplify the need for an urgent legal response to AWS, by illuminating once again the challenges and unacceptable risks posed by these potential weapons.

As a key point of discussion, for example, the conversation demonstrated the risks posed by the inherent unpredictability and uncertainty of AWS. Knut Dörmann, chief legal officer of the International Committee of the Red Cross (ICRC), argued that if machines can self-initiate an attack, this introduces uncertainty in terms of the location, timing, and nature of this attack. This implies significant risk that the machine will not be able to comply with international humanitarian law (IHL), e.g. in terms of distinction, proportionality, or precaution.

AWS pose significant challenges when it comes to preventing system errors; dealing with unpredictability of machines and the environments in which they may operate; and ensuring the security of AWS against cyber attacks. National-level weapon review processes have been previously suggested as a possible solution to address these concerns, but critics have continued to demonstrate that such reviews are not sufficient. Reviews must work to ensure human control over weapon systems, said Dörmann, but do not constitute human control.

The need for meaningful human control over the use of force, over decisions about life and death, or over the “critical functions” of selecting and engaging targets, has emerged as a broad consensus amongst the overwhelming majority of GGE participants. As the Austrian delegation argued, this is what states involved in this process should focus on now, rather than the technicalities of autonomy or the technology. “Human control is not an alternative,” said Ambassador Thomas Hajnoczi. “It is a must if we want to stay within established legal and ethical frameworks.” It is a requirement for conforming to IHL, he argued, and there are legal and moral prohibitions on delegating the authority to kill to machines.


Day 2 

CCW Report, Vol. 6, No. 8: Human control for human rights

Tuesday’s discussions at UN talks on autonomous weapons moved to the Human Rights Council. Appropriately, Tuesday’s session focused primarily on the concept of human control—which many governments and activists tie directly to the protection of human rights and humanitarian law. Over the course of the last five years, the belief that meaningful human control must be maintained over the critical functions of weapon systems has emerged more or less as a point of consensus amongst participating governments. The question for most states is not whether they “have a duty to control or supervise the development and/or employment of autonomous weapon systems, but how that control or supervision ought to be usefully defined and extended,” as the Swiss delegation said.

Differences of opinion remain with regard to what constitutes “meaningful” control and what limits on autonomy are required to ensure this control. On Tuesday delegations continued to examine at which stages of a weapon’s life cycle human control or intervention is necessary. In April the Chair released a pie chart (now apparently affectionately referred to as the “sunrise” diagram) indicating the potential phases in which human control could be relevant in the life of a weapon system. These phases include research and development; testing, evaluation, verification, validation, and review; deployment, command, and control; and use and abort. Some delegations have since suggested additional phases; the UK working paper for this session, for example, adds “national policies” and “battle damage assessment / lessons learned” to the beginning and end of the sunrise.

Selection and engagement of targets seems to be the most common definition of the critical functions of a weapon system that require human control. Some, like Japan, considered autonomy in selecting targets acceptable, but argued that human control is necessary to initiate an attack. Others believe humans must control both in order to ensure the protection of human dignity and compliance with international law.

Some states have also expressed concern with processes in the development stages of these weapons. Ireland expressed concern about bias in the programming of a weapon system, highlighting the potential for the perpetuation and amplification of social bias, including gender bias, at the programming stage. The International Committee of the Red Cross (ICRC) has argued that humans must maintain control over programming, development, activation, and operational phases of a weapon system, because international humanitarian law “requires that those who plan, decide upon and carry out attacks make certain judgments in applying the norms when launching an attack.”

Overall, there seems to be convergence around the idea that fully autonomous weapons would not be acceptable, because they would not be able to comply with international law or ethical frameworks. There also seems to be a majority view that a fully autonomous weapon is one that can select and engage targets autonomously, without human intervention (as distinct from, for example, armed drones, which are controlled remotely by humans). It is precisely these critical functions that most states, together with the ICRC and the groups associated with Campaign to Stop Killer Robots, believe must not be left to programming and algorithms.


Day 3

CCW Report, Vol. 6, No. 9: New law needed now

The message is clear: The vast majority of countries want to negotiate a legally binding instrument on autonomous weapon systems (AWS) next year. The states of the Non-Aligned Movement and the African Group (as groups and many individually), plus Austria, Brazil, and Mexico all staked out their position firmly during Wednesday’s debate, calling for the next meetings on this subject to contain a negotiating mandate.

But, as it goes in UN disarmament forums that operate by consensus, an overwhelming majority doesn’t mean much. The United States and Russia have said they will reject a mandate to negotiate a legally binding instrument on AWS. They’re not just opposed to new law, they’re also opposed to the development of a political declaration or a code of conduct, which have been suggested by France and Germany and supported by a smattering of other mostly European states who feel that negotiating anything legally binding is “premature”.

Without fail, those playing the premature card have been those who manufacture, use, sell, and profit—in terms of money and/or power—from the weapons that cause devastating humanitarian harm across the globe. In the case of AWS, the strongest opponents to negotiations of a legally binding instrument are those with ongoing research and development in the field. The US delegation in particular has been increasingly forceful in its defence of AWS throughout this UN process, essentially arguing that these weapons will be magical machines that will perform much better than human soldiers.

Will states stand up against this in the next two days? Will we get anything more than another ten days of talks over the next year? Will any of the governments who recognise the need for new law take the initiative to make it happen over the objections of the United States and Russia, as they did with the Treaty on the Prohibition of Nuclear Weapons, the Convention on Cluster Munitions, and the Mine Ban Treaty? Will any of the governments trying to keep the door open to develop AWS have a change of heart and start to act in the interests of humanity? Is this not a common project we could all embark upon together, to keep everyone on an even playing field—a field that does not include killer robots?

We have a chance, right now, to prevent a new technology of violence, a potentially devastating arms race, and unprecedented threats to human life and dignity. This is a rare opportunity. As the Women’s International League for Peace and Freedom said in our statement today, we must seize this moment to prevent us from becoming the worst possible version of ourselves.


Days 4 & 5

CCW Report, Vol. 6, No. 10: Effectuating our intention

The states participating in UN talks on autonomous weapon systems (AWS) finally did engage in negotiations on Thursday—but over the draft conclusions and recommendations from the meeting, not on a treaty. “Negotiation” should also be used lightly, perhaps—some delegations were so extensive in their suggestions and comments that they essentially rewrote the document. While it’s fine for states to have a go at outcome documents, it’s a bit frustrating to watch when we know that what the world needs these governments to do is negotiate a legally binding instrument to prohibit AWS.

Most of Thursday’s discussion continued to reveal major policy and political differences. The short version is this: the crux of the policy difference lies in the various approaches to human control over weapons, and the implications those perspectives have for informing next steps. It all comes down to how much control people believe they want or need to have over the use of force and over weapon systems: If you think you need to always have meaningful human control over targeting selection and the execution of force, for example, then you’re more likely to want to negotiate a legally binding instrument setting this out so that everyone sticks to the same standard. If you want to have more “flexibility,” i.e. if you want to develop weapons that can kill people without any human operators involved in the selection of a target or in firing upon that target, then you’re more inclined to want discussions to continue—preferably at a slow pace. If you fall into this latter category, you might say that you don’t think there is a common understanding of human control. Or, you might put out your own definition that is completely different from everyone else’s and insist that it’s the only thing you’ll accept.

It remains to be seen what the final version of the recommendation for next steps will say tomorrow, though it sounds like it might include a slightly strengthened mandate to focus on specific outcomes, with the options of a legally binding instrument, political declaration, and enhanced weapon review processes annexed to the mandate. A key lesson of working within the CCW is that there is always a compromise that can be made, but also that it is always made by those who want progress, not by those who want to prevent it. We may not solve this problem on Friday, but we definitely need to decide how much longer we are willing to keep accepting this arrangement—not just in relation to AWS, but for disarmament and for international relations as a whole.


Concluding remarks

CCW Report, Vol. 6, No. 11: Mind the downward spiral

Negotiations at the CCW group of governmental experts (GGE) on autonomous weapon systems (AWS) went until after 1:00 AM on Saturday morning. Unfortunately, states weren’t negotiating a treaty, but the conclusions and recommendations of the meeting. At the end of the long night, the only agreed recommendation was to continue next year in the CCW with the current mandate of exploring options for future work. The final decision about dates will be taken by states at the CCW’s annual meeting on 23 November 2018.

It was a frustrating conclusion to the fifth year of work on AWS, particularly for those of us calling for urgent action on this issue. But efforts have not been in vain. Momentum is undeniably growing for negotiations on a legally binding instrument to prevent the development and use of AWS.

The vast majority of states support commencing negotiations in 2019 on a new treaty that would prevent the development of fully autonomous weapon systems—which are defined by the majority of states as weapon systems that would operate without meaningful human control, i.e. that would not have humans controlling the use of force, or the critical functions of the machine such as the selection or engagement of targets. Of this majority, many are calling for a prohibition treaty. The Non-Aligned Movement, the largest bloc of states operating in the UN, has called for a legally binding instrument stipulating prohibitions and regulations of AWS. Austria, Brazil, and Chile collectively tabled a recommendation for a new CCW mandate “to negotiate a legally-binding instrument to ensure meaningful human control over the critical functions” of weapon systems.

We have but a short window to prevent the development of these technologies. The question for states and for all those interested in preventing this impending nightmare—tech workers, programmers, scientists, academics, and activists—is what forum is most appropriate for our actions going forward. Can we continue within the CCW, which operates on the basis of consensus? This is currently allowing a handful of countries to block progress. Should we look to the UN General Assembly or alternative multilateral forums as a better way to ensure a democratic, human security-centric approach to this vital issue?