AI researchers call upon new German government to back autonomous weapons treaty
November 1, 2021

In today’s Frankfurter Allgemeine Zeitung (@faznet), Germany’s leading AI researchers urge politicians to initiate an international treaty on autonomous weapons that target humans.

Below you can read the letter translated into English:

Open letter: Initiative for an international agreement on autonomy in weapons systems

 

We are deeply concerned about weapons systems that select and apply force to targets without meaningful human control. We call on the German government to take a leading role internationally in developing a legally binding framework for autonomy in weapons systems. In doing so, we join similar initiatives voicing concern by thousands of international experts in AI and robotics, by CEOs, and by scientists in Australia, Belgium, Canada, Norway and the Netherlands.

We believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so. As with all technological developments, AI and robotics can have positive and negative applications. These transformations, actual and potential, therefore demand our understanding and, increasingly, our heightened moral attention.

Autonomy in weapons systems carries great ethical, security and legal risks. These risks have been made public for many years and have been discussed in multiple forums, amongst others at the United Nations in Geneva. Our concern lies with the loss of human control over the use of violence. Humans should not be targeted by autonomous systems. Similarly, life and death decisions should not be delegated to an algorithm. 

The unregulated use of these weapons systems would pose a serious threat to international law, as well as to human rights and human dignity. The development of these systems is likely to cause an arms race and lead to regional and global insecurity. Autonomous weapons are likely to proliferate rapidly, and could initiate or escalate conflicts at machine speed and without the possibility of human restraint. Moreover, the development of such weapons systems raises significant accountability questions, as it is unclear who could be held accountable for violations of international law committed by such systems.

Just as most chemists and biologists have no interest in building chemical or biological weapons, most AI researchers have no interest in building AI weapons, and do not want others to tarnish their field. Autonomous weapons could create a major public backlash against AI that limits the future societal benefits of our field. Indeed, chemists and biologists have broadly supported the international agreements that have successfully prohibited chemical and biological weapons.

It is now time for the new German government to act upon our concerns and to regulate autonomy in weapons systems. Once this Pandora’s box is opened, it will be very hard to close.

In the last coalition agreement, the outgoing government already committed to building an international norm against autonomous weapons operating without meaningful human control. For the new coalition agreement, we ask for a more specific goal:

Germany must take a leading role in developing a new treaty, legally binding under international law, to regulate weapons systems with autonomy in their critical functions, that is, weapons systems that select and engage targets without human intervention. The treaty should stipulate the retention of meaningful human control in the use of these weapons systems and ban outright those autonomous weapons systems that specifically target humans or that, by design and in use, cannot ensure that meaningful human control is retained.