Compilation of open letters against autonomous weapons

Scientists around the world are speaking out against the dangers of developing lethal autonomous weapons. The following open letters voice those concerns:

  1. Open Letter from AI and Robotics Researchers
    This open letter was announced July 28, 2015, at the opening of the IJCAI 2015 conference. To date, it has been signed by 3,462 AI/robotics researchers and 18,909 others.
    “Just as most chemists and biologists have no interest in building chemical or biological weapons, most AI researchers have no interest in building AI weapons — and do not want others to tarnish their field by doing so, potentially creating a major public backlash against AI that curtails its future societal benefits. Indeed, chemists and biologists have broadly supported international agreements that have successfully prohibited chemical and biological weapons, just as most physicists supported the treaties banning space-based nuclear weapons and blinding laser weapons. In summary, we believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so. Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.”
    “As companies building the technologies in Artificial Intelligence and Robotics that may be repurposed to develop autonomous weapons, we feel especially responsible in raising this alarm. We warmly welcome the decision of the UN’s Conference of the Convention on Certain Conventional Weapons (CCW) to establish a Group of Governmental Experts (GGE) on Lethal Autonomous Weapon Systems. Many of our researchers and engineers are eager to offer technical advice to your deliberations.”
  3. Canadian Open Letter
    “In a letter delivered to the Prime Minister’s Office last week, we exhort Justin Trudeau to join an international call to ban autonomous weapons that remove meaningful human control in the deployment of lethal force. More than 200 leading AI researchers have also signed the letter, which is now open for all Canadians to sign and have their say. All who sign the letter are of the view that weaponizing AI is a very bad idea. Weapons that remove meaningful human control from target-and-kill decisions sit on the wrong side of a clear moral line. We have therefore asked Canada to support the call to ban such weapons at the UN Conference of the Convention on Certain Conventional Weapons, which convenes in Geneva on November 13.”
  4. Australian Open Letter
    “It is for these reasons that Australia’s AI research community is calling on you and your government to make Australia the 20th country in the world to take a firm global stand against weaponizing AI. Lethal autonomous weapons systems that remove meaningful human control from determining the legitimacy of targets and deploying lethal force sit on the wrong side of a clear moral line. To this end, we ask Australia to announce its support for the call to ban lethal autonomous weapons systems at the upcoming United Nations Conference on the Convention on Certain Conventional Weapons (CCW). Australia should also commit to working with other states to conclude a new international agreement that achieves this objective.”
  5. Belgian Open Letter
    “As members of the Belgian artificial intelligence (AI) and robotics research community, we express our deep concern about the development of fully autonomous weapon systems, which lack meaningful human control over the critical functions of targeting and engagement in every attack. Fully autonomous weapon systems threaten to become a third revolution in warfare. The development and use of such systems pose serious threats to international law, as well as to human rights and human dignity. Once developed, these weapon systems will lower the threshold to become involved in armed conflict, while allowing armed conflict to be fought at a scale greater than ever. The development of these systems will likely cause expensive arms races and lead to regional and global insecurity. Autonomous weapons are likely to proliferate rapidly, and could initiate or escalate conflicts without human deliberation. Moreover, the development of such weapon systems raises significant accountability questions, as it is unclear who could be held accountable for any misbehaviour of such weapon systems. Urgent action to address these concerns and prevent proliferation is needed. Once this Pandora’s box is opened, it will be very hard to close. We therefore call upon the Belgian government and parliament to join international efforts to preventively prohibit such weapon systems, and to resolve as a nation never to develop, acquire or deploy such weapon systems.”