Slaughterbots Sequel Warns of Widespread Proliferation of Lethal AI Unless UN Acts at December Meeting
Original film had over 70 million views and won awards at Cannes
BOSTON, Nov. 30, 2021 /PRNewswire/ — The Future of Life Institute (FLI), a nonprofit working to reduce extreme risks from powerful technologies, has today released Slaughterbots – if human: kill(), a short film that warns anew of humanity’s accelerating path towards the widespread proliferation of slaughterbots – autonomous weapons that use artificial intelligence (AI) to identify, select, and kill people without human intervention.
Produced by Space Film & VFX and released in advance of the United Nations Sixth Review Conference of the Convention on Certain Conventional Weapons (CCW), if human: kill() follows up on FLI’s award-winning short film, Slaughterbots, to depict a horrifying scenario in which these weapons have been allowed to become the tool of choice not just for militaries but also for any group seeking to achieve scalable violence against a specific group, individual, or population.
“The recent battlefield appearance of slaughterbots means that time is running out to prevent these cheap weapons of mass destruction from falling into the hands of anyone who wants one,” said Prof. Max Tegmark, Co-Founder of FLI and AI researcher at MIT. “if human: kill() is a reminder that humanity faces an imminent choice: Ban slaughterbots, or spend decades regretting that we ruined our way of life.”
When FLI first released Slaughterbots in 2017, some criticized the scenario as unrealistic and technically unfeasible. Since then, however, slaughterbots have been used on the battlefield, and similar, easy-to-make weapons are currently in development, marking the start of a global arms race that currently faces no legal restrictions.
if human: kill() conveys a concrete path to avert the dystopian outcome it warns of. The vision for action is based on the real-world policy prescription of the International Committee of the Red Cross (ICRC), an independent, neutral organization that plays a leading role in the development and promotion of laws regulating the use of weapons. A central tenet of the ICRC’s position is the need to adopt a new, legally binding prohibition on autonomous weapons that target people. FLI concurs with the ICRC’s most recent recommendation that the time has come to adopt legally binding rules on lethal autonomous weapons through a new international treaty.
“Drawing a clear red line that AI must never decide to kill people is not only about averting gloom and doom, it’s also critical to realizing AI’s potential to transform our world for the better,” said Dr. Emilia Javorsky, who leads FLI’s efforts on autonomous weapons. “Today we think of biology as a force for good and curing diseases, which is in part due to the bioweapons ban, which prevented the weaponization of biology. Now is the time to act and safeguard a new science, AI. For the sake of our humanity and our future with AI, we have to get this right.”
Thousands of leading AI and robotics researchers and key technology stakeholders have signed an FLI-led open letter warning of the dangers of an AI arms race and calling for a prohibition on autonomous weapons systems. More than 250 organizations have signed the Institute’s pledge to not participate in or support the development of lethal autonomous weapons.
For more information, visit autonomousweapons.org.
About the Future of Life Institute
The Future of Life Institute is a nonprofit working to reduce global catastrophic and existential risk from powerful technologies, particularly artificial intelligence. Its work consists of grantmaking for risk reduction, educational outreach, and advocating for better policymaking at the United Nations and within U.S. government and European Union institutions.
The Institute has become one of the world’s leading voices on the governance of artificial intelligence, having created one of the earliest and most influential sets of governance principles: The Asilomar AI Principles.
Press inquiries and requests for film distribution may be directed to Georgiana Gilgallon, Director of Communications, at email@example.com