A graphic new video posits a frightening future in which swarms of killer micro-drones are dispatched to assassinate political activists and US lawmakers. Armed with explosive charges, the palm-sized quadcopters use real-time data mining and artificial intelligence to find and kill their targets. Could this be the future? We hope not, but it is something we need to confront. Be warned, this video could keep you up at night:
The makers of the seven-minute film, titled Slaughterbots, are hoping the startling dramatization will draw attention to what they view as a looming crisis — the development of lethal autonomous weapons that can select and fire on human targets without human guidance.
The Future of Life Institute, a nonprofit organization dedicated to mitigating existential risks posed by advanced technologies, including artificial intelligence, commissioned the film. Founded by a group of scientists and business leaders, the institute is backed by AI-skeptics Elon Musk and Stephen Hawking, among others.
The institute is also behind the Campaign to Stop Killer Robots, a coalition of NGOs that have banded together to call for a preemptive ban on lethal autonomous weapons.
The timing of the video is deliberate. The film will be screened this week at the United Nations in Geneva during a meeting of the Convention on Certain Conventional Weapons. Established in 1980, the convention is a framework treaty whose protocols prohibit or restrict weapons considered to cause unnecessary or unjustifiable suffering. For example, a 1995 protocol bans laser weapons specifically designed to cause blindness.
As of 2017, 125 nations have pledged to honor the convention’s resolutions, including all five permanent members of the UN Security Council — China, France, Russia, the United Kingdom, and the United States.
The Campaign to Stop Killer Robots is hosting a series of meetings at this year’s event to propose a worldwide ban on lethal autonomous weapons, which could potentially take the form of flying drones, self-driving tanks, or automated sentry guns. While no nation is openly deploying such weaponry, it’s widely assumed that militaries around the world are developing lethal weapons powered by artificial intelligence.