
The Campaign to Stop Killer Robots Makes Incremental Progress at the UN

It overcame resistance from Russia at the last minute, starting a diplomatic process that could lead to a ban.
Image: Campaign to Stop Killer Robots/Wikimedia Commons

The killer robots of science fiction, like the Decepticons from Transformers, the T-800 Terminator, or the Cylons from Battlestar Galactica, all have one thing in common—they're designed, in part, to remind us of ourselves; giving the machines a humanoid form makes the fictional threat they present easier for an audience to comprehend. That's not likely to be the case with the real 'Killer Robots': the autonomous weapons of the near future, so-called Lethal Autonomous Weapons Systems (LAWS). They could be built by states using existing knowledge and drone technology, and used to independently neutralize military or civilian targets in a way that would be completely faceless and inhuman by design.

The Campaign to Stop Killer Robots, a Washington-based coalition of NGOs that tackles the ethical and legal problems associated with LAWS, has been pushing for new international legislation on the weapons at the Fifth Review Conference of the Convention on Conventional Weapons, held at the UN in Geneva last week. We last covered the campaign a year ago, and I recently spoke with its global coordinator, Mary Wareham, by phone. At this stage, the campaign is a long way from achieving its goal of a full ban; the best it can hope for is that the international community takes incremental steps towards more controls. Thanks to the campaign's work at the conference, a new legal instrument to regulate robot weapons now looks more likely, with a Group of Governmental Experts due to be convened next year to discuss the prospect.

The Northrop Grumman X-47B is a fighter-size drone prototype commissioned by the US Navy to demonstrate autonomous launch and landing on aircraft carriers, as well as autonomous navigation. Image: DARPA/Wikimedia Commons

According to Wareham, several of the world's developed countries already have the technological expertise to delegate the task of killing in warfare to weapons systems capable of choosing and attacking targets by themselves. The technological feasibility of LAWS, combined with what Wareham described as a legal "accountability gap" for actions committed by robots, is intensely troubling. As Wareham points out, "There are no rules of the road on this": no existing legislation specifically covers killer robots.

"We're on the borderline here, and the window is closing to do something about this."

"It's unclear who, if anyone, could be held responsible if an autonomous weapon caused an atrocity," the campaign's legal expert Peter Asaro told me in an email. "In order to commit a crime or war crime there must be intention. Robots aren't capable of intention in the legal sense, so cannot commit crimes or be held accountable for their actions—This would make it easy to cause atrocities with killer robots, without anyone being legally responsible."

Wareham agrees. "The real worry here is that we're going to have 'stupid' fully autonomous weapons before we have ones that are able to abide by existing international law," she said. "We looked at whether fully autonomous weapons would abide by international humanitarian law. We found that they don't at this time, and possibly won't in the future. Therefore, we need to come up with new laws."

Image: Motherboard

After four days of deliberation, Wareham and her colleagues helped secure an agreement for an international "Group of Governmental Experts" (GGE) to be set up at the UN to consider the issues surrounding LAWS. The decision took a lot of diplomatic wrangling. Russia, which still opposes an outright ban and was previously skeptical of international cooperation on LAWS, could have stalled the process by opposing the formation of a GGE. Fortunately, it eventually chose to abstain, allowing the agreement to pass. Even so, it could be years until any effective legislation is signed. Developed nations like the US have already begun to research and implement elements of autonomy in defense systems, and it's possible that many other nations will develop their own LAWS before international rules to control their use are formulated.

"We're on the borderline here, and the window is closing to do something about this," Wareham said.

One of the main reasons some states are interested in developing robot weapons is economic: drone-type weapons systems are already cost-effective, and, in theory at least, LAWS offer the added benefit of being deployable without the expense of training personnel to operate them. Given how easily robot weapons could be used to violate international law, there's also a disturbing possibility that they could be deployed in situations where regular troops would be unwilling to carry out orders for reasons of conscience.

"In 2012 we listed six countries that we believed were investing heavily in autonomous weapons: the US, UK, South Korea, Israel, China, and Russia," Wareham said. "More recently, we've reported at least another 10 that were investing in these kinds of weapons, although they're not yet fully autonomous."

"It's still possible to progress, and move forward in these challenging times, even when everything has become so polarized."

Even with the real possibility of states disregarding legislation to control LAWS, the seemingly inevitable adoption of autonomous elements in weapons systems, and the slow pace at which the UN is able to come up with new legal controls, Wareham is confident that the campaign is helping. There was some encouraging news even before the UN decision on December 16: nineteen countries have now expressed their support for an outright ban on LAWS, and a further 78 have publicly made their views on the subject known so far, indicating that more international discussion is taking place.

On July 28, 2015, academics and tech-industry figures including Stephen Hawking, Noam Chomsky, and Elon Musk signed an open letter from the Future of Life Institute stressing the risk that autonomous weapons pose to humanity. More recently, on December 8, nine House Democrats, led by Congressman Jim McGovern, submitted a letter to the US government calling for a preemptive ban on the technology.

The US Counter Rocket, Artillery and Mortar (C-RAM) system can automatically destroy incoming artillery, rocket, and mortar rounds. Image: US Army

In spite of the growing discussion and condemnation of LAWS internationally, it's debatable how much influence letters and petitions will have on how states develop or use these weapons. The agreement to set up a Group of Governmental Experts is by far the most significant step yet toward actual legal change around the manufacture and use of LAWS. When the newly formed GGE meets in 2017, it will be able to start setting a timetable for laws and treaties to curb the spread of the weapons internationally. When I caught up with Wareham after the UN decision, she was happy with the progress that had been made.

"This was the biggest hurdle at this stage in the campaign and we've just jumped over it," she said.

Ultimately, it's going to be down to national leaders and diplomats from countries like the US, UK, Russia, and China to listen to public opposition, form an agreement, and turn that into international law. That's the most remarkable thing about the agreement to set up the GGE—the UN found consensus with everybody in the room, including Russia.

"It showed that multilateral diplomacy is not dead, and it's still possible to progress, and move forward in these challenging times, even when everything has become so polarized," Wareham said.