Human Laws Can't Control Killer Robots, New Report Says

That's why we need to ban them, says Human Rights Watch.

When a human being is killed by an autonomous machine, who takes the blame? The human rights organization Human Rights Watch says it is virtually impossible to tell, and that this accountability gap presents an unprecedented danger in the future of warfare.

The group released a report today showing how difficult it will be to hold commanders, operators, programmers, or manufacturers legally responsible for crimes committed by autonomous machines under current law.

The paper is the latest for the Campaign to Stop Killer Robots, an international coalition co-founded by Human Rights Watch that has been fighting the production and use of fully autonomous weapons since 2012.

Although fully autonomous weapons do not yet exist, Human Rights Watch defines them as machines that would be able to "select and engage targets without meaningful human control," and is calling for a comprehensive preemptive ban.

"If a fully autonomous weapon commits a criminal act, someone should be held accountable and if nobody could be accountable those weapons should be banned," Bonnie Docherty, lead author of the report and senior Arms Division researcher at Human Rights Watch told Motherboard.

The report, called "Mind the Gap: The Lack of Accountability for Killer Robots," was released ahead of a meeting of the Convention on Conventional Weapons (CCW) at the United Nations in Geneva, where "lethal autonomous weapon systems" will be on the agenda from April 13-17. There, experts and leaders will discuss the future and potential regulation of the machines.

The report said the weapons themselves could not be held accountable or punished for their actions because, in short, they are not human beings.

"We think this is a revolutionary change in the ways wars are fought."

"Such a robot would not fall within the 'natural person' jurisdiction of international courts," the report said. "Even if such jurisdiction were amended to encompass a machine, a judgment would not fulfill the purposes of punishment for society or the victim because the robot could neither be deterred by condemnation nor perceive or appreciate being 'punished.'"

The humans who manufacture these machines would likely not be held liable under the current legal framework either.

"In most cases, it would also be unreasonable to impose criminal punishment on the programmer or manufacturer, who might not specifically intend, or even foresee, the robot's commission of wrongful acts," the report said.

Military accountability is also insufficient, the report said, because of the immunity granted to the US military and its contractors overseas.

Michael Horowitz, an adjunct senior fellow at the Center for a New American Security (CNAS) who has co-authored papers on the ethics of autonomous weapons, disagreed. He said establishing accountability for autonomous weapons will be "a complicated task," but not impossible.

"The critical question is the level at which you talk about accountability," he said. "If we are talking about accountability for the specific act of pulling the trigger, that's more complicated. But if we're talking about accountability for the officer who commands the mission in the first place, it's not that different from what we have today––in some ways, it's possible that the introduction of lethal autonomous weapon systems would push accountability up the chain of command."

Horowitz said we need to understand the technology before we ban it.

"It's premature to talk about legislation," he said. "In thinking about the issue of lethal autonomous weapon systems, there are still too many things where we lack basic agreement, including even the definition of a lethal autonomous weapon system."

Although these weapons do not yet exist in fully autonomous form, a preemptive ban on military technology is not unprecedented. In 1995, the United Nations issued a ban on blinding laser weapons, which came into force in 1998. However, Horowitz said that ban was easier to enforce because of its specificity. He said other weapons that are now common have been considered for preemptive bans in the past, including submarines, aerial bombing, and crossbows.

"The danger of a preemptive ban would be if one defines the category so broadly that you ruled out vast sets of weapons that reduce civilian casualties in war today, or that protect ships and military bases from attack," he said. "We need a firm understanding of exactly what are the systems that are of concern before we can have a fully informed discussion about what to do."

But Docherty said that with this technology, it is too dangerous to wait and see.

"We think this is a revolutionary change in the ways wars are fought, and that nobody could be held accountable for what happens in those wars," she said. "It detaches the means of war from the human, and we have machines making life and death decisions on the battlefield. It's important to preempt it before it's too late."

The full Human Rights Watch report can be found here and will be distributed at the meeting in Geneva this week.