A Former British Drone Pilot Says Killing With Robots Is ‘Utterly Wrong’
Image: U.S. Air Force photo/Tech. Sgt. Nadine Barclay


"There are things that you just can’t account for ahead of time."

A former British drone pilot who flew Reaper missions for the RAF over the Middle East has said that any push towards autonomous killing, which some consider to be the next step in drone warfare, is "totally and utterly" wrong.

In an exclusive interview with UK drone watchdog Drone Wars, Justin Thompson (a pseudonym), said, "There are things that you just can't account for ahead of time and that's why it's so important to have humans exercising their judgement."


Thompson piloted RAF Reaper drones over Afghanistan while stationed at Creech Air Force Base in Nevada, and told Drone Wars founder Chris Cole, "We cannot automate decision making when it comes to taking life. To me that's utterly, utterly wrong. And I don't actually think you can actually do it, technically or legally."

Despite Thompson's confidence in human judgement, however, drone strikes flown by human pilots have resulted in hundreds of civilian casualties, suggesting that there may be room for improvement in the targeting process. Couldn't artificial intelligence solve that?

Thompson's interview also highlights the extreme psychological pressure drone pilots face when flying missions, a strain underscored in a recently published RAND report on the mental health of remotely piloted aircraft system (RPAS) operators. That would seem to be another argument for more heavily automated systems, which could spare human operators some of that distress.

There are currently no fully automated killing systems in use on drones, nor are there any concrete plans to introduce them. The UK's Ministry of Defence has, however, publicised plans for introducing AI into warfare, namely in predictive analytics and automated planning roles.

And in May, the US military shared more information about its algorithmic warfare operation, which will see machine learning algorithms aiding investigators in identifying targets in video feeds from drones (one of many avenues the Department of Defense is pursuing with regard to artificial intelligence, deep learning, and big data). The plan has drawn criticism from opponents who argue that AI-led target selection could be subject to automation bias, with algorithms that lack human decision-making capabilities.


Information analysis is only the start of the US military's deployment of artificial intelligence, which is meant to augment human soldiers in deciding when to use lethal weaponry, and against whom. But there is resistance even within the military to developing fully automated killing machines: former US defense secretary Ash Carter pledged in 2016 that the US military would never use killer robots.

Ultimately, critics worry that machines such as drones may one day complete entire missions, including killing, autonomously, creating an accountability gap in which it is unclear who is legally responsible for a robot's actions.

Cole, who is currently taking the UK Ministry of Defence to court over its refusal to disclose drone deployment numbers, told Motherboard in an email that, despite the Ministry's insistence on keeping drone operations shrouded in secrecy, the interview with Thompson shows that those involved can and should speak much more openly about their work.

"There are serious legal and ethical questions about the use of armed drones and the thoughts of those involved in using armed drones on a day-to-day basis is very valuable," said Cole.
