


Don’t Blame the Military Drone Pilots

The kill decision is now two screens away.

Opinion: Claudia Hauer teaches at the US Air Force Academy and interviewed the sources quoted here for academic purposes. Names are withheld for privacy reasons and quotes have been reconstructed.

The twenty-three-year-old pilot sat at his console in Nevada staring at his screen. He keyed the mic to the joint terminal attack controller (JTAC) in Afghanistan, who coordinated his attack orders with the ground force. "Standby 9-line. Standby. There are kids in the field of view. Confirm you copy kids?"


The lieutenant pilot and his crew watched on the silent video monitor as their target's children fluttered around him. They had been tracking the target, whose identity was unknown to them, for weeks, and the JTAC had just ordered them to prepare for a lethal strike, but the presence of the children was unmistakable. Aside from the height difference, pronounced in the long morning shadows, there was the movement itself: Afghan adults do not typically run.

The radio was silent for a few moments while the JTAC undoubtedly conferred with the ground force commander. The JTAC responded, "I copy kids. I see the kids. But when I tell you to shoot, you're gonna shoot."

The Air Force remotely piloted aircraft (RPA) program, also known as the drone program, is often criticized as a lethal video game played with innocent people. But who's really playing the game, if higher-ups can watch a live-feed via satellite and give orders to kill? The lieutenant pilot in Nevada wanted to hold his fire. It was his ground force commander, watching over his shoulder from the battlefield in Afghanistan, thousands of miles away, who took that decision out of his hands.

RPA technology has brought big changes to the way lethal strike decisions are made. Clausewitz talked about "fog of war," but maybe we should be talking about the "fog of virtual reality."

The Air Force counters the myth that drones are autonomous machines by asserting that "human beings are an integral part of the system and will continue to be the decision-makers," but that human being is not necessarily the RPA pilot. One pilot recalled years later how a ground force commander with a laptop, viewing a Reaper's live-feed and angry about a recent strike on Americans, ordered the pilot to fire on what turned out to be Afghan men and children digging an irrigation ditch.

The psychological effects of these new weapons and technology platforms on the airman, the commander, and the strategist are still unclear. Studies of post-traumatic stress disorder (PTSD) in the RPA pilot community have treated the telewarfare environment as an overall psychological factor, but we should also consider how virtual-reality technology has transformed the way lethal strike decisions are made.

RPAs are best-available technology, and if ethical restrictions on the use of best-available information technology are improbable in civilian and corporate contexts, they are even more improbable in the military, which is under enormous pressure to expand the RPA program to fight the war on terror.

In the war on terror, the most important decision is forbidden to RPA pilots: the determination of who makes it onto the target list and why. That's all top-secret. Most RPA pilots will never know the identities of their targets, or why their kills are considered worth the lives of those children in the line of fire. And now, increasingly, the decision to pull the trigger is also out of their hands.

Claudia Hauer teaches ethics at the US Air Force Academy in Colorado Springs. The views expressed here are those of the author and do not necessarily reflect the official policy or position of the United States Air Force Academy, the Air Force, or the Department of Defense.