The Second Amendment Case for the Right to Bear Crypto
Considering encryption under the Second Amendment helps clarify whether, and how, the individual right to it should be limited.
Senate Commerce Committee debates encryption, 1997. Photo: Douglas Graham/Congressional Quarterly/Getty Images
On November 9, 1994, an American software engineer named Philip Zimmermann was detained by customs agents in Dulles International Airport as he returned from a speaking engagement in Europe.
His luggage was searched and he was interrogated at length regarding his possible illegal export of "dangerous munitions."
Though Zimmermann was carrying no guns, bombs, or chemical agents, he was carrying one item considered a weapon in the eyes of the US government: the strong cryptographic software of his own making known as "Pretty Good Privacy," or PGP.
While today it may seem surprising that software like PGP was ever considered a weapon, the US government has long viewed strong crypto—typically any encryption mechanism that cannot be bypassed efficiently—as a dangerous technology in civilian hands.
Legal proceedings over the right to encrypt have been largely inconclusive
Legally, in fact, the right of individuals to strong cryptographic technology has never been affirmed, even as privacy and surveillance concerns have prompted companies like Google, Apple, and, more recently, WhatsApp and WordPress to encrypt their devices and platforms by default.
Thanks to the "Crypto Wars" of the 1990s, legal scholars have debated the ways in which cryptographic research and technology might qualify for constitutional protection. Typically, however, these reflections have focused on interpretations of the First Amendment and the Fifth Amendment: the First, through the reasoning that code is speech, and the Fifth for its particular protection of the "liberty" to pursue one's chosen profession.
Yet the federal government's own decision to regard encryption technology as a weapon seems to suggest another constitutional lens: the Second Amendment, via the "right to bear arms."
In the United States, restrictions on non-governmental uses of cryptography go back to at least 1977, when a member of the National Security Agency sent a letter to the IEEE warning that some of the material to be presented at a Cornell cryptography conference might run afoul of weapons export regulations.
At the time, even conceptual descriptions of strong crypto appeared on the Munitions List of the International Traffic in Arms Regulations (ITAR), which govern US weapons exports.
In fact, it was not until almost two decades later that the US began to move some of the most common encryption technologies off the Munitions List. Without these changes, it would have been virtually impossible to secure commercial transactions online, stifling the then-nascent internet economy.
"The Second Amendment extends, prima facie, to all instruments that constitute bearable arms, even those that were not in existence at the time of the founding."
Additional regulatory changes in the early 2000s further relaxed the export restrictions, making the use of PGP and other open-source software legal for individuals to transport and use. But legal proceedings over the right to encrypt have been largely inconclusive.
For example, the three-year investigation of Zimmermann was eventually dropped, without explanation. And while researcher Daniel Bernstein secured a win against the Department of Justice in the Ninth Circuit in 1999, a series of legal technicalities, along with a significant change to ITAR, meant that the case was ultimately dismissed in 2003 without being decided.
Meanwhile, "Software (including their cryptographic interfaces) capable of maintaining secrecy or confidentiality of information or information systems" remains on the ITAR Munitions List today—and the export of more sophisticated encryption software is still subject to both government oversight and a complex licensing process.
So what are the chances of encryption technologies being viewed as a "bearable arm" under the Second Amendment?
"It's an interesting argument," says Mike McLively, a staff attorney at the Law Center to Prevent Gun Violence and a Second Amendment expert. "You can make the case, certainly."
Doing so is no simple task, however. According to McLively, 94 percent of the more than 1,000 Second Amendment infringement suits brought since the landmark District of Columbia v. Heller case in 2008 have been rejected. As always in law, details matter.
"It would depend on which technology you're talking about," says McLively. "Is it on everyone's phone, for example? Is it commonly used for self-defense, not just to defend your information? I think that the Court would be more inclined to say that the Second Amendment is for protecting your physical person."
Of course, for the millions of people who currently encrypt their phones, adequately protecting the sensitive data they contain is absolutely a form of physical protection.
A weakly encrypted phone, if lost or stolen, is a Pandora's Box of dangerously personal information: names, addresses, contacts, and photographs, to say nothing of the detailed calendar and appointment information that can act as a map to an individual's daily activities, or those of their children.
Likewise, with the increasing prevalence of app-based "smart devices" for the home, including security systems, a poorly encrypted phone becomes, in effect, a remote-controlled guide to doing its owner physical harm.
Typically, of course, the "arms" considered for Second Amendment protection have been traditional firearms. Last month, however, in Caetano v. Massachusetts, the Supreme Court rejected the state court's ruling that the plaintiff had no Second Amendment right to a "stun" gun, in part because it was "a thoroughly modern invention."
In a per curiam decision, the Supreme Court sent the case back to the state with the forceful assertion that "the Second Amendment extends, prima facie, to all instruments that constitute bearable arms, even those that were not in existence at the time of the founding."
Even this inclusive view of the Second Amendment, however, does not preclude limitations on the types of weapons that individuals can use. As McLively points out, "The Second Amendment doesn't protect the right to pick the gun we want."
Does that mean that the government could eventually mandate the use of only the "smart" guns described in President Obama's executive order earlier this year? Could the right to bear encryption be limited to systems that provide "exceptional access" for law enforcement, as the recent Burr-Feinstein bill would require?
"If it's reliable and it works," says McLively, "all the government would have to do is show that access to only smart guns is just as respectful of self-defense."
Unless online banking is suddenly outlawed, the "dangerous" uses of strong encryption are somewhat beside the point
The problem, of course, is that to date neither smart guns nor "exceptional access" encryption technologies have been able to meet that challenge of being "just as respectful of self-defense."
Despite decades of research and development on both fronts, smart guns "can and will be jailbroken," as Ars Technica co-founder Jon Stokes put it in an LA Times op-ed in January, and encryption that allows access for law enforcement is no longer truly protective. A group of prominent computer security specialists—among them Whitfield Diffie, co-inventor of a key-exchange protocol essential to securing internet connections—recently detailed why "exceptional access" to encryption technologies is no more tenable today than it was 20 years ago.
Considering crypto under the Second Amendment is more than a semantic trick. It can help shed light on whether—and where—appropriate limits on individuals' right to encryption should lie.
Traditional firearms and encryption technologies both have the capacity to protect and to destroy, and each can be put to lawful or criminal use.
At SXSW this year, President Obama asserted that technologies like strong encryption "can empower folks who are very dangerous to spread dangerous messages," and there is little doubt that encryption technologies, like guns, can be used in dangerous ways.
But of course most of the technologies protected by the Second Amendment are inherently dangerous—otherwise they wouldn't be much good for self-defense. Sensibly, then, the test for whether or not a weapon is protected by the Second Amendment does not rest on whether or not it is dangerous.
Instead, it is the dominant application of a technology that affects its eligibility for constitutional protection. As Justices Alito and Thomas wrote in their Caetano opinion, "the relative dangerousness of a weapon is irrelevant when the weapon belongs to a class of arms commonly used for lawful purposes." So unless online banking is suddenly outlawed, the "dangerous" uses of strong encryption are somewhat beside the point.
Whether "weapons of offense, or armor of defense"—whether firearms or encryption technologies—the Second Amendment "extends...to all instruments that constitute bearable arms."
In the fight to protect ourselves from having our medical records altered through identity theft or our physical whereabouts tracked from a stolen phone or laptop, the stakes can truly be life or death. When it comes to self-defense in the digital age, strong encryption is the only weapon we have—we need to protect it.