[Motherboard has recorded a podcast discussing the technical aspects of the FBI order and Apple's potential First Amendment legal defense. "Radio Motherboard" is available on all podcasting apps and on iTunes.]
In the days since a federal judge ordered Apple to help the FBI brute-force its way into the encrypted iPhone belonging to one of the two terrorists who killed 14 people in San Bernardino, there have been lots of questions: Is it technologically possible? Would any new, FBI-endorsed firmware work on all iPhones? What’s Apple’s next move?
Here’s another, larger question: Is the FBI’s court order even constitutional under the First Amendment?
As Motherboard reported Thursday, the FBI’s order is a technically clever one that many security experts (and, most likely, the FBI) believe would allow law enforcement to force Apple to help them unlock other encrypted devices, including its newer models such as the iPhone 6 and iPhone 6S.
The FBI is asking Apple to write software that would remove two security features from the iPhone: one that erases the device’s encryption key (rendering the data on it unreadable forever) if the unlock passcode is entered incorrectly 10 times in a row, and one that introduces an escalating time delay after each failed attempt. Removing these features would allow the FBI to “brute force” its way into the phone by trying every possible passcode until one works.
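To see why stripping those two features matters, here is a rough back-of-the-envelope estimate of brute-force time once the wipe and delay protections are gone. The ~80 millisecond figure is an assumption drawn from Apple's published description of its per-guess passcode key derivation; real-world timings would vary.

```python
# Illustrative estimate of worst-case brute-force time once the 10-try
# wipe and the escalating delays are removed. The per-guess cost is an
# assumption (Apple has described an ~80 ms key-derivation delay per
# attempt); actual hardware timings may differ.

SECONDS_PER_GUESS = 0.08  # assumed key-derivation cost per attempt

def worst_case_hours(passcode_digits: int) -> float:
    """Hours needed to try every numeric passcode of the given length."""
    keyspace = 10 ** passcode_digits
    return keyspace * SECONDS_PER_GUESS / 3600

# A 4-digit passcode (10,000 combinations) falls in minutes;
# a 6-digit passcode (1,000,000 combinations) takes roughly a day.
print(round(worst_case_hours(4), 2))
print(round(worst_case_hours(6), 1))
```

The point is that without the wipe-after-10 and delay features, the only thing standing between the FBI and the data is raw guessing speed.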
Apple has a strong freedom of speech defense that it could use on appeal to prevent the FBI from compelling it to make its products less secure, according to several First Amendment attorneys I spoke to and a review of case law on the subject.
"We’ve never seen a compelled speech case that comes close to doing what the government is asking Apple to do here."
The specifics of what the FBI and the courts are asking Apple to do are important here. The FBI notes in its order that “this case requires Apple to provide modified software, modifying an operating system—writing software code—is not an unreasonable burden for a company that writes software code as part of its regular business.”
It may be the case that Apple can help the FBI, but that doesn’t mean it has to. In the 1999 case Bernstein v. US Department of Justice, the Ninth Circuit Court of Appeals (which covers the District Court where the Apple case is being heard) ruled that software source code is “speech” protected by the First Amendment. In that case, the DOJ had initially prevented a Berkeley graduate student from publishing the source code for an encryption protocol, claiming it could be considered an illegal arms export. The Ninth Circuit ruled that this was a “prior restraint” against speech.
That case is important because if code is speech, the FBI is asking Apple to create speech specifically for the government, according to Nate Cardozo, a lawyer at the Electronic Frontier Foundation.
The FBI cannot write its own firmware to update Apple’s iOS, because the hardware is designed only to run firmware that is digitally “signed” by Apple (the FBI notes this in the order: “Apple has designed its mobile device hardware, as well as its operating system software, to only permit and run software that has been ‘signed’ cryptographically by Apple using its own proprietary encryption methods”). Making Apple write and endorse code when it doesn’t want to is problematic constitutionally on a few different levels, according to Cardozo.
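The gatekeeping role of Apple's signature can be sketched in a few lines. Real iOS firmware verification uses asymmetric (public-key) cryptography in a hardware-rooted boot chain; the toy version below uses an HMAC purely to show the logic the order turns on: the device runs an image only if the signature verifies against the vendor's key, so only the vendor can produce runnable code. All names here are illustrative, not Apple's.

```python
# Toy sketch of a signed-firmware check, assuming a symmetric HMAC as a
# stand-in for Apple's real public-key signature scheme. The vendor key
# and function names are hypothetical.
import hashlib
import hmac

VENDOR_KEY = b"hypothetical-vendor-signing-key"  # illustrative secret

def sign_firmware(image: bytes, key: bytes = VENDOR_KEY) -> bytes:
    """Produce the tag the vendor attaches to an official image."""
    return hmac.new(key, image, hashlib.sha256).digest()

def device_will_boot(image: bytes, signature: bytes,
                     key: bytes = VENDOR_KEY) -> bool:
    """The device recomputes the tag and refuses anything that mismatches."""
    # constant-time comparison, as any real verifier should use
    return hmac.compare_digest(sign_firmware(image, key), signature)

official = b"vendor-built-os-image"
good_sig = sign_firmware(official)
print(device_will_boot(official, good_sig))      # vendor-signed image: runs
print(device_will_boot(b"outside-build", good_sig))  # anything else: refused
```

This is why the FBI needs Apple's participation at all: without the signing key, any firmware the government wrote itself would simply be refused by the hardware.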
"Code signing is likely a protected, First Amendment speech act."
The government can compel speech only in a few limited cases. Most famously, the Food and Drug Administration is allowed to force tobacco companies to display the surgeon general’s warnings on cigarette labels. The FDA has won on the argument that it’s allowed to compel “purely factual and uncontroversial disclosures.”
When it pushed for more graphic labeling in 2013, however, an appeals court ruled that the graphic labels were not “purely factual” and so could not be compelled. What the government is asking Apple to do here doesn’t fit that standard, Cardozo said: the idea that breaking Apple’s systems would make consumers safer is hardly uncontroversial, and there’s a strong argument that it would make the bulk of its customers less safe.
“The FBI is ordering Apple to sign a piece of code that it wants created involuntarily. Apple signing that code is essentially the computer version of Apple saying not only that this code is genuine, but that it was intended by Apple to run,” Cardozo told me. “We’ve never seen a compelled speech case that comes close to doing what the government is asking Apple to do here. We’ve never seen any case where the government has compelled a third party to subvert its own systems in this way.”
"The FBI has artfully selected a case with a horrific crime and a dead shooter. It takes a number of constitutional issues out of play"
In a case that could be considered the closest thing to a precedent for what the FBI is asking Apple to do, the government lost. In 2003, the Ninth Circuit ruled that the FBI cannot compel a company that developed in-car computers (believed to be OnStar) to spy on customers, because doing so would fundamentally break what the computers were designed to do.
“It’s not a perfect fit, but it’s damn close,” Cardozo said of that case. “The FBI is ordering Apple to turn its code signing feature intended to protect consumers against those same consumers. Code signing is likely a protected, First Amendment speech act.”
There is another potential First Amendment argument, though Cardozo notes it’s not quite as compelling as the compelled speech one. It goes back to the idea of “prior restraint”—that the government cannot restrict speech before it is actually made. On this theory, FBI action against Apple could affect others across the tech industry.
“A company other than Apple might read the government’s argument here as a statement that they can’t build secure devices no one can get into, and if they do, they’ll get into trouble,” he said. “It’ll have a chilling effect on developers who are hesitant to build secure storage or secure communication protocols as well.”
This all assumes, of course, that Apple tries a First Amendment argument at all. We won’t know until Apple files its opposition to the order, which is due Tuesday (Apple has asked for an extension and will likely get it, Cardozo said).
Michael Froomkin, a First Amendment expert at the University of Miami, wrote about the constitutional issues associated with the encryption debate back in 1995. He told me that he believes the First Amendment questions here are a bit more nuanced than you’d expect, though he said he wanted to wait until all of the facts of the case came out. He did say, however, that the FBI has picked a very good case to try to set a precedent on, because it involves a dead terrorist who was using a government-owned phone.
“The government has picked a great test case. I cannot assert your First or Fourth Amendment rights. If Apple wants to defend on a First or Fourth Amendment argument, it has to say its own rights are being violated, not the owner of the phone’s,” Froomkin told me. “The FBI has artfully selected a case with a horrific crime and a dead shooter. It takes a number of constitutional issues out of play.”
Froomkin said that it’s going to ultimately come down to whether Apple writing software for the FBI proves to be replicable on other phones, which seems to be the most important question for security researchers and consumers as well.
“It’s not normal that it seems they’re being asked to work for free or no promise of payment, and further they’re being asked to do something that might trash the brand and cost them millions in losses of sales,” he said. “But the critical factual issue here is the extent to which this can compromise other phones.”
Both lawyers are in agreement that this isn’t going to be resolved anytime soon: “This is ultimately going to the Court of Appeals,” Froomkin said. “Who knows, it might end up in the Supreme Court.”