Apple's Security Expert Joined the ACLU to Tackle 'Authoritarian Fever'

Apple security expert Jon Callas, who helped build protection for billions of computers and smartphones against criminal hackers and government surveillance, is now taking on government and corporate spying in the policy realm.
Image: Cathryn Virginia

Jon Callas is an elder statesman of computer security and cryptography. He has been in the vanguard of developing security for mobile communications and email, serving as co-founder and chief technology officer of PGP Corporation—maker of Pretty Good Privacy, the first widely available commercial encryption software—and in the same roles at Silent Circle and Blackphone, touted as the world's most secure Android phone.

As a security architect and analyst at Apple—he served three stints with the tech giant, in 1995-1997, 2009-2011, and 2016-2018—he played an integral role in developing and assessing security for the Mac and iOS operating systems and various components before their public release. His last stretch there, as manager of a red team (red teams hack systems to expose and fix their vulnerabilities), began just after the FBI tried to force Apple to undermine security it had spent years building into its phones in order to break into an iPhone belonging to one of the San Bernardino shooters.

But after realizing there's a limit to the privacy and surveillance issues technology companies can address, Callas decided to tackle the issues from the policy side, accepting a two-year position as senior technology fellow for the American Civil Liberties Union. Callas spoke to Motherboard about government backdoors, the need for tech expertise in policymaking, and what he considers the biggest challenge for the security industry.

***

MOTHERBOARD: What made you leave Apple to join the ACLU? And why now?

Jon Callas: The political situation that's going on all over the world, not just the United States, is that the world has caught what I euphemistically call authoritarian fever. [N]one of us expected the end of 2016 at the beginning of 2016. I've come to the conclusion that policy decisions are more important now than they used to be. And a number of my tech friends, including Bruce [Schneier], said that I am uniquely capable of doing things that are good here, and that it is the moment [to do this]. And so I have taken this particular leap because I also believe that this is a moment. If not me, who? And if not now, when?

What do you hope to do in the policy realm at ACLU that you weren't able to accomplish as a technologist inside the industry? Will you focus on encryption and backdoors?

I am going to be working a lot on surveillance in general. But surveillance in general is not just an encryption issue, and I even believe [encryption] is the minor part of it. I have been incredibly impressed at the sophistication and skill of the arguments that come from the new generation [of technologists]. I am happy with what the younger generation has been saying. So part of it is I believe that other people can handle the encryption thing [now], and I'm willing to help them. It is many of these other things that I am concerned about more than anything else.

What kinds of things?

There was the [recent] New York Times article on location data being leaked all over the place [by mobile apps]—that actually is one of my big bugaboos. I really do believe that metadata trumps everything.

I am [also] really, really, really concerned about machine learning, both the use of it and the accuracy. It is where I was going to go at Apple [if I had stayed there]. How the hell do we test machine learning systems? How do we know that they are working? And this gets directly to things like surveillance and algorithms. A photo characterizer that runs on my phone and tells me which things are text and which things are wine bottles: it's okay if that is only 90 percent accurate. [But] it is not okay in a predictive policing situation to be 90 percent correct.
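To put numbers on that contrast, here is a back-of-the-envelope sketch: the same 90 percent accuracy that is a shrug for photo labeling produces errors at an enormous scale when applied to people. The population sizes below are illustrative assumptions, not figures from the interview.

```python
# Why "90 percent accurate" is fine for wine bottles but not for policing.
# Illustrative sketch; the population sizes are assumptions, not sourced data.

def expected_errors(population: int, accuracy: float) -> int:
    """Items mislabeled when a classifier is wrong (1 - accuracy) of the time."""
    return round(population * (1.0 - accuracy))

# Mislabeling some of 1,000 photos as text instead of wine bottles: a shrug.
print(expected_errors(1_000, 0.90))       # 100 mislabeled photos

# Misjudging residents of a city of one million in a predictive policing
# system at the same accuracy: 100,000 people wrongly flagged or cleared.
print(expected_errors(1_000_000, 0.90))   # 100,000 misjudged people
```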

This is the larger reason why I thought that I should get into policy for a while rather than just technology, because there are a whole bunch of things that [we can't solve through technology] that we have to solve through policy. Imagine there is a magic hacking box that you can plug into someone's phone; if every cop has one, and they can pull you over for your taillight [being] out and get everything that is on your phone, that raises a lot of policy issues. When they get a warrant to go into a room [that has] a whole bunch of filing cabinets, what they get to look at is something policy people have dealt with for decades, centuries. We saw a number of those questions in the news recently when they took files from [President Trump's lawyer] Michael Cohen. There was a 'special master' appointed, an impartial referee who says, 'This is relevant and this is not relevant to the investigation.' A magic hacking box doesn't have anything like a special master. The policy questions we are getting are changing because of technology.

Google announced last year that it wouldn't renew a Defense Department contract to develop artificial intelligence that distinguishes objects from people in drone video footage. The announcement came after employees voiced objections to the contract. Can employees really have a strong impact on employers, or was this a fluke?

I'm glad that the discussion is happening. In my own career, when I was young, I made the decision that I would not work on things that killed people, so I didn't work on military contracts. [But] then you have the follow-on effects: I'm working on an operating system, and people who make military things are using my operating system. Is that something that I'm comfortable with or not? I had a very dear friend I worked with who left high tech completely because she was not comfortable with the way that the things she was doing were being used by the military-industrial complex. We tech people have a tendency to assume that there are no consequences to what we do.

You co-founded and were CTO of Silent Circle, Blackphone, and PGP Corporation—long before encrypted communication went mainstream. PGP encryption never saw wide adoption because many people complained about its usability. Is there something you wish had been done differently with PGP to increase its adoption?

When I created PGP Corporation, our code name was zero-click encryption. And it was absolutely a system that really did work [this way]. Recently, somebody asked me to sign their PGP key, and it was both amusing and embarrassing, because I don't know how to do it on a command line; [with PGP software] I had handed it over to a bunch of zero-click things that managed my keys and did everything for me, and then I forgot how [to do it].

I sold it off to a large company [Symantec] and I think that they botched the moment. They could have taken the ideas and the technology that we had built for doing transparent encryption and signing—taken all of the choices that someone is going to get wrong out of their hands—and made it just work. That could have been pushed throughout more systems. And I think that what we have not had is security people who are building complete systems out of PGP…I think that those of us who built the things that were easy to use and then sold it to another company—they lost the opportunity to take what we had done and to build from there. They essentially tore out the walls and said, 'I can use this copper pipe over here.'
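For readers who haven't seen the idea, here is a minimal sketch of what "zero-click" encryption means in practice: the software looks up the recipient's key and encrypts automatically, so the user never handles key material. The key directory and the PyNaCl primitives below are stand-ins chosen for brevity; PGP Corporation's actual product was built on OpenPGP and worked differently.

```python
# A minimal sketch of "zero-click" (transparent) encryption, assuming a
# hypothetical key directory. Not PGP Corporation's actual implementation,
# which used OpenPGP; this uses PyNaCl (pip install pynacl) for brevity.
from nacl.public import PrivateKey, PublicKey, SealedBox

# Hypothetical directory mapping addresses to published public keys.
KEY_DIRECTORY: dict[str, PublicKey] = {}

def send(recipient: str, message: bytes) -> bytes:
    """Encrypt without asking the user anything: look up the key, use it."""
    key = KEY_DIRECTORY.get(recipient)
    if key is None:
        # A real zero-click system would fetch or negotiate a key here.
        raise LookupError(f"no key published for {recipient}")
    return SealedBox(key).encrypt(message)

# Demo: once alice's key is published, senders never touch key material.
alice = PrivateKey.generate()
KEY_DIRECTORY["alice@example.com"] = alice.public_key

ciphertext = send("alice@example.com", b"meet at noon")
assert SealedBox(alice).decrypt(ciphertext) == b"meet at noon"
```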

We do have easy-to-use encryption now with WhatsApp and Signal. Are you encouraged that there are so many user-friendly alternatives now?

I am encouraged and frustrated, because I have about six messengers running on my laptop [to accommodate] the people who want to use Wire and the people who want to use Signal and the people who want to use Wickr and the people who want to use Skype. So while on the one hand I am, yes, absolutely thrilled that a thousand flowers are blooming, we need to coalesce in the future.

Great Britain's spy agency, GCHQ, recently expressed interest in exploiting a weakness in the identity authentication mechanism of messaging applications like iMessage so it could secretly join an encrypted group chat without triggering the automatic notifications that tell the other parties someone new, or a new device, has joined the conversation. GCHQ argues this is preferable to a backdoor because it doesn't involve weakening encryption or mandating a door in the architecture. This of course raises questions about whether companies will fix the weakness to thwart this, and whether the British government would move to prevent a fix.

I see that as precisely forcing a backdoor. I think that saying 'put a feature in' and 'don't fix this bug' is a distinction without a difference, because you're telling me how to design the system. They say [we want] to insert another [user in the conversation] and it needs to be silent. If what you're really saying is 'I want to make you suppress the [automatic] messages that say a new device was added,' then we're back to mandating backdoors. It's just a different mandated backdoor.
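The notification in question is simple for a client to produce, which is why a mandate to suppress it is a design mandate. Below is a hedged sketch assuming a simplified model in which the server advertises each conversation's device list and the client diffs it against what it last saw; real protocols such as iMessage's are more involved.

```python
# Sketch of the device-added warning a "ghost user" proposal would suppress.
# Simplified model: the server advertises a device list per conversation and
# the client diffs it against what it last saw. Real messengers differ.

def new_devices(known: set[str], advertised: set[str]) -> set[str]:
    """Device identifiers present on the server that the client hasn't seen."""
    return advertised - known

known = {"alice-phone", "bob-phone"}
advertised = {"alice-phone", "bob-phone", "ghost-device"}

for device in sorted(new_devices(known, advertised)):
    # Mandating that this line never runs for government-added devices is
    # the backdoor Callas describes, whatever else it is called.
    print(f"Warning: a new device joined this conversation: {device}")
```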

Could you design a system where every other user gets notified as normal if a new user joins a chat, unless it's a chat involving a targeted terrorist group?

I don't think so. Lots of [companies] are now rethinking the way that they do things, designing their systems as if they themselves were an adversary. Naively, most systems were saying [until now], we ourselves are in a privileged position. [But] many systems now are moving toward saying we need to get out of being in a privileged position.

This comes [as a result of] crime and theft and fraud issues, and [also] government law enforcement [issues].

I personally had an issue where I gave an old device to a relative of mine. And when we tried to set it up for my niece, [the company] said this [device] is already assigned to someone else. Somebody had exploited a backend in the network [and] claimed this device was theirs. We called their tech support and proved ownership by sending them a copy of the PDF receipt. This is exactly the [same as the] 'I'm going to insert a new device' problem. If you say I want to design a system where other users [can't abuse the system], I have to harden my own internal systems, because I can't really trust my own system [not to be abused].

After the Snowden leaks revealed NSA efforts to undermine encryption protocols and bypass legal processes to siphon data from undersea cables running between private data centers, many tech giants were outraged and began to regard their own government as an adversarial hacker like any other threat actor. How much of the outrage was genuine concern for customer privacy and how much was simply savvy marketing chasing the winds of change? One could ask why the companies weren't more proactive in protecting customer data before the Snowden leaks.

When the Snowden things came out, people who would sort of roll their eyes [before] when I would say there are privacy problems all over the place [now] said, ‘Oh my god this is actually happening.’ If customers as a group all of a sudden say this is actually important to me now, and the companies respond to that, yes it's winds of change, but the winds of change come from the fact that people out there have said this is important to me, when they didn't say that before.

Any person, any organization has only 24 hours in the day. There are plenty of things that I would like to do that I haven't got around to. And if people are seeing that this is an actual threat and not a hypothetical threat and now I'm worried about it, then that can move it up to what a provider considers to be important. The people who are making software and services and devices tend to build things that more people want than fewer people want.

You've been doing security for three decades. Do you ever feel like we've made very little progress in that time?

I think some of the issues that we're dealing with now, such as law enforcement hacking and government backdoors, are issues because we in security are making progress at building secure systems. If something was completely open, then there is no reason to have backdoors or anything else. Part of the reason why the showdown between Apple and the FBI happened was precisely because Apple had locked down those systems. [And] if you look at the exploit world now, exploits are not happening the way that they were 10, 20 years ago. [Attackers] have to construct chains of exploits [instead of using just one to compromise a system]. And there are lots of cases where you could load a bunch of malware onto somebody's computer, but there's no way you can make it start running. That is a consequence of the fact that we've been winning.

What's the biggest security problem the industry has to tackle in your mind?

The identification issue is the hardest problem that we have in security. [Identification] is an issue that you have with SSL [Secure Sockets Layer, an encryption protocol used for securing communication between browsers and websites], and it is an issue that we have with PGP. The PGP way of doing this is that you should verify the fingerprint of somebody's key, which is really good advice, but nobody ever does that. And even with tools like Signal, I will admit I do not really know how to evaluate somebody's safety number. I don't trust myself to do it correctly, and therefore I don't do it. Which is why I say this is the hardest problem that there is. If I get a text message from an unknown number saying 'hi, this is Kim,' how do I evaluate that? If I want to verify my Signal number with you, I need to verify it on something that is not Signal. I often refer to the larger problem with this as 'Where's the bottom turtle?' And perhaps more formally, people will say, 'What's the root of trust?' And this is why it's hard.
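As a concrete illustration of what "verify the fingerprint" asks of users, here is a minimal sketch: hash the public key, render a short human-comparable string, and compare it over a channel other than the messenger itself. The digest choice and grouping are illustrative, not PGP's or Signal's actual formats.

```python
# What "verify the fingerprint" asks users to do: compare a digest of a
# public key out of band. Illustrative format, not PGP's or Signal's.
import hashlib

def fingerprint(public_key: bytes) -> str:
    digest = hashlib.sha256(public_key).hexdigest().upper()
    # Four-character groups are easier for humans to read aloud and compare.
    return " ".join(digest[i:i + 4] for i in range(0, 40, 4))

key = b"\x01" * 32  # stand-in for real public-key bytes
print(fingerprint(key))
# Both parties compare this over some channel that is not the messenger
# itself, which is exactly the "bottom turtle" problem: that channel
# needs its own root of trust.
```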