Why Mass Surveillance Is Worse for Poor People

Even Google admits that it's hard to design privacy tools for the technologically unsavvy.

Since Edward Snowden revealed that the whole damn world is a surveillance state, a host of encryption and privacy services have popped up. But the people who have the luxury of using them, and the luxury of actually worrying about their privacy, are overwhelmingly well off, overwhelmingly white, and overwhelmingly male. Is privacy only for the privileged?

As we learn that the NSA, FBI, and even local law enforcement have their hands on any number of surveillance tools, it's getting ever more complicated to actually keep your communications safe from them. Anonymity tools like Tor and encryption services aren't always easy to use, and even the iPhone's default encryption requires you to have enough money to buy an iPhone and pay for its contract. Technological literacy is something only the privileged can afford, and, beyond that, there is increasingly a concrete dollar amount that will buy you a modicum more privacy.

Last week, news broke that AT&T would disable so-called "super cookies," which track users across the internet, for those who pay an extra $29 a month. In the developing world, many people don't know of an internet beyond Facebook, thanks to Facebook Zero, a service that provides free access to the social network, but not to the rest of the internet.
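For context, these "super cookies" aren't browser cookies at all: the carrier injects a persistent tracking identifier into its subscribers' unencrypted HTTP traffic, where any website can read it. Verizon's widely reported version used a header called X-UIDH, and AT&T tested a similar mechanism. Here's a hypothetical Python sketch (the header name and server setup are illustrative, not from the article) of what a site's server sees:

```python
# Hypothetical sketch: how any website could read a carrier-injected
# tracking header from an unencrypted HTTP request. X-UIDH is the header
# name Verizon reportedly used; AT&T's trial worked along similar lines.
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The carrier adds this header in transit. The user never sees it,
        # and clearing browser cookies doesn't remove it.
        supercookie = self.headers.get("X-UIDH")
        if supercookie:
            # The same ID shows up on every site the subscriber visits,
            # so trackers can link their browsing across the web.
            print("Carrier tracking ID:", supercookie)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

if __name__ == "__main__":
    HTTPServer(("", 8080), Handler).serve_forever()
```

Because the identifier rides on the network connection itself, no browser setting can block it; only the carrier (for a fee, in AT&T's case) or end-to-end encryption like HTTPS keeps it out of view.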

"You have to design with people and not just for people"

"It's not just AT&T," Daniel Kahn Gillmor of the American Civil Liberties Union said Monday at the ​New America Foundation's cybersecurity event in Washington, DC. "If the only party you're talking to is Facebook, ​then it becomes a central point for data collection and surveillance. For communities without phones, that's the only way they're going to get access. Access to the whole, worldwide internet could become the domain of people who have the ability to pay for it."

Less obvious, however, is the fact that protecting yourself online requires a digital literacy that you don't learn when you're working two jobs or living on welfare. It's a problem that Google says it's constantly dealing with. The wording of the warnings Chrome gives you when you click on a nefarious link, for instance, has to be constantly workshopped to be useful to the largest number of people while still being technical enough to describe what's going on to the computer literate.

"Usability is fundamentally important, and there are very hard problems we're grappling with. We have large numbers of users with different backgrounds, disabilities, ages, and technological literacy," Tara Whalen, a staff privacy analyst at Google, said at the conference. "A specific example of this is with an ​SSL certificate. When it breaks, it's hard to explain nuances—is this a risk, what went wrong."

And that's just for technology that people use every day, such as a web browser. How are you supposed to get people who were never taught that "password" is a bad password to use encryption apps, especially when the mostly white, mostly male, mostly wealthy developers who build those apps haven't thought about making them accessible to everyone?

"If you look at some of the developer communities, the people who volunteer to make [encryption] tools, the diversity is not particularly large," Whalen said. "The numbers of minorities are low, the number of women participating is low. Anyone in a marginalized group doesn't have the resources to participate in a free labor project."

"Bridging that gap is a challenge. You have to involve people in participatory design—you have to design with people and not just for people," she added.

This is, of course, not an entirely new phenomenon. Crime-ridden parts of cities are often under constant surveillance, whether by police cameras or foot patrols. That problem, however, has extended to the digital realm.

The key here, according to Seda Gurses, a postdoc at New York University working on surveillance and privacy issues, is to push companies to build privacy into everything that they do. Asking users to take privacy into their own hands sounds good in theory, but simply doesn't work for everyone.

"There's this word, it's hard to say, but it's responsibilization, which is encouraging people to manage risk themselves," she said. "If you think there are risks, you are responsible for protecting yourself from it. This is problematic of course. Instead of burdening the users, we should ask the phone companies or whoever to give them secure phones. We should make sure the network is secured in a way it can't be eavesdropped on."

Right now, we're seeing little of that happening. The digital divide is extending to become a privacy divide.