Since retiring from a three-decade career at the NSA in 2001, a mathematician named William Binney has been telling anyone who will listen about a vast data-gathering operation being conducted by his former employers. "Here’s the grand design," he told filmmaker Laura Poitras last year. "You build social networks for everybody. That then turns into the graph, and then you index all that data to that graph, which means you can pull out a community. That gives you an outline of everybody in that community. And if you carry that out from 2001 up, you have 10 years of their life that you can then lay out in a timeline that involves anybody in the country. Even Senators and Representatives—all of them."
The invasive spying program Binney described—one that could build a "social graph" of nearly any user of the American Internet, like some massive, secret Facebook—was in the works, he says, when he left the agency. The details of this program, known as "Stellar Wind," have never been made explicitly public. Lawsuits and complaints about this and other programs (for instance, by lawyers for Guantanamo Bay prisoners, who suspect their phone calls were intercepted) have been dismissed by the government because potential evidence—like the court that administers these programs—is itself secret.
But now we know more about one aspect of the US's surveillance arsenal. A tool called PRISM, the top secret project described last week in the Guardian and The Washington Post, is sucking in data directly from the big Internet companies to do much the same thing that Binney warned about when he described "Stellar Wind." Rather than going to Internet companies piecemeal with search warrants and requests, a system like this provides "lock boxes" for data co-located at companies' servers, allowing government analysts a far easier way to access entire troves of a person's data, and to do with it what they will. Obama and others have insisted that even if Americans' data is swept up, searches through this data are focused on foreign nationals, and are "very narrowly circumscribed." But when Senators asked for details last year about how many Americans have been swept up in the NSA's dragnet, the agency replied that revealing that number would "itself violate the privacy of U.S. persons."
Agencies like the FBI, which itself has been quietly pushing for a "back door" system like this, call it crucial for national security. The leaker of the document, government contractor Edward Snowden, who has sought asylum in Hong Kong, calls it a recipe for "turnkey tyranny." With a single PowerPoint, we've been teleported from shadowy hacker spy movies and giant Internet conspiracy theories (it's the CIA who actually invented Facebook, right?) into a reality that is simultaneously gut-wrenchingly alarming, and—unless you've been hibernating for the past dozen years—not terribly surprising.
This was not the kind of reality that Binney, like Snowden and other recent espionage whistleblowers, signed up to build. For decades, he and his colleagues were tasked with crafting systems for scanning the communications of foreigners, not Americans. A program that Binney and others championed, ThinThread, was designed to encrypt Americans' communications, but was dropped by the NSA in favor of a more expansive project (though reportedly not before it was tested on New Zealanders). (Binney's story is told in Poitras' short film, "The Program," released last year by The New York Times, which you can watch below; Poitras is also one of the journalists whom Snowden first contacted, and she filmed his interview with Glenn Greenwald for the Guardian.)
William Binney in Laura Poitras's "The Program," 2012.
Edward Snowden in a video for the Guardian by Laura Poitras
As another NSA whistleblower, Russ Tice, explained in 2005, "The very first law chiseled in the SIGINT [signals intelligence] equivalent of the Ten Commandments," a directive known as USSID-18, "is that 'Thou shall not spy on American persons without a court order from FISA,'" the Foreign Intelligence Surveillance Act court. But, he said, "The very people that lead the National Security Agency have violated this holy edict of SIGINT."
After September 11, 2001, the enemy wasn't outside the US, but potentially anywhere. Even as Congress outlawed a Bush administration program called "Total Information Awareness," complete with a logo that included an all-seeing eye, a new surveillance culture was mobilizing in secret.
As J. Cofer Black, the former director of the CIA's Counterterrorist Center, told Frontline, "after 9/11, the gloves come off." The metaphor described the physical war on terror but also the digital one. Section 215 of the PATRIOT Act would give access to "any tangible thing" that is "relevant" to an investigation "of foreign intelligence information not concerning a United States person." Section 216 would specifically say that obtaining the metadata of Internet communication—not the content of an email but other details about it—does not constitute a "search." And Section 702 of FISA, added in 2008, permitted intelligence officials to conduct surveillance on the communications of "non-U.S. persons," when at least one party on a call, text or email is "reasonably believed" to be outside of the U.S.
Technology was already well ahead of the law. As Binney and other people who have stumbled on the government's warrantless wiretapping programs have warned, after 2001, the new listening devices were turned inward, to snoop on the country's own chatter, on the assumption that some of that chatter contained "foreign" intelligence. The scale of that chatter is hard to imagine, but the giant data center the NSA is readying in Utah gives some idea; when it's finished next year, it will be the largest data center in the world, capable of storing data on the order of 1 quadrillion gigabytes, or an estimated hundred years’ worth of the world’s electronic communications.
Google data center. Douglas County, Georgia. Photo: Google
DATA AND METADATA
It is thought that the secret FISA court must give the final order to manually dig into, say, Google's servers and "intercept" an entire email box. That is not necessarily the case for the data about that data, which, as Senator Dianne Feinstein pointed out, is not as protected: "Our courts have consistently recognized that there is no reasonable expectation of privacy in this type of metadata information and thus no search warrant is required to obtain it," she said on Friday.
This leads to the curious legal distinction drawn in the signals intelligence world, between collecting data and searching it. The collection of data does not constitute a search, or a so-called "intercept"; Google, Facebook and other companies are already helping the collection process anyway. It is the government's ability to search that data and read it that legally constitutes an intercept. As President Obama told reporters on Friday, "if anybody in government wanted to go further than top line data and wanted to listen to Jackie's phone call, they'd have to go to a federal judge and indicate why." But consider how much that top line data tells us: enough to help determine one's location, medical issues, political and friend affiliations.
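To make concrete how much "top line data" can reveal, here is a minimal sketch; the record fields and names are invented for illustration, not drawn from any actual carrier or NSA schema. Nothing below reads a word of any conversation, yet a profile emerges from who, when, and where alone:

```python
from collections import Counter

# Illustrative call-detail records: hypothetical fields, but typical of the
# metadata carriers log -- no call content anywhere.
records = [
    {"caller": "alice", "callee": "dr_smith_oncology", "hour": 9,  "tower": "midtown"},
    {"caller": "alice", "callee": "dr_smith_oncology", "hour": 10, "tower": "midtown"},
    {"caller": "alice", "callee": "campaign_hq",       "hour": 19, "tower": "brooklyn"},
    {"caller": "alice", "callee": "campaign_hq",       "hour": 20, "tower": "brooklyn"},
    {"caller": "alice", "callee": "campaign_hq",       "hour": 21, "tower": "brooklyn"},
]

def profile(records, person):
    """Infer habits from who/when/where alone -- never the words spoken."""
    mine = [r for r in records if r["caller"] == person]
    contacts = Counter(r["callee"] for r in mine)
    towers = Counter(r["tower"] for r in mine)
    return {
        "top_contact": contacts.most_common(1)[0][0],
        "likely_home_area": towers.most_common(1)[0][0],
    }

print(profile(records, "alice"))
# Repeated calls to an oncologist and a campaign office, from predictable
# cell towers, sketch medical, political, and residential facts about "alice".
```

Five rows of metadata already hint at a medical issue, a political affiliation, and a neighborhood, which is the point of the distinction between collecting and intercepting: the "top line" is not nearly as empty as the word "metadata" suggests.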
On Friday James Clapper, the Director of National Intelligence, released his own explanation of how PRISM meets FISA requirements. He specifically cited its legality under Section 702, which is meant to permit an unprecedented kind of data collection without a warrant, provided it does not "intentionally" target a U.S. citizen. The law, he added, was "recently reauthorized by Congress after extensive hearings and debate," and it is administered in a way that serves to "minimize the acquisition, retention and dissemination of incidentally acquired information about U.S. persons."
Is there a "Facebook" or a "Google" for America's private data? Does the NSA really keep a file on everyone? General Keith Alexander, the head of the NSA, answered this question at the DefCon hacker conference last year in Las Vegas. "No, we don't. Absolutely no. And anybody who would tell you that we're keeping files or dossiers on the American people knows that's not true." He continued: "We get oversight by Congress, both intel committees and their congressional members and their staffs, so everything we do is auditable by them, by the FISA court … and by the administration. And everything we do is accountable to them…. We are overseen by everybody. And I will tell you that those who would want to weave the story that we have millions or hundreds of millions of dossiers on people is absolutely false."
Gen. Alexander even acknowledged PRISM, vaguely. The NSA scans the Internet for national security threats, and the FISA law "allows us to use some of our infrastructure to do that. We may, incidentally, in targeting a bad guy, hit on somebody from a good guy. [But] we have requirements from the FISA court and the attorney general to minimize that, which means nobody else can see it unless there’s a crime that’s been committed…. And so from my perspective, the people who would say that we’re [targeting Americans] should know better.”
The issue with a system like PRISM isn't just the abuses that may have already happened, but the risk of abuses in the future. History doesn't offer much comfort, and especially not recent history. The rules that oversee the NSA's collection of data remain cloaked in secrecy, rife with loopholes, and, according to the government itself, easily subject to abuse. In 2009, the Justice Department acknowledged it had raised concerns about the NSA's "unintentional" "overcollection" of Americans' personal data through the FISA system, but said the concerns had been addressed.
Last year, Sens. Ron Wyden and Mark Udall revealed the existence of a 2011 opinion by the FISA court that found that collection activities under FISA Section 702 "circumvented the spirit of the law" and violated the Fourth Amendment's prohibition on unreasonable searches and seizures. But that ruling, like much other evidence surrounding these programs, remains secret, to be revealed only at the discretion of the President. Meanwhile, a 2007 report by the Department of Justice Office of the Inspector General (OIG) described "significant" violations of law by the FBI in its use of powerful national security letters, which the bureau can issue without any court's approval. It is also rare for FISA applications to be turned down by the court. Through the end of 2011, over 30,000 FISA court orders were granted, according to the Union of Concerned Scientists and the Electronic Privacy Information Center; just 11 applications were rejected.
While controversy over warrantless wiretapping under Bush may have led to changes to the program, there is still no telling just how large the government's data mining operation is or how it's being used. It's not clear, for instance, how extensive the government's cursory searches are. If, hypothetically, I were to include the words "bomb," "Mexico," or "pork" in a sentence in this article—keywords that the Dept. of Homeland Security watches for—do bots then go searching through other things I've written? (For everyone's sake, I hope not.)
One alarming possibility is that, with or without a program like PRISM—or that other search tool known as a warrant—the government can use past and current data to create an unprecedented kind of dossier on anyone. With software built by government contractors like Palantir, an agency could use all of my data to create a massive secret social graph of my life—a portrait of me and my friends, neighbors, adventures, preferences, and finances, across time—a file that can be mined retroactively and even predictively for potential evidence. In fact, they could build a rudimentary Alex Pasternack social graph without doing anything illegal or clandestine, but simply by monitoring all the information I already release publicly online every day, what is known to spies as "open source" data. A program like PRISM would simply enrich that portrait.
"Think of a domain as an activity, as a specific type of activity," Binney told Daniel Ellsberg in Poitras's short film. "Phone calls, or banking is another domain. So you think of ‘graphing’ each domain and then you take each graph, and turning it in the third dimension—the trick now is to map through all the domains in that dimension—pulling together all the attributes that any individual has in every domain. So that now, I can pull your entire life together from all those domains, and map it out and show your entire life, over time."
An excerpt from Marilyn Monroe's FBI file
THE MOST THOROUGH LIBRARIES OF PERSONAL DATA EVER BUILT ARE BEING BUILT BY US EVERY DAY. NOW THEY'RE THE MOST POPULAR WEBSITES ON THE INTERNET.
In a 2006 academic paper partially funded by the NSA's ARDA program (now called the Disruptive Technology Office), the authors showed how social networks could be used to identify conflicts of interest among academic peer reviewers. The ARDA project, called "An Ontological Approach to Financial Analysis & Monitoring," seems to have been dedicated to keeping tabs on money flows and money laundering around the world (this was the pre-Bitcoin era). "Although social networks can provide data to detect COI," the authors write, "one important problem lies in the lack of integration among sites hosting them. Moreover, privacy concerns prevent such sites from openly sharing their data."
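The check the paper describes can be sketched in a few lines; the names, sites, and edges below are hypothetical. The point the authors make about integration shows up directly: neither site's data alone reveals the conflict, but the merged network does.

```python
# Toy conflict-of-interest (COI) check in the spirit of the 2006 paper:
# a reviewer is conflicted if linked to an author within a few hops of
# the social network. Each "site" holds only a fragment of that network.
site_a_links = {("prof_jones", "dr_lee")}    # e.g. a coauthorship site
site_b_links = {("dr_lee", "reviewer_wu")}   # e.g. an employment site

def conflicted(author, reviewer, links, max_hops=2):
    """Breadth-first search: is reviewer within max_hops of author?"""
    adjacency = {}
    for a, b in links:
        adjacency.setdefault(a, set()).add(b)
        adjacency.setdefault(b, set()).add(a)
    frontier, seen = {author}, {author}
    for _ in range(max_hops):
        frontier = {n for node in frontier
                    for n in adjacency.get(node, ())} - seen
        if reviewer in frontier:
            return True
        seen |= frontier
    return False

# Neither fragment alone shows a conflict; the merged data does.
print(conflicted("prof_jones", "reviewer_wu", site_a_links))                 # → False
print(conflicted("prof_jones", "reviewer_wu", site_a_links | site_b_links))  # → True
```

Which is exactly why "lack of integration among sites" is the obstacle the authors name, and why integrated collection is so much more powerful than any single data source.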
Those privacy concerns—about the things we keep in our drawers and the "things" we keep in our inboxes—find their legal protection in the Fourth Amendment:
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
Now that nearly every real-world activity gets turned into data, somehow, somewhere, we're all being spied upon. We're all spies too, if not as potential data points in some software's network analysis, then as its sensors, the uploaders of photos and videos, the writers of emails and tweets, all feeding the big beast of data. The irony of all this personal data collection, of course, is that we don't ourselves often have it or control it; legally, it's not clear who even "owns" it. Despite the preponderance of data, we're ignorant about exactly where that data is going.
Our cell phones track us, our websites know where we've been; for many of us, our inboxes are the unconscious ledgers of our daily lives. Our data gets traded between computers and people all the time for reasons and for prices unknown to us. Between the PATRIOT Act, Facebook, airport x-ray scanners, and the breathless excitement over big data, the idea of "privacy" has gone the way of the "friend" and "liking." It's melted into bits. And this is to say nothing of the cameras perched on every street corner, and soon, on our friends' faces, and flying high up in our skies.
The most thorough libraries of personal data ever built have been built by us, in public, and now they're the most popular websites on the Internet. The question before this week was just exactly how the government was going about mining this data. Now that we have a better idea, the question is, how much do we care?
The revelations are uncomfortable, to say the least. The director of national intelligence has called the leak "literally gut-wrenching"; leaker Edward Snowden is certain to be living an uncomfortable life for some time too. And then there's everyone else, the people like us who are being watched, unintentionally or not. (The revelation may not discomfort foreign governments; if a low-level analyst at a government contractor had access to this information, it seems possible that America's adversaries already have more details than we do.)
But the revelations are discomforting for another reason: they ask us, Americans and people everywhere, to think hard about our own unacknowledged expectations about privacy, secrecy, and security at a time when terrorist fears are imminent and, as importantly, when technology has outpaced our ideas about what's possible.
A diagram of NSA data collection. Source unknown
A slide illustrating social network analysis from a presentation by the security companies Palantir, HBGary, and Berico
How Palantir's Graph application charts and searches relationships.
As series like "What They Know" at the Wall Street Journal and "Top Secret America" at The Washington Post have pointed out, along with reporting at Motherboard and elsewhere, our data is there for the mining, and billions of dollars are spent every year, inside and outside government, to do just that.
Software designed by companies like Palantir and Ntrepid (described in documents that were obtained by Anonymous from the security firm HBGary in 2011) is intended to build social graphs out of public and private data. That year, for instance, Ntrepid used their software to chart "an amorphous network of anarchist and protest groups" within Occupy Wall Street and Occupy D.C. using information from social media. Ntrepid is also one of a handful of government contractors that build "persona management" software, used by government agents to manage fake social media profiles that can be used to extract information out of people on those networks.
What's a person on the Internet to do? For the most part, we choose to live with the Internet as much as we choose to live in cities. We might at least notice the surveillance camera in the room or on the street corner—or otherwise assume it's there. By contrast, it's easy to live online without thinking much about where the cameras are, without knowing the rules and the limits. We know, say, that when we talk to friends at a bar what we say exists then and there. We don't even tend to think about, for instance, where our messages on friends' Facebook walls go after we write them; when we do think about it, we're left with few answers. That's a new way of being, of thinking and talking.
Who, though, could blame some people for being caught in a kind of cognitive dissonance, between defending privacy and passively watching it slide away? Even as we expect these companies to carefully guard our data, and chafe at the notion that they wouldn't, we also readily give it up. We "sign" legal contracts with these companies, their terms of service, or TOS, by checking a box on a sign-up screen. Our data is theirs.
The five largest data collectors said to be involved in PRISM at first denied involvement. On Friday, an article by Claire Cain Miller in the Times clarified those denials:
...instead of adding a back door to their servers, the companies were essentially asked to erect a locked mailbox and give the government the key, people briefed on the negotiations said. Facebook, for instance, built such a system for requesting and sharing the information, they said.
The data shared in these ways, the people said, is shared after company lawyers have reviewed the FISA request according to company practice. It is not sent automatically or in bulk, and the government does not have full access to company servers. Instead, they said, it is a more secure and efficient way to hand over the data.
In the case of a program like PRISM, which is said to provide legal immunity to the companies involved (and even give some of their executives secret clearance), if not plausible deniability, terms of service may not matter. Or clever mechanisms, like the use of a third party, might circumvent legal restrictions.
Then again, immunity can come in the form of these companies' terms of service. Generally, a company dealing with your data gives itself the right to disclose that data in cases of business transfers (like a merger), service fixes, and when law enforcement comes calling. The wording is often brief, delicate, and vague, but it's there. All of the terms of service say essentially the same thing that Facebook's does:
Information we receive about you, including financial transaction data related to purchases made with Facebook Credits, may be accessed, processed and retained for an extended period of time when it is the subject of a legal request or obligation, governmental investigation, or investigations concerning possible violations of our terms or policies, or otherwise to prevent harm.
We cede our expectations to privacy at the sign-in screen, perhaps keeping in mind Eric Schmidt's edict: "If you've got something to hide, then maybe you shouldn't be doing it in the first place." Or, if you have nothing to hide, you have nothing to worry about. It's a line that reminds me of FDR, speaking in a Nixon sneer: "The only thing we have to fear is fear itself."
The White House's allies in Silicon Valley have been helpful in contorting our definition of "privacy" in ways that are convenient for big data miners, be they advertisers or governments. The entire Internet economy, after all, depends on sharing. (The economy of sharing and Liking — essentially, of increasing clicks and generating more ad revenue — is not the same as the sharing economy, although sharing data looks far more attractive when it's used to better share resources. I'm happy to share my location with an app and its users if it means, for example, it will help me find someone to share a taxi with*.) (*Disclosure: this is precisely what Bandwagon, a company I co-founded, aims to do.)
And maybe, just maybe, we secretly like (or are even just okay with) sharing our information with the government too, if it means being safer. Maybe we agree with Obama that "there are some tradeoffs involved," that "you can't have 100 percent security and also 100 percent privacy and zero inconvenience," that these programs "make a difference to anticipate and prevent possible terrorist activity."
General Keith Alexander, director of the National Security Agency, speaking at Defcon 2012, the hacker conference. Photo: dr3x
"We are overseen by everybody," said the NSA director. "And I will tell you that those who would want to weave the story that we have millions or hundreds of millions of dossiers on people is absolutely false.”
But even as we expect law enforcement to look out for our own rights and not break the law, we also know—even if we don't always acknowledge it—that governments do ugly things in the name of security. Are those things justified? What counts as illegal or suspicious? In cases where we might think access to this data is reasonable (foiling a terrorist plot), how much do we want to know? How much do we want revealed about our government's secrets? How do we police those secret things? The questions impinge upon the trials of leakers like Bradley Manning—one of a record number of whistleblowers being prosecuted by the Justice Department—but also on the "secret" things those leakers expose, like extraordinary renditions and signature strikes by drones.
Put another way: if we already expect that our governments keep and sometimes steal secrets—among other controversial things—in the name of safety and security, what do we have to say about how that's done? And if it relates to us, our personal and private information—and it has for a long time—what will we have to say about how that data is used? If we thought that it was helpful for spotting would-be terrorists, would we consent to the government using that data to create its own private Facebooks? If we're already signed up for the nanny state, like some secret social network, are there ways of opting out of part of it? How can we tweak our NSA privacy settings?
Or, as Ethan Zuckerman asked a conference room on Friday at the Personal Democracy Forum, "Have we given up the possibility of being anonymous and not having our movements tracked?" We won't give up that easily. But in the face of secret power, political methods feel woefully inadequate. Zuckerman summed up his feelings: "What keeps me up at night is not only the expansion of surveillance culture, but that people in this room haven't found a way to be politically effective."
Foreigners only, federal judges, oversight, audits, private contractors: those details about PRISM are important. But the new details also emphasize much older questions, which people on the Internet, foreigners and Americans, will need to contend with in order to begin to separate what's technologically possible from what's ethically responsible. We need to talk about the way we keep and steal secrets in the future, especially because it relates to a digital space where we expect information to run free and remain "ours," even while we're largely ignorant of its limits and its rules. There has to be a better way.
"We're going to have to make some choices as a society," Obama said on Friday, hours before mingling with donors in Silicon Valley. But currently, the choices he describes look like false ones. We can't make decisions or even have a conversation when what we're talking about is kept hidden from us. And given the ugly optics here and the general awareness of this surveillance, let alone the potential for abuse, there's no reason that PRISM or programs like it, need to be secret.
Now we can write letters to our congressmen or vote them out of office. We can exercise our right to protest, online and off. The fight the Internet waged over bills like SOPA and CISPA—the debate over who gets to control our data—was a precious, electrifying moment of civic Internet consciousness. The campaign for privacy on Facebook has been slow going, but it's had its successes. Meanwhile, groups like the Electronic Frontier Foundation and movements like Do Not Track are helping draw thicker lines between private and public online. Freedom of Information Act geeks like MuckRock are helping keep governments open. Journalists are keeping governments honest, and doing so in a climate where whistleblowers are often suspected enemies. We'll need more of all of these things.
But we can't choose not to participate in the Internet any more than we can choose not to be a part of the economy. We can, however, choose what kind of life we might like to live inside the panopticon. Both online and off, we'll need to decide not just what kind of democratic public spaces we want—the kind of public space that people are fighting for in Istanbul, for instance—but also what kind of democratic private space we want too. If we are our data, then what expectations do we have for that data, the data that we knowingly and unknowingly yield to Facebook, and other, more secret Facebooks, every day?
Follow Alex on Twitter and probably on PRISM.
More on secrecy and privacy