Quantcast
We Give Up Our Data Too Cheaply


Our data has enormous value when we put it all together. Our movement records help with urban planning. Our financial records enable the police to detect and prevent money laundering. Our posts and tweets help researchers understand how we tick as a society. There are all sorts of creative and interesting uses for personal data, uses that give birth to new knowledge and make all of our lives better.

Our data is also valuable to each of us individually, to keep private or disclose as we want. And there's the rub. Using data pits group interest against self-interest, the core tension humanity has been struggling with since we came into existence.

The government offers us this deal: if you let us have all of your data, we can protect you from crime and terrorism. It's a rip-off. It doesn't work. And it overemphasizes group security at the expense of individual security.

The bargain Google offers us is similar, and it's similarly out of balance: if you let us have all of your data and give up your privacy, we will show you advertisements you want to see—and we'll throw in free web search, e-mail, and all sorts of other services. Companies like Google and Facebook can only make that bargain when enough of us give up our privacy. The group can only benefit if enough individuals acquiesce.

Not all bargains pitting group interest against individual interest are such raw deals. The medical community is about to make a similar bargain with us: let us have all your health data, and we will use it to revolutionize healthcare and improve the lives of everyone. In this case, I think they have it right. I don't think anyone can comprehend how much humanity will benefit from putting all of our health data in a single database and letting researchers access it. Certainly this data is incredibly personal, and is bound to find its way into unintended hands and be used for unintended purposes. But in this particular example, it seems obvious to me that the communal use of the data should take precedence. Others disagree.

Here's another case that got the balance between group and individual interests right. Social-media researcher Reynol Junco analyzes the study habits of his students. Many textbooks are online, and the textbook websites collect an enormous amount of data about how—and how often—students interact with the course material. Junco augments that information with surveillance of what else his students do on their computers. This is incredibly invasive research, but its duration is limited and he is gaining new understanding about how both good and bad students study—and has developed interventions aimed at improving how students learn. Did the group benefit of this study outweigh the individual privacy interest of the subjects who took part in the study?

Junco's subjects consented to being monitored, and his research was approved by a university ethics board—but what about experiments that corporations do on us? The dating site OkCupid has been experimenting on its users for years, selectively showing or hiding photos, inflating or deflating compatibility measures, to see how such changes affect people's behaviors on the site. You can argue that we've learned things from this experimentation, but it's hard to justify manipulating people in this way without their knowledge or permission.

Again and again, it's the same tension: group value versus individual value. There's value in our collective data for evaluating the efficacy of social programs. There's value in our collective data for market research. There's value in it for improving government services. There's value in studying social trends, and predicting future ones. We have to weigh each of these benefits against the risks of the surveillance that enables them.

The big question is this: how do we design systems that make use of our data collectively to benefit society as a whole, while at the same time protecting people individually? Or, to use a term from game theory, how do we find a "Nash equilibrium" for data collection: a stable balance from which no individual party can improve its own position by unilaterally deviating, even while forgoing optimization of any single facet?
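To make the game-theory term concrete, here is a minimal sketch of a two-player "data-sharing game." The payoff numbers are invented purely for illustration (they are not from the book): each player chooses to share or withhold their data, sharing together yields the largest collective benefit, but each player is individually tempted to withhold. The code checks each strategy pair against the definition of a pure-strategy Nash equilibrium: a pair from which neither player gains by unilaterally switching.

```python
from itertools import product

# Hypothetical 2x2 "data-sharing game". Strategies: 0 = Share, 1 = Withhold.
# Payoffs are (row player, column player), invented purely for illustration.
PAYOFFS = {
    (0, 0): (3, 3),  # both share: big collective benefit
    (0, 1): (0, 4),  # one shares, the other free-rides on the data
    (1, 0): (4, 0),
    (1, 1): (1, 1),  # both withhold: little benefit, little risk
}

def is_nash(row, col):
    """True if neither player can improve their own payoff
    by unilaterally switching strategies."""
    r_payoff, c_payoff = PAYOFFS[(row, col)]
    if PAYOFFS[(1 - row, col)][0] > r_payoff:   # row player deviates
        return False
    if PAYOFFS[(row, 1 - col)][1] > c_payoff:   # column player deviates
        return False
    return True

equilibria = [cell for cell in product((0, 1), repeat=2) if is_nash(*cell)]
print(equilibria)  # → [(1, 1)]
```

With these payoffs the game has the structure of a prisoner's dilemma: the only equilibrium is (Withhold, Withhold), even though mutual sharing would make everyone better off. That stability-without-optimality gap is exactly the design problem the text describes—building systems whose stable outcome is also the socially beneficial one.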

This is it: this is the fundamental issue of the information age. We can solve it, but it will require careful thinking about the specific issues and moral analysis of how the different solutions affect our core values.

I've met hardened privacy advocates who nonetheless think it should be a crime not to put your medical data into a society-wide database. I've met people who are perfectly fine with permitting the most intimate surveillance by corporations, but want governments never to be able to touch that data. I've met people who are fine with government surveillance, but are against anything that has a profit motive attached to it. And I've met lots of people who are fine with any of the above.

As individuals and as a society, we are constantly trying to balance our different values. We never get it completely right. What's important is that we deliberately engage in the process. Too often the balancing is done for us by governments and corporations with their own agendas.

Whatever our politics, we need to get involved. We don't want the FBI and NSA to secretly decide what levels of government surveillance are the default on our cell phones; we want Congress to decide matters like these in an open and public debate. We don't want the governments of China and Russia to decide what censorship capabilities are built into the Internet; we want an international standards body to make those decisions. We don't want Facebook to decide the extent of privacy we enjoy amongst our friends; we want to decide for ourselves. All of these decisions are bigger and more important than any one organization. They need to be made by a greater and more representative and inclusive institution. We want the public to be able to have open debates about these things, and "we the people" to be able to hold decision-makers accountable.

I often turn to a quote by Rev. Martin Luther King Jr.: "The arc of the moral universe is long, but it bends toward justice." I am long-term optimistic, even if I remain short-term pessimistic. I think we will overcome our fears, learn how to value our privacy, and put rules in place to reap the benefits of big data while securing ourselves from some of the risks. Right now, we're seeing the beginnings of a very powerful worldwide movement to recognize privacy as a fundamental human right, not just in the abstract sense we see in so many documents, but in a meaningful and enforceable way. The EU is leading the charge, but others will follow. The process will take years, possibly decades, but I believe that in half a century people will look at the data practices of today the same way we now view archaic business practices like tenant farming, child labor, and company stores. They'll look immoral. The start of this movement, more than anything else, will be Edward Snowden's legacy.

I started this book by talking about data as exhaust: something we all produce as we go about our information-age business. I think I can take that analogy one step further. Data is the pollution problem of the information age, and protecting privacy is the environmental challenge. Almost all computers produce personal information. It stays around, festering. How we deal with it—how we contain it and how we dispose of it—is central to the health of our information economy. Just as we look back today at the early decades of the industrial age and wonder how our ancestors could have ignored pollution in their rush to build an industrial world, our grandchildren will look back at us during these early decades of the information age and judge us on how we addressed the challenge of data collection and misuse.

We should try to make them proud.


Excerpted from Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World by Bruce Schneier. Copyright © 2015 by Bruce Schneier. With permission of the publisher, W. W. Norton & Company, Inc. All rights reserved.