Thousands of third party apps were designed solely to obtain and sell your data. It's no surprise that the data ended up being used again on Facebook, one of the biggest advertising platforms on Earth.
Image: David Paul Morris/Bloomberg via Getty Images
This weekend, while Facebook was quibbling about whether the information used by Cambridge Analytica to target voters in the lead-up to the 2016 election was obtained in a data “breach” or through fraudulent means, I decided to check my privacy settings.
Since creating a Facebook account in 2006, I have associated my account with 100 separate third party apps. Besides common ones like Spotify, Venmo, and Uber, I have given access to my account to apps like “Typing Maniac,” “I bet I can guess your favorite color,” and “Crazy Cabbie,” among others. Many of these apps are only a faint memory to me. According to these settings, however, lots of these apps still have access to the same information Cambridge Analytica used to target Facebook users with political ads that helped Donald Trump win the 2016 presidential election.
Typing Maniac—a game I vaguely remember from college—has access to my public profile, my friend list, my relationship status, my “relationship interests,” my birthday, my work history, my status updates, my education history, my events, my hometown, my current city, photos I’m tagged in and that I’ve uploaded, my religious and political views, my videos, my website, my personal description, and my “likes.”
I do not know what MetroGames—which has gone out of business—ultimately did with my data. But between the hundred apps I’ve given access to my Facebook, there’s a good chance at least one of them has sold my information to data brokers, and that it was ultimately used to target me on Facebook and elsewhere. Facebook did not respond to a request for comment for this article.
“You have to proceed on the assumption that this information has been extracted from you,” Woodrow Hartzog, author of Privacy’s Blueprint: The Battle to Control the Design of New Technologies, told me on the phone. “Cambridge Analytica used an information extraction technique that was well-known to technologists for years. The implications of this debacle is about crystallizing the threat about how dangerous the information ecosystem is.”
The point I am trying to make is that, though Cambridge Analytica’s specific use of user data to help a political campaign is something we haven’t publicly seen on this scale before, it is exactly the type of use that Facebook’s platform is designed for, has facilitated for years, and continues to facilitate every day. At its core, Facebook is an advertising platform that makes almost all of its money because it and the companies that use its platform know so much about you.
Facebook continues to be a financially successful company precisely because its platform has enabled the types of person-specific targeting that Cambridge Analytica did.
Though Facebook technically asks apps not to sell user information to data brokers, researchers say that enforcement is rare and that many fly-by-night companies have historically popped up, gathered information, and then sold that data (or sold the entire company itself) to marketing firms and data brokers. It would be difficult for Facebook to even know this was happening, and many companies were simply kicked off Facebook only after they had already harvested the data.
The way Cambridge Analytica obtained the data of millions of Americans has been done—and in many ways is still being done—by many different data broker firms throughout the United States and the world. The same data that is used every day to sell shoes and meditation apps is used to sell us political candidates, too. Whether Cambridge Analytica misused the data or obtained it in some nefarious way is beside the point—thousands of apps have been used to gather information about millions of Facebook users, and there is little stopping them from selling that data to other people.
“I think there should be concern about other apps engaging in digital strip mining,” Hartzog said. “They create a generic type of app that might be somewhat appealing to users, they get as much information as they can, and they get out. There’s a huge incentive to get the data, and there’s not much penalty for doing it. Not from Facebook and not within our legal structure.”
Cambridge Analytica got much of its data from a Facebook personality test called “thisisyourdigitallife” that was much like the games I and millions of other people signed up for years ago. These types of apps were Trojan horses designed to gain access to profiles that could then be resold.
This was not incidental; it was the business model of the apps. I ultimately don’t know if MetroGames sold my data, but many similar apps were created and eventually purchased by marketing firms and data brokers.
In this way, companies were not technically breaking their privacy policies—by purchasing Facebook app companies, the “third party” data brokers suddenly became the first-party owners of the data. For example, a data analytics company called RapLeaf was caught purchasing user information from various Facebook apps. Facebook eventually shut down RapLeaf, but the company simply sold itself to two data brokers, Acxiom and Tower Data. Today, Acxiom is a “Facebook Marketing Partner,” which allows would-be advertisers to supplement the information they already have with the data that Acxiom has already gathered.
“Once Facebook created the [data sharing] API, there was a land grab,” Christo Wilson, a researcher at Northeastern University who has studied Facebook advertising and third-party data sharing tactics, told me on the phone. “People were writing apps manically—photo of the day, quizzes, any sort of garbage people would install so you would have a big user base to take the data. It was a huge market back in the day. And there were absolutely cases where people developed Facebook apps for fun, got a huge user base, and then sold the apps for hundreds of thousands of dollars to a marketing company.”
Granting an app access to your Facebook profile gave it a “token” that could then be used—in many cases indefinitely—by the app to access your data, according to Wilson.
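Wilson’s description of the token mechanism can be sketched roughly like this. This is a hedged illustration, not Facebook’s actual client code: the endpoint shape follows Facebook’s public Graph API conventions, but the token value and field list below are invented for the example.

```python
# Sketch: once a user installs an app, the app holds an access token it can
# reuse -- in many cases indefinitely -- to re-fetch the user's profile data.
# The token string and field names here are hypothetical.
from urllib.parse import urlencode

GRAPH_BASE = "https://graph.facebook.com/v2.0"

def build_profile_request(user_token: str, fields: list) -> str:
    """Return the URL an app would fetch to read a user's profile.

    As long as the stored token remains valid, the app can repeat this
    request at any time -- the user only granted access once, long ago.
    """
    query = urlencode({"fields": ",".join(fields), "access_token": user_token})
    return f"{GRAPH_BASE}/me?{query}"

url = build_profile_request(
    "hypothetical-long-lived-token",  # saved when the user first installed the app
    ["id", "birthday", "hometown", "likes", "relationship_status"],
)
```

The point of the sketch is that nothing in this flow requires the user to be present, or even to remember the app exists.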
Wilson said that in the course of their research, he and colleague Alan Mislove created apps to determine what types of information could be harvested by data brokers and advertisers; the fact that “thisisyourdigitallife” was nominally created for academic research has no bearing on what data was collected or how it was ultimately used. Thousands of other apps have harvested the same data.
“You told us you had 100 apps installed—you’re not the only one,” Wilson said. “I have restrictions on my behavior because of things like institutional review, but 99.9 percent of these apps were commercial for the purposes of monetizing people.”
An API change in 2015 made data sharing more restrictive—Facebook no longer allowed apps to take your data simply because your friend had installed an app—but third party apps could still collect data directly from your profile if you gave them permission. And because of the way Facebook’s advertising products work, once data was taken, it remained valuable to data brokers, who collate and compile information for advertisers. Much of the damage is already done.
Data brokers create databases of information that’s pulled from Facebook and elsewhere. And once taken off the platform, data can be analyzed and re-imported later to be used to target people. Facebook allows advertisers to upload .CSV spreadsheets of information about people and use that information to specifically target people using its “Custom Audiences” product. That data can then be further used to target “Lookalike Audiences” on Facebook—people that data brokers might not know much about but who Facebook itself does know a lot about, according to Mislove.
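To make the Custom Audiences step concrete, here is a minimal sketch of how an uploaded list is typically prepared. Facebook’s upload format matches people on hashed identifiers (SHA-256 of normalized values); the helper names and email addresses below are invented for illustration.

```python
# Sketch: turning a data broker's offline list into a Custom Audience
# upload. Identifiers are lowercased, trimmed, and SHA-256 hashed before
# going into the CSV. The addresses here are made up.
import csv
import hashlib
import io

def normalize_and_hash(email: str) -> str:
    """Lowercase, trim, and SHA-256 hash an email address."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

def build_audience_csv(emails: list) -> str:
    """Write a one-column CSV of hashed emails, ready to upload."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["email"])  # column header the matcher keys on
    for email in emails:
        writer.writerow([normalize_and_hash(email)])
    return buf.getvalue()

csv_text = build_audience_csv(["Jane.Doe@example.com ", "user@example.org"])
```

Once a list like this is matched, the advertiser can ask for a “Lookalike Audience” of people who resemble those on the list but whom the advertiser knows nothing about.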
Facebook’s partnerships with data brokers like Acxiom and Experian allow advertisers to leverage what Facebook knows about you with what data brokers know about you to create a more complex profile of who you are.
Together, Facebook and data brokers have made a list of things that it thinks I like, based on my likes, my apps, my web behavior, and, crucially, what data brokers already know about me. It also creates a rough profile of who I am.
“If I was Cambridge Analytica, I would have taken 50 million users worth of data, crunched the numbers and figured out who was susceptible to influence,” Mislove said. “Then I’d upload that data back to Facebook as a custom audience and ask Facebook for lookalikes of those people, who are new people I didn’t know about but who are susceptible.”
When I signed up for these hundred apps years ago, I generally had two choices: Accept the terms, or not play the game. In recent years, Facebook has allowed for greater, app-specific privacy setting control, but the fact remains that Facebook makes its money by selling advertisements, and those advertisements are more valuable to companies—and Facebook—because Facebook and the companies it works with can target you, specifically.
Though Facebook has made a few changes to improve its privacy controls, that basic calculus has not changed, and the old adage still applies: If you’re not paying for something, you are the product.
“The incentive is to extract every iota of value out of users,” Hartzog said. “The service is built around those incentives. You have to convince people to share as much information as possible so you click on as many ads as possible and then feel good about doing it. This is the operating ethos for the entire social internet.”