Facebook Promised Transparency About User Experiments, But a Trust Gap Remains

The changes to Facebook's research policies are unlikely to quell longstanding fears about the company's respect for user privacy.
Mark Zuckerberg at this year's F8 conference. Image: Maurizio Pesce/Flickr

Facebook vowed on Thursday to change the way it conducts research on its users, but the social network's statement fell short of the reforms that critics have called for after a furor over the company's manipulation of user emotions.

More broadly, the new changes, which were announced in a company blog post, are unlikely to quell longstanding fears about Facebook's respect for user privacy.

For years, Facebook has pushed the boundaries of user privacy to amass a gargantuan database of its users' preferences, the better to target them with advertising—which, after all, is Facebook's business. Most Facebook users are aware of this, or should be.

But most Facebook users were not aware that in 2012, the social network conducted research on "emotional contagion" by subtly tweaking the news feeds of nearly 700,000 users without informing them.

The goal of the research was to see what effect weighting news feeds with either "positive" or "negative" posts had on users' emotions. The company found that users who were exposed to positive posts felt happier, a result that contradicted earlier research suggesting the opposite.
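
Conceptually, the design resembles an ordinary A/B test over feed ranking: classify candidate posts by sentiment, randomly withhold posts of one sentiment from a treatment group, and later compare the sentiment of that group's own posts against a control group. The sketch below is purely illustrative; none of the names or probabilities come from Facebook's systems (though the published study did classify posts with the LIWC word-counting software).

```python
import random

# Hypothetical sketch of a sentiment-weighted feed experiment. All names
# and values here are illustrative assumptions, not Facebook's actual code.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def classify_sentiment(post):
    """Crude stand-in for a dictionary-based classifier such as LIWC."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def build_feed(candidates, suppress, omit_prob):
    """Withhold each post of the targeted sentiment with probability omit_prob."""
    feed = []
    for post in candidates:
        if classify_sentiment(post) == suppress and random.random() < omit_prob:
            continue  # the post is silently omitted from this user's feed
        feed.append(post)
    return feed

# Example: a treatment-group user whose feed is tilted away from negative posts.
posts = ["what a wonderful day", "traffic was awful", "lunch was great"]
print(build_feed(posts, suppress="negative", omit_prob=0.9))
```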

In June, Facebook published the results in the Proceedings of the National Academy of Sciences, and immediately faced a strong reaction from users who objected to the company's manipulation of their emotions. The release of the results prompted a wide-ranging discussion among academics, privacy experts and the general public about the propriety—and even legality—of the research, and rekindled fears about Facebook's privacy policies.

One prominent law professor accused the company of breaking the law, and the Electronic Privacy Information Center filed a complaint with the Federal Trade Commission.

On Thursday, Facebook acknowledged that it had mishandled the research project.

"Although this subject matter was important to research, we were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism," Facebook's Chief Technology Officer Mike Schroepfer said in a company blog post. "It is clear now that there are things we should have done differently."

Despite the contrite tone, Schroepfer did not apologize for the research experiment, and in fact made clear that the company has every intention of continuing to conduct research on its users. Nor did he say that Facebook would enhance its efforts to obtain "informed consent" from users before experimenting on them.

Schroepfer did say that Facebook would modify its research policies, but it's unclear if those modifications are designed to address the substantive concerns of privacy advocates, or merely to avoid another public relations black eye.

Schroepfer said that Facebook researchers "studying particular groups or populations (such as people of a certain age)" or pursuing work related to "content that may be considered deeply personal (such as emotions)" would be subjected to "an enhanced review process before research can begin."

Schroepfer also announced the creation of a panel "including our most senior subject-area researchers, along with people from our engineering, research, legal, privacy and policy teams, that will review projects falling within these guidelines." And he said that Facebook would incorporate "education on our research practices" into the company's six-week training program, and add a section on research into the company's annual privacy and security training.

Facebook users who were hoping that the company would enhance its efforts to obtain informed consent for future research would have been disappointed by Schroepfer's statement. When the study was released, Facebook said that its research was "consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research."

But that standard of informed consent is not sufficient for many of Facebook's critics, who were hoping that the company would announce more robust policies designed to obtain consent from users before experimenting on them.

"Tightening up research practices is a step in the right direction," Marc Rotenberg, president of the Electronic Privacy Information Center (EPIC) told the New York Times. "But human subject research requires consent and independent review. It does not appear that Facebook has taken those steps."

EPIC's complaint with the Federal Trade Commission accuses the company of deceptive trade practices, as well as of violating a 2012 FTC consent order that requires Facebook to obtain users' express consent before sharing their data with outside parties.

Screenshot from EPIC's complaint against Facebook

Informed consent, or lack thereof, is at the heart of the criticism leveled at the company by James Grimmelmann, a University of Maryland law professor who has blasted Facebook over the experiment.

"This is a positive development on oversight, but Facebook still hasn't given an inch on informed consent," Grimmelmann told the Washington Post. "As long as this panel says OK, Facebook will still feel free to manipulate its users in the name of science. I hope that other companies emulate Facebook's commitment to better training and consistent policies, but there's still a long way to go."

Last month, Grimmelmann wrote a blog post asserting that Facebook's experiment violated Maryland state law by running afoul of what's known as the "Common Rule," a federal research-ethics standard that Maryland law extends to all research conducted in the state and that requires informed consent for "all research involving a human subject."

Facebook failed to obtain informed consent, according to Grimmelmann. "Informed consent under the Common Rule means telling participants about the research," he wrote. "It means warning them about the risks. It means giving them a chance to opt out without penalizing them if they do. It means giving them a chance to ask follow-up questions to someone who'll provide answers."

Grimmelmann's post prompted a lively debate about the legality of Facebook's research. (Techdirt's Mike Masnick, who disagreed with Grimmelmann's analysis, has a good discussion of the various legal issues at stake.) But you don't need to be a lawyer or a legal scholar to see that Facebook's efforts to obtain informed consent in this case fell short by any common-sense definition. And the company gave no indication in its Thursday statement of any willingness to increase those efforts moving forward.

As a result, Facebook's statement only reinforces longstanding fears about the company's handling of user data. Facebook may bolster its training and oversight programs, and enhance its research review process, but Facebook members should be under no illusions: the company will continue to experiment on its users' accounts. And in all likelihood, those users will not be aware of those experiments, let alone be in a position to provide consent for them.

Put simply, companies are constantly experimenting on their products in order to improve their appeal. Facebook is no different, except that in its case, the company conducts experiments on its users, because, after all, its users are the product.