Is Facebook Lying?

Comparing the company's public statements about its Trending news module to its internal guidelines.

Did Facebook lie publicly about how it determines which stories to put in its trending news module?

The company has been under intense fire since a Gizmodo investigation earlier this week alleged that it has a liberal news bias. In response, Facebook vehemently denied the accusations and released what it says are the current guidelines for its trending news section.

However, these guidelines, and a set of older training documents leaked to The Guardian, appear to show that Facebook lied to journalists or misrepresented the truth.


The documents appear to conflict with previous statements from the company assuring that algorithms—not humans—select what you see in your news feed, and that its curators never "inject" stories into the trending tool.

Claim #1: It's just an algorithm

The social media company has long insisted that proprietary algorithms for its trending section determine what you see, who you see it from, and when.

In a 2015 story titled "How Facebook decides what's trending," Recode wrote: "Once a topic is identified as trending, it's approved by an actual human being, who also writes a short description for the story. These people don't get to pick what Facebook adds to the trending section. That's done automatically by the algorithm."

Based on the guidelines Facebook released today, this appears to be out-and-out false. (Recode has since updated its story.) While the stories that Facebook's curators are allowed to select from are chosen by an algorithm, it's humans who pick what stories to put into the module, based on a variety of factors including whether the story is a hoax (keep it out) or the story is breaking news (put it in).


Here's how it works, according to public statements from Facebook before today: topics are identified as trending by the algorithm, then approved by news curators, whose only role is to write a brief description of each one. Choosing what's added to the trending module is done automatically by an algorithm. Editors "just get to pick the headline," Facebook told Recode.


Here's how it really works: editors are instructed to sift through "Live" topics surfacing in several of Facebook's algorithmically sorted backend news feeds. Primarily, stories are selected from the "Review Tool," which appears to aggregate the biggest stories of the day, or the topics preferred by Facebook's algorithms.

A secondary feed, called the "Demo Tool," reflects topics that are trending based on the content users are sharing and discussing on Facebook. This tool may include "blacklisted" or deactivated stories: topics that have been suppressed due to insufficient "credible" news coverage or, allegedly, because a curator felt there was good reason to do so. When suppressed topics reappear, editors see them with a strikethrough. When a curator sees a story in the Demo Tool they wish to include in Facebook's trending module, they can add it to the trending section if they feel it complies with editorial guidelines for timeliness, relevance, and credibility.

Curators are also advised to manually "un-blacklist" these items. In its instructions, Facebook wrote that it "will track these instances so the engineers can fix for the future." It's unclear whether flagging topics this way can influence algorithmic biases.
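To make the division of labor concrete, here's a rough Python sketch of the workflow the documents describe. The names and structure are our own illustration, not Facebook's actual code: the algorithm only fills the candidate feeds, while a human makes the final call.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Topic:
    name: str
    source: str                # e.g. "review_tool" or "demo_tool"
    blacklisted: bool = False  # suppressed items reappear with a strikethrough

def curate(candidates: list[Topic],
           meets_guidelines: Callable[[Topic], bool]) -> list[Topic]:
    """One hypothetical curator pass over the algorithmically built feeds."""
    trending = []
    for topic in candidates:
        if topic.blacklisted:
            # Curators can manually revive suppressed topics; per the
            # guidelines, engineers track these cases "for the future."
            topic.blacklisted = False
        # The final call is editorial judgement (timeliness, relevance,
        # credibility), not the algorithm.
        if meets_guidelines(topic):
            trending.append(topic)
    return trending
```

However you fill in meets_guidelines, the point stands: the algorithm proposes, a human disposes.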

Claim #2: Curators don't "inject" stories

Whether Facebook lied about this depends on how charitable you're feeling, but the company's statements before and after the Gizmodo story are definitely misleading.


In a statement, Facebook's Vice President of Search Tom Stocky wrote, "We do not insert stories artificially into trending topics, and do not instruct our reviewers to do so."

But according to the internal guidelines that Motherboard reviewed, Facebook's editorial team can "inject" a topic in two scenarios: if a topic is appearing twice, or if a topic is breaking news and appearing in the Demo tool but has not yet hit the Review Tool.


This was also confirmed to Motherboard by a source who used to work on the trending news team but declined to speak on the record due to a non-disclosure agreement.

Stocky's statement is very carefully worded, and hinges on the word "artificially." He may be referring to the fact that curators are not allowed to inject a story that did not appear in any of Facebook's backend tools—even though they are allowed to "inject" stories into the module when the topic is breaking and important, as determined by their editorial judgement and that of their superiors.
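Put in pseudo-logic, the guidelines we reviewed allow injection in exactly two cases, and both require the topic to have already surfaced in a backend tool. The function below is our hedged reading of those rules, not anything from Facebook's codebase:

```python
def may_inject(appears_twice: bool, is_breaking: bool,
               in_demo_tool: bool, in_review_tool: bool) -> bool:
    """Our reading of the two injection scenarios in the guidelines."""
    if appears_twice:
        return True  # consolidate a topic that is appearing twice
    if is_breaking and in_demo_tool and not in_review_tool:
        return True  # breaking news the Review Tool hasn't surfaced yet
    # Note what's missing: no path exists for a topic that never appeared
    # in any backend tool, which may be the narrow sense in which nothing
    # is inserted "artificially."
    return False
```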

Claim #3: User preferences are a factor in Facebook Trending

When the module first launched in early 2014, Facebook told TechCrunch it would be tapping into its bounty of data on what individual users like, and who they interact with most on the platform, to inform what shows up in their trending sections.

Similarly, Facebook told TheNextWeb that trending stories are "based on topics or pages that you're interested in, as well as keywords that are trending worldwide."


Facebook explicitly stated to the New York Times that trending items would appear to users based on four criteria: a person's own interests, "the authority of the people commenting on the topic, how recently the topic has surged on Facebook and how much users are engaging with it."

These statements would imply that Facebook's trending section shows users what they want to see, and what other people are currently interested in. That's true to an extent, but individual preferences appear to be a much smaller factor than these early statements from Facebook would suggest. The platform's algorithms can detect what users are chattering about, and that might bring topics to an editor's attention, but the trending module's customization only comes into play once stories have already been selected. Curators select potentially popular topics to highlight, and Facebook's ranking algorithms determine which of them show up in your feed.

Example of a trending topic page.

Facebook does rank trending stories for each individual user based on their likes and dislikes; that determines the order of the 10 automatically selected stories each user sees. But curators tag stories with keywords once they've been selected to appear in Facebook Trending, and those tags are how Facebook matches stories to individual users. Based on the guidelines Facebook shared today, individual preferences do not appear to affect whether a story qualifies as trending in the first place, at least as the module runs now.
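If the guidelines are accurate, personalization is an ordering step, not a selection step. Here's a minimal sketch of that distinction; the types and tag-matching scheme are our assumptions, since Facebook hasn't published the actual ranking logic:

```python
from dataclasses import dataclass

@dataclass
class CuratedStory:
    headline: str
    keywords: list[str]  # tags curators attach after a story is selected

def rank_for_user(pool: list[CuratedStory], user_interests: set[str],
                  limit: int = 10) -> list[CuratedStory]:
    """Order an already human-selected pool for one user.

    A user's likes decide the order of roughly ten stories, by matching
    curator-applied tags against their interests. They never decide
    whether a story qualifies as trending in the first place.
    """
    def overlap(story: CuratedStory) -> int:
        return len(set(story.keywords) & user_interests)
    return sorted(pool, key=overlap, reverse=True)[:limit]
```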


So, did Facebook lie?

There's nothing inherently wrong with editorial judgement; newsrooms would cease to function without it. But it's another matter when a social media company that has gradually become the internet for millions of its users—a walled garden for what people see and consume online—starts to conduct itself like a news business while pitching itself as something different.

On its own help page for trending topics, the company wrote that stories are based on "a number of factors including engagement, timeliness, Pages you've liked and your location." Nowhere on the page does it say anything about human curation.

Furthermore, Facebook's repeated references to the dominance of its "algorithms" give the impression that the stories in the trending module are chosen by robots and hard data alone—when that's just not true.

There's also the fact that even algorithms encode human bias, since they're written by human engineers.

The point is, Facebook wants people to think that its trending section is automatically populated based on what stories people are organically sharing and talking about. That's why it's called "Trending." We now know that human judgement played a much bigger role than people would have assumed based on Facebook's public statements.

Did Facebook lie? In a "the sky is green" way, no. But in the polished, expertly hedged, truth-with-a-spin way that corporate communications departments are so good at, yes.