Facebook's Filter Bubble Is Getting Worse


New changes to the company's algorithm will make the social network even more of an echo chamber.

Hi there. If you're reading this piece, consider yourself lucky. Don't take it for granted. Especially if you're among the 62 percent of American adults who, according to a recent Pew Research poll, get their news from social media, and you usually find our posts on Facebook.

Facebook announced earlier this week that it will change the algorithm used to decide what every single user sees on their timeline. "Facebook was built on the idea of connecting people with their friends and family," Lars Backstrom, engineering director at Facebook, said in a statement. "Our top priority is keeping you connected to the people, places and things you want to be connected to — starting with the people you are friends with on Facebook. That's why today, we're announcing an upcoming change to News Feed ranking to help make sure you don't miss stories from your friends."


Now, why should we care about this announcement, and how are the first two paragraphs of this piece related?

The changes announced by Facebook will mainly affect one of the most important metrics for brand- and publisher-owned pages: "reach." This metric shows how many users will be shown a given post, in other words, how many users see that single update in their timeline.

Worse and worse

That premise brings us to the point. The condition this policy creates is often called a "filter bubble."

Social networks that use algorithms to decide which updates are most relevant to each user tend to gradually serve that user content that aligns with their established interests and opinions.
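To see why relevance ranking behaves this way, here is a minimal, hypothetical sketch in Python. It is not Facebook's actual system; the scoring rule, the "viewpoint" scale, and the like probability are all invented for the example. Posts that resemble what the user liked before score higher, the user mostly likes what they are shown, and the feed narrows over successive refreshes.

```python
import random

# Toy model: each post carries a "viewpoint" score in [-1, 1]
# (e.g. -1 = strongly Remain, +1 = strongly Leave). The feed ranks posts by
# predicted engagement, which here is just similarity to the user's history.

def predicted_engagement(post_viewpoint, user_history):
    """Score a post by how closely it matches what the user liked before."""
    if not user_history:
        return random.random()  # no signal yet: rank roughly at random
    mean_taste = sum(user_history) / len(user_history)
    return 1.0 - abs(post_viewpoint - mean_taste)  # closer taste -> higher score

def refresh_feed(candidate_posts, user_history, feed_size=5):
    """Show only the top-ranked posts, as a relevance-sorted feed would."""
    ranked = sorted(candidate_posts,
                    key=lambda p: predicted_engagement(p, user_history),
                    reverse=True)
    return ranked[:feed_size]

user_history = [0.6]  # the user once liked a mildly "Leave" post
for day in range(10):
    candidates = [random.uniform(-1, 1) for _ in range(50)]  # a balanced pool
    feed = refresh_feed(candidates, user_history)
    user_history.extend(p for p in feed if random.random() < 0.5)  # like some of it
    print(f"day {day}: average viewpoint shown = {sum(feed)/len(feed):+.2f}")
```

Even though the candidate pool stays balanced, the average viewpoint shown drifts toward the user's existing taste within a few refreshes. That feedback loop, not any editorial intent, is the filter bubble.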

Take the recent media frenzy around Brexit, the controversial vote in the UK over whether to leave the European Union.

One pro-"Remain" Facebook user explained how hard has been for him to find posts from the opposing side. On the day the "Leave" campaign won, he looked for Facebook posts celebrating for the win—and came up short.


It wasn't only his News Feed: he also tried Facebook's search function, to no avail. It wasn't that there were no posts about how great the Leave victory was. It was that Facebook, having identified him as a Remain voter, just wasn't allowing him to see them.


"An appeal to everyone I know who works at Twitter, Facebook, Google etc, and for the people who influence them," Tom Steinberg wrote on June 24, 2016.

It's not just the opinions expressed in posts, but also where they're coming from. Facebook has a double interest here: on one hand, it needs to be able to charge publishers who want more exposure; on the other, it needs to boost the number of user interactions on the social network.

This mostly affects content published by Facebook pages. The owners of these pages are finding it harder and harder to reach the users who liked them. The only solutions are to pay Facebook to boost their posts or to create content designed to be shared. "If a lot of your referral traffic is the result of people sharing your content and their friends liking and commenting on it, there will be less of an impact than if the majority of your traffic comes directly through Page posts," Backstrom explained in the announcement.
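As a rough illustration of the distinction Backstrom is drawing, a page's effective reach can be thought of as heavily discounted direct distribution plus whatever its content earns through shares and paid boosts. The weights below are invented for the example, not Facebook's real ones.

```python
def estimated_reach(followers,
                    organic_rate=0.05,      # assumed fraction of followers shown a direct Page post
                    shares=0,
                    friends_per_share=200,  # assumed average audience per share
                    share_passthrough=0.10, # assumed fraction of that audience actually shown it
                    boosted_reach=0):       # audience bought by paying Facebook to boost the post
    """Toy estimate of how many users see a Page post under friend-first ranking."""
    direct = followers * organic_rate                        # direct Page distribution, discounted
    viral = shares * friends_per_share * share_passthrough   # distribution earned through sharing
    return int(direct + viral + boosted_reach)

# A page that relies on direct posts loses far more reach than one whose
# audience actively shares its content, or one that pays to boost posts.
print(estimated_reach(followers=100_000, shares=0))                # direct-only page
print(estimated_reach(followers=100_000, shares=500))              # share-driven page
print(estimated_reach(followers=100_000, boosted_reach=20_000))    # paid boost
```

Under these made-up numbers, the direct-only page reaches 5,000 of its 100,000 followers, the share-driven page 15,000, and the boosted page 25,000, which is why pages end up either paying or chasing shareable content.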

How Facebook decides what posts to show users. Image via TechCrunch

This dynamic is what leads to the filter bubble. Facebook is becoming an echo chamber that prevents us from being confronted with opinions we don't agree with.

Raising awareness

This is about more than the news media's bottom line. A public inside the filter bubble will find itself enclosed in a shrinking infosphere, cut off from people who aren't like them.

Facebook says its new algorithm changes will emphasize posts that are "informative." Do you want to see information that is one-sided, or multi-faceted? Can you imagine a filter-bubbled newsstand? If you're a left-wing person, no right-wing magazines for you. Are you with Trump? There's a good chance your favorite newsstand will never tell you anything about Bernie Sanders.

Perhaps we should now think of social networks as true social services, a public good, not simply a fun product from a company. The recent scandal over Facebook's "Trending" section, which was more influenced by human curators than the company let on, showed that people are starting to think about the ways in which news reaches them.

We're still grappling with all the implications of a public that gets its news from social media, and there is a long way to go before we understand it. But first of all we should remind ourselves that the "filter bubble" exists, and it's getting worse.

This article was originally published on Motherboard Italia.