Facebook's echo chamber can turn the obnoxious into the dangerous.
The filter bubbles Facebook and Twitter have allowed us to set up aren't just concerning to those who hoped social media would help break down communications barriers—they're also bothering officials at the US State Department, who fear that the silos "liking" things places us in are only going to cause more political and social unrest moving forward.
As we've reported before (and as Wired journalist Mat Honan experimented with earlier this week), what you engage with on social media plays a huge role in what you end up seeing on Facebook (and elsewhere). Like primarily conservative Republican groups and statuses, and that's all you'll see—anything more center-leaning will, more often than not, be filtered out, as if it doesn't exist at all, even if your friends are the ones posting it.
When Facebook feeds you the latest celeb gossip or BuzzFeed lists, it's perhaps innocuous, but what happens when it feeds you (and your friends) a constant barrage of propaganda, hate speech, and one-sided news stories? The argument here is, perhaps, that Facebook is giving you what you asked for (you did, after all, "like" this stuff at some point, right?)—but should that be enough?
"You start to understand why people in Iraq may see things very differently than people in Tel Aviv," Macon Phillips, the State Department's coordinator for the Bureau of International Information Programs, said at an event discussing the role technology plays in developing society in Washington, DC yesterday. "These robots make choices about their [likes] and reinforce biases. That, to me, is a little nuts. It's a huge problem that's much more subtle and scary to me than [other ways technology can be abused, like] a bunch of bots trying to rig a Twitter vote."
After the presentation, I caught up with Phillips and asked him if the State Department has had any conversations with Facebook or other tech companies about the issue, and whether it would be the government's place to ask social media companies to stop people from creating their own filter bubbles. He said that, though he'd certainly be "watching what happens very closely," there's not a whole lot he can do or say.
"I don't think it's something where it'd be appropriate for me to call any tech company and tell them, 'Hey, fix this sorting and this siloing,'" he said. "A lot of customers aren't motivated by what the government wants, they're motivated by what their users want. There's a challenge between customization and offering something that's informative."
And what its users want—or, at least, what its advertisers want—isn't to be connected to a diverse group of people throughout the world with diverse viewpoints. What Facebook's algorithms are apparently (and increasingly) determining is that people want to be siloed off with like-minded people, to be segregated from people whose viewpoints they find challenging.
That's why ReaganBook, the Facebook for conservative Republicans, died within a week of launching. Trolls may have had something to do with it, but it was doomed to fail anyway. It's simply not necessary. Facebook and its algorithms have made it so that, no matter your interest, you can automatically create your own personalized social network, and you'll never have to hear from anyone you disagree with ever again.
In a way, social networking has taken fringe groups and given them power they never had before. The small-town white supremacist or misogynist might be obnoxious on his own; armed with an internet community of like-minded individuals, he becomes outright dangerous. In that way, technology has taken the fringe and "exacerbated" the problem of extremism, Phillips said.
That's why the State Department is worried about this: Localized movements like the Islamic State have become worldwide ones, and Facebook becomes an echo chamber where an inkling of hate, or an errant like here or there, is quickly amplified into something much larger. Before you know it, before you've even realized what's happened, the only content you're being fed is content reinforcing that initial thought.