
Facebook Is Using Bad Free Speech Arguments to Defend InfoWars

In a series of tweets, Facebook told reporters that conspiracy theories are a matter of free speech.
InfoWars founder Alex Jones. Image: Wikimedia Commons

As Facebook doubles down on thwarting the spread of disinformation on its website, recent tweets from the company’s official Twitter account belie its promise to be better at moderating specious content shared by Pages.

At a press event hosted by Facebook’s New York office this week, reporters questioned John Hegeman, the head of News Feed, and Sara Su, a product specialist for News Feed, about the company’s plan to stop hoaxes and conspiracy theories from propagating on Facebook.


According to a report on Wednesday from CNN’s Oliver Darcy:

When asked by this reporter how the company could claim it was serious about tackling the problem of misinformation online while simultaneously allowing InfoWars to maintain a page with nearly one million followers on its website, Hegeman said that the company does not "take down false news."

Hegeman continued: “I guess just for being false that doesn't violate the community standards. [And InfoWars has] not violated something that would result in them being taken down.”

When Darcy later tweeted the story, Facebook replied, citing a counterargument embraced by the right: that moderating problematic content is a matter of free speech, and that taking down conspiracy theories would violate an ever-moving yet supposedly inviolable boundary of what is and isn’t protected by the First Amendment.

“Facebook’s choice has nothing to do with speech,” Whitney Phillips, co-author of The Ambivalent Internet: Mischief, Oddity, and Antagonism Online, told Motherboard on the phone. Phillips also recently published an in-depth report with Data & Society titled The Oxygen of Amplification, which offers better practices for reporting on far-right extremists, antagonists, and manipulators.

“They can make any choice they want on their own terms. Framing it in terms of free speech is a cop-out to the extent that it’s so abstract,” Phillips added, noting that Facebook may have used the words “free speech” colloquially to mean freedom of expression, and not to invoke the First Amendment.


The company was equally vague in its use of whataboutism to say that both sides, left and right, spread misinformation. This type of argument was espoused by InfoWars’ Alex Jones when he launched a smear campaign against victims of the Sandy Hook Elementary School mass shooting, calling them “crisis actors,” and saying the tragic incident was “a 'false flag' orchestrated by anti-gun groups.” Both conspiracy theories are categorically untrue, yet circulated widely on Facebook and other platforms like Twitter and YouTube.

(Jones and companies related to InfoWars are now the subject of a defamation lawsuit filed by the families of several Sandy Hook victims.)

“I can’t even assess that claim because I don’t know who [Facebook is] talking about,” Phillips said. “[Are they saying there is] a dangerous, conspiracy theory hoax website on the left that’s comparable to InfoWars? We’re not talking about that, we’re talking about InfoWars.”

In its tweets, Facebook defended its decision to demote false posts rather than remove them. Users are now able to report news posts as “false,” and the company is working with third-party fact-checkers to determine a story’s accuracy.

Nobody knows if this strategy will work, and Facebook hasn’t published any data showing that it does. Some have asked why it doesn’t outright ban Pages that spread false information, to which experts say: It’s complicated.

“If they start banning Pages, they’ve taken on the role of publisher, as opposed to platform or technology company,” Phillips said. “Think about the corporate reasoning behind that. ‘If we admit that we’re publishers, that means we could be legally responsible for anything that gets published to Facebook, ever.’ On no plane of reality is any team of lawyers going to let that happen.” (While Phillips doesn’t agree with all of Facebook’s decisions, she tries to understand the corporate motivations behind them.)


The disappointing reality is this: No Goldilocks solution exists right now, and Facebook’s massive scale (2.2 billion monthly active users) probably prevents otherwise good remedies, like content and Page demotion, from working. It’s doubtful that a capitalist Facebook can be an ethical Facebook, serving the interests of the people. “The better Facebook does, the worse it is for the world,” Phillips added.

What’s certain, however, is that Facebook’s modus operandi, to assess bad content on a case-by-case basis, doesn’t work well. Subjectively handling content opens the company to accusations of bias. (“Rules, if they are to be at all actionable, have to be consistent and explicit,” Phillips said.) And as Motherboard previously reported, Facebook’s guidelines for moderators have been troubling at times, allowing versions of white supremacist ideology to flourish on its platform.

Facebook seems unwilling to take responsibility for its own mess. In one tweet, the company wagged its finger at YouTube and Twitter for hosting the same content, a “they do it too” deflection most often used by unrepentant toddlers.

But it’s true that platforms are terrified of looking biased toward conservatives.

During his congressional testimony this year, Facebook CEO Mark Zuckerberg fielded numerous questions about the company’s alleged censorship of conservative content. Senator Ted Cruz excoriated Zuckerberg for what he called “a pervasive pattern of bias and political censorship,” in part because Facebook had deemed the Page of conservative YouTubers Diamond and Silk “unsafe.” (Facebook apologized to them in an email.)


“If big conservative corporate media systems were to turn on Facebook, that could potentially have a big impact on how Facebook does business,” Phillips said. “There’s a lot of money in that echelon of conservative media.”

Indeed, Facebook has long tried to control the optics of seeming anti-conservative. The company is consulting conservative groups on the matter of bias, as it has in the past.

Will Facebook ever be Not Bad? Who knows. Probably not. Maybe Facebook should nationalize and become Facebook.gov. Maybe Facebook shouldn’t even exist.

But in the immediate future, Facebook could do one thing to engender a little goodwill: simply be more transparent, with journalists and everyone else. For a digital behemoth, Facebook reveals shockingly little about the inner workings and mechanisms behind its ubiquitous website. And if the Cambridge Analytica saga taught us anything, it’s that Facebook won’t discuss its shortcomings until powerful people come after it.

Lastly, it’s worth noting that Facebook’s penchant for beefing with journalists on Twitter is a relatively new development.

After the Cambridge Analytica scandal broke, the company began using its Twitter account to riff on news stories and to debate reporters. (The account is still mostly a customer service channel, though.) As Business Insider noted, this signaled an interesting change in Facebook’s notoriously secretive public relations strategy, previously characterized by unhelpful blog posts and exclusive interviews with prestige media outlets.

Facebook did not respond to Motherboard’s questions about who writes these tweets, and why Twitter is an appropriate place to selectively argue with journalists.

It did respond to Digiday with a single, bizarre emoji: 😱