Why Election Interference Campaigns on Facebook Are Still Working

“If they’re not caught, it leads to action in the real world. If they’re exposed, they’ve already undermined trust in the conversation.”

Aug 1 2018, 7:22pm


On Wednesday, Facebook revealed it had discovered what it claimed were inauthentic actors operating on its platform. Before telling the public, Facebook deactivated the eight pages, 17 accounts, and seven Instagram accounts. The various groups and pages targeted specific, typically left-leaning political activists, and one had even coordinated a counter-protest to the upcoming Unite the Right 2 rally in Washington, DC with five other legitimate pages.

Facebook was unwilling to definitively attribute the disinformation campaign to Russian hackers, but the tactics are similar to those used by the infamous Internet Research Agency (IRA), the St. Petersburg-based troll factory thought to be responsible for many of the Kremlin-backed disinformation campaigns that helped sway the 2016 election. One account linked to the IRA had administrative access to one of the now-disabled pages for seven minutes.

It fits a pattern typical of Russian disinformation campaigns: Moscow is more interested in sowing division and distrust than it is in supporting any particular political movement. A day before it killed the pages, Facebook alerted the Atlantic Council’s Digital Forensics Research Lab (DFRL) to them. On short notice, the team at DFRL scrambled to find out as much as it could, and published a report on its findings shortly after Facebook’s press conference.

“Frankly, 18 hours is a tough amount of time to have a high confidence assessment of anything in terms of disinformation,” Graham Brookie, Director and Managing Editor at DFRL, told me over the phone. Brookie knows what he’s talking about. Before joining the Atlantic Council, he spent four years on the National Security Council. Before that, he was an adviser to the President’s Homeland Security Advisor Lisa Monaco.

Like Facebook, DFRL isn’t ready to definitively say that Russian hackers are behind these pages, but the similarities are there. “The pages that we analyzed exhibit patterns of behavior and activity that correlate very highly with that of the Russian troll farm that was operating from St. Petersburg from 2014 to 2017,” he said. “That’s not attribution, that’s correlation at best…they’re getting more sophisticated and harder to detect.”

That correlation is strong, though. The tactics are similar to those used by known Russian hackers in previous years, and the pages are full of language quirks specific to native Russian speakers. Colloquial language problems also betrayed DNC hacker Guccifer 2.0, who claimed to be Romanian but had a poor command of the language.

This round of alleged Russian political interference differs from previous attempts, though. The 2016 efforts were broad, aimed at spreading misinformation to as many people as possible. This time, the hackers were very specific.

“They were all focused on audience engagement for specific demographics,” Brookie said. The five pages Facebook removed included one dedicated to health and well-being, one about progressive politics, one dedicated to organizing resistance against Trump, one targeted at African Americans with a focus on Egyptian mythology, and one targeted at Latin Americans.

More than 290,000 Facebook accounts followed these pages, with Resisters—the anti-Trump activist page—being the most popular. “What we saw in the case today was more tailored to specific demographics, likely more tailored to the specific political climate and the specific political cycle,” Brookie said. He explained that midterms are fertile ground for activist groups and that the Resistance movement has changed the political climate, forcing the groups that want to meddle in America’s political process to change their tactics.

Brookie said this new method involves driving user engagement for a specific demographic to a specific page, then creating a call to action once the page reaches a critical mass of followers.

“So you have audience growth online, then translating that to activity in the real world,” he said. “In this case a counter-protest, and that’s significant.”

One of the things that makes disinformation campaigns like this so dangerous is that even when the hackers lose, they win.

“If they’re not caught, it leads to action in the real world,” Brookie said. “In this case, a counter protest that might lead to violence based on what we saw last year in Charlottesville. If they’re exposed, they’ve already undermined trust in the conversation we’re having right now. So in both those scenarios, they win.”

The Resisters page had partnered with five other activist groups to organize a protest against the upcoming white supremacist rally in Washington, DC. When Facebook disabled the page, it also disabled the event, leaving the legitimate activists scrambling to quickly reorganize. “The event was created by Resisters, but was used for legitimate protest organizing and promotion,” ShutItDownDC—one of the other activist groups—said via Twitter. “Specifically, local organizers put our own messaging, graphics, and videos in it. We did not promote anyone’s views except our own.”

Funding or supporting both sides of a divisive issue is a popular method of spreading disinformation that the Kremlin has used in its own country for years. Typically, the Kremlin will organize a pro-government rally of some kind, then inform opposition groups and fund them through a third party. After the two clash, it reveals that it funded both sides. Once that information comes out, activists on both sides lose faith in their cause. It’s a quick way to create a cynical public. Recently, we’ve seen Russian trolls use the same tactic in the US—most famously to organize a protest and counter-protest about Muslims in Houston, Texas in 2016.

It’s a new world and we’re only beginning to understand the way hostile foreign powers can use social media to manipulate a rival country’s politics.

The more educated the public becomes about the specific methods used by Russian trolls, the less effective those methods will be. The tactics are changing, but it seems the targets will always be the places where people argue the most and are most willing to believe the worst about the opposition. “That’s the point of disinformation,” Brookie said. “They’re inserting themselves into a very real and organic conversation.”