

Dakota Pipeline Protesters Claim Facebook Censored Video of Mass Arrest

Are Facebook’s accidental takedowns a symptom of its subjective view toward news?
Screenshot: Facebook/Unicorn Riot

Yesterday morning, 22 protesters were arrested at two Dakota Access pipeline sites in North Dakota, according to the media collective Unicorn Riot. The oil project has been decried by the Standing Rock Sioux tribe for destroying cultural artifacts, including ancestral burial sites. The incident was broadcast on Livestream, and the group posted a link to the stream on its Facebook page, only to find that the post had been deleted.


In a statement provided to Antimedia, Unicorn Riot claimed that "Facebook's automated censorship system blocked our video URL, shortly before two of our journalists were arrested onsite." Other Facebook users also reported being unable to share the livestream. The link has since been restored and is now visible on the page, but a spokesperson for the collective believes the livestream was censored.

"Posts and comments with the URL both immediately triggered popup security alerts. We tried putting the same URL through Bitly shortening and that official Unicorn Riot page post was deleted by Facebook within a few minutes…We also verified that the 'Facebook Debugger' warned that our live video URL violated 'community standards.'"

The livestream showed police dressed in riot gear, armed with what witnesses identified as assault rifles and pellet guns, arresting nearly two dozen protesters, including two Unicorn Riot journalists and a medic. Police had warned the unarmed protesters, or "water protectors," that trespassers would be arrested. According to Red Warrior Camp, an organization involved with the demonstration, one protester was pepper-sprayed in the face by a construction employee.

A Facebook spokesperson told Motherboard that the post was removed from Unicorn Riot's page after the platform's automatic spam filter mistakenly flagged the Livestream link.

"The link was removed in error and restored as soon as we were able to investigate. Our team processes millions of reports each week, and we sometimes get things wrong. We're very sorry about this mistake," the spokesperson said.


Spam is a fact of life on the internet, so it's understandable that the world's largest social network has spam filters in place. But less than a week after Facebook censored one of the Vietnam War's most iconic images, it's worth questioning how the platform separates news, which is often graphic and disturbing, from stories it deems "clickbait" or "unwanted."

This isn't the first time Facebook's algorithms have been accused of censoring benign or important content. In 2012, some users were prevented from commenting on posts because of content that was "irrelevant or inappropriate." The company said at the time that its filter was automatically blocking comments based on their length or the inclusion of multiple links, characteristics its algorithm classified as spammy.
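To make that 2012 episode concrete, here is a minimal sketch of a length-and-link-count heuristic of the kind described above. The thresholds, the looks_spammy function, and the URL pattern are hypothetical illustrations, not Facebook's actual filter, which has never been made public.

```python
import re

# Hypothetical thresholds for illustration only; Facebook's real criteria are not public.
MAX_COMMENT_LENGTH = 1000   # characters
MAX_LINK_COUNT = 2

URL_PATTERN = re.compile(r"https?://\S+")

def looks_spammy(comment: str) -> bool:
    """Flag a comment as spam-like based on its length and how many links it contains."""
    too_long = len(comment) > MAX_COMMENT_LENGTH
    too_many_links = len(URL_PATTERN.findall(comment)) > MAX_LINK_COUNT
    return too_long or too_many_links

# A short comment stuffed with links gets flagged; an ordinary comment does not.
print(looks_spammy("Watch: http://a.example http://b.example http://c.example"))  # True
print(looks_spammy("Thanks for covering this."))  # False
```

A rule this blunt has no sense of context, which is how a news link and a scam link can end up in the same bucket.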


In July of this year, Facebook briefly removed a horrific video showing the fatal shooting of Philando Castile by a police officer. The company chalked this up to a "technical glitch," but the incident was an upsetting reminder that Facebook wields the power to subjectively decide what news is seen by its 1.65 billion users.

That same month, Facebook admitted to temporarily blocking WikiLeaks links to hacked Democratic National Committee emails. Again, the company said this was an accident, the result of its automatic spam filter over-eagerly flagging "unsafe" content.


Unfortunately, it's difficult to know how Facebook's spam filters actually work. There is little publicly available information to determine whether they are as automatic, and as autonomous, as the company says.

Facebook's Help Center classifies spam primarily as unwanted or malicious content. It offers tips for reporting spam, but few details about how the filter works. If you've been blocked from sharing or commenting on a post, Facebook's advice is essentially to wait it out (blocks can last days) or to "slow down or stop this behavior."

Screenshot: Facebook/Unicorn Riot

In the case of Unicorn Riot's post, Facebook seemed to imply that the issue was with the link itself, not with the page's content. That suggests the wide net Facebook casts to keep its platform safe is both flawed and unreliable.

"Also, as one member of the collective, I should point out it is obviously concerning when a large media conglomerate blocks URLs to competing video platforms," a Unicorn Riot member stated.

Today's mix-up may not be censorship in its most traditional sense, but it suggests the company views accidental takedowns as a necessary sacrifice for the network's greater good. And is that really what we've come to expect from the internet's gatekeeper?


Correction: The story mistakenly referred to "Unicorn Riot" as "Unicorn Outlet" three times. Those instances have been corrected to accurately reflect the media collective's name.