Facebook Decides Which Killings We’re Allowed to See

The social media giant has repeatedly censored and then restored newsworthy images.
Philando Castile shortly before his death. Screencap via YouTube

A video showing the aftermath of a police officer's fatal shooting of a black man was temporarily removed from Facebook. The company said the removal was due to a "technical glitch."

"We're very sorry that the video was inaccessible," a Facebook spokesperson told The Telegraph. "It was down to a technical glitch and restored as soon as we were able to investigate."

The 10-minute video depicted Philando Castile of Minnesota, covered in blood after being shot by an officer. Castile's girlfriend Diamond Reynolds, who goes by Lavish Reynolds on Facebook, filmed the Facebook Live video, which, at the time of writing, had been viewed more than 2.3 million times and shared more than a quarter of a million times.

In the video, Reynolds says the police officer asked Castile for his license and registration. She adds that Castile said he had a pistol he was licensed to carry.

"Oh my god, please don't tell me he's dead," Reynolds says when Castile slumps in the driver's seat. "Please don't tell me my boyfriend just went like that."

The video has since been restored, but with a "Warning—Graphic Video" disclaimer.

"Videos that contain graphic content can shock, offend and upset. Are you sure you want to see this?" the disclaimer continues.

Judging by the timestamps on tweets, the video was restored within about an hour of being removed.

Facebook did not respond to a series of questions about the apparent glitch, or about whether the video was flagged by a user or by Facebook itself.

As Facebook continues to build out its Live video platform, the world's most popular social network has become the de facto choice for important, breaking, and controversial videos. Several times, Facebook has blocked political or newsworthy content only to later say that the removal was a "technical glitch" or an "error."

In April, for instance, Facebook temporarily blocked six pro-Bernie Sanders groups and five groups supporting Filipino politician Rodrigo Duterte. It elected to leave up a video in which Antonio Perkins inadvertently filmed his own fatal shooting, but removed a live video uploaded by ISIS sympathizer Larossi Abballa, who filmed himself after murdering two people in France. Facebook has also removed images of women breastfeeding and images of breast cancer survivors' mastectomies.

Nearly two-thirds of Americans get their news from social media, and two-thirds of Facebook users say they use the site to get news. If Facebook is going to be the middleman that delivers the world's biggest news events to the masses, technical glitches and erroneous content removals could seriously disrupt the flow of information to the public.

More importantly, Facebook has become the self-appointed gatekeeper of what content is acceptable to show the public, an enormously powerful position to be in. By censoring anything at all, Facebook has established that there are rules for using its platform (most would agree that some rules are necessary). But because the public relies so heavily on the site, Facebook's rules and judgments have an outsized impact on public debate.

"This is the risk we take when we put all of our data in the hands of a company like Facebook, that has demonstrated it cares more about the bottom line than it does about users' freedom of expression," Jillian York, director of international freedom of expression at the Electronic Frontier Foundation, told Motherboard. "That said, there are genuine concerns to be had with Live—what about on-air beheadings? I don't think Facebook has thought through sufficiently other ways (than censorship) to mitigate against risks like that."

Most would agree that deleting propaganda videos uploaded by an ISIS terrorist is acceptable, but where is the line ultimately drawn? Can we trust Facebook's judgment? And can we trust Facebook to get it right every time, without technical glitches and errors?