
Facebook Needs a Public Editor

Facebook deleted two videos showing a standoff between Korryn Gaines and a SWAT team, who later shot her to death.
Image: Facebook

On Facebook earlier this week, a Baltimore woman named Korryn Gaines uploaded videos of a standoff she had with a Baltimore County SWAT team. And yet, we still don't know what happened. We know now that Gaines was shot and killed by the police, apparently after threatening them with a shotgun. We don't know anything more than that because, during the encounter, police asked Facebook to delete the videos, and Facebook complied.


Last month, Philando Castile was shot to death by Falcon Heights, Minnesota police. His girlfriend, Diamond Reynolds, livestreamed his last moments on Facebook, but for at least an hour the stream was removed from Facebook. The company blamed a "technical glitch" for the temporary deletion of the video, but hasn't given details about what sort of glitch caused it or whether it has taken steps to fix the issue. A report in The Register suggested that there was no glitch and claimed that police deleted the video from Reynolds's phone (which Reynolds has also alleged), a report that Facebook has denied. What's the truth? We don't know.

Facebook's new video features are increasingly being used to stream newsworthy, unfiltered videos as they play out in real time. Gaines did not broadcast live, according to Facebook, but was uploading the videos as she was taking them. The video of Castile's shooting is an indispensable record of an unprovoked law enforcement shooting of a black man, and a vital means of spreading the word about such killings. And yet Facebook, as a private company, has no legal responsibility to host or display these videos, no legal responsibility to explain why any particular video was taken down, and little legal incentive to push back against a law enforcement request like the one made in Gaines's case.

Image: Facebook/Korryn Gaines

If Facebook is going to decide which killings we're allowed to see, however, then it has a moral responsibility to its customers and to the public to explain how and why specific videos are removed. And then it needs to empower someone to hold the company accountable when it makes mistakes.


To facilitate this, Facebook needs to hire an ombudsman, and it needs to give this ombudsman the access and platform necessary to provide a useful service to the public. Inverse reporter Joe Carmichael also advocated for the position last month.

If you're not familiar with the position, ombudsmen, sometimes called "public editors," serve as public advocates for the readers of a newspaper, the listeners of a radio station, or the viewers of a cable news outlet. Ombudsmen are usually hired on a fixed contract, meaning they can't be fired for criticizing the outlet that they're covering. The New York Times's new public editor, Liz Spayd, has recently written columns explaining why the newspaper referred to Michelle Obama as "Mrs." and Melania Trump as "Ms.," as well as one taking the Times to task for changing the language of an essay critiquing police interactions with black people (after it had been published) without including a note that the article had been changed.

A Facebook ombudsman would be able to answer questions about Facebook's content decisions after the fact, and would also have access to the internal decision makers who are responsible for them. They would send the message that Facebook at least cares about the notion of transparency, especially at a time when Facebook has huge influence over the media, which it tends to exert on a whim. (As many have pointed out, Facebook's opaque tinkering with its news feed is part of its self-proclaimed role as "public editor of the internet.")


Esther Enkin, president of the Organization of News Ombudsmen, told me that "accountability to the public is a critical part of responsible journalism." As Facebook increasingly becomes a place where news is not only discussed, but is broken, altered, and deleted, it has a responsibility to explain its editorial decisions.

Facebook told me in an email that it has no plans to hire an ombudsman, but that it does have "community standards" that explain when content will be removed. "Anyone can report content to us if they think it violates our standards," Facebook told me. "Our teams review these reports rapidly and will remove the content if there is a violation."

Having an internal team that works behind the scenes isn't enough, however.

"Members of the public have the right to question when those stated policies and values might be breached," Enkin told me. "Facebook and other platforms exist in a different environment with a different set of expectations than legacy media, but there is always a need for accountability."

Facebook has a moral responsibility to its customers and to the public.

Removing Gaines's videos during an active SWAT raid may have been the right move, but then again, maybe it wasn't. We don't know, because we haven't seen them, and by removing them, Facebook is allowing us to see only the law enforcement narrative. Facebook did not immediately respond to a Motherboard request for comment about the removal of Gaines's videos.


Local authorities asked Facebook to take down the video "in order to preserve the integrity of the negotiation process with her and for the safety of our personnel [and] her child," Baltimore County Police Chief James Johnson said. "Ms. Gaines was posting video of the operation as it unfolded. Followers were encouraging her not to comply with negotiators' request that she surrender peacefully."

Yet time and time again, we've learned that police narratives of shootings aren't always accurate and are often outright lies.

"Facebook is creating a dangerous precedent of censorship at the request of police," Nicole Carty, a senior campaigner at SumOfUs, a nonprofit that seeks to hold corporations accountable for their actions, said in a statement.

"Social media and shareable video have been instrumental in helping build awareness about the ongoing epidemic of police violence against people of color in the United States, and Facebook should never censor such content," she said. "Facebook must clarify its actions, fully reactivate Korryn Gaines' account, including the videos on her page, and cease this dangerous precedent of censoring user accounts at the request of the police."

Taking calls for transparency seriously would be a good first step.