When Death Becomes Social Media Spectacle
Image: thierry ehrmann/Flickr


Media professionals are wary of Facebook moderation guidelines.

Facebook's moderation guidelines—the infamous Facebook Files leaked this weekend by The Guardian—have journalists, media ethics scholars, mental health experts and just about everyone else scratching their heads over the dubious nature of many of the platform's policies on issues such as pornography, racism and terrorism. Chief among the controversial guidelines set out in the leaked internal manual is one that allows users to livestream suicides and other attempts at self-harm.

According to the documents, Facebook "doesn't want to censor or punish people in distress who are attempting suicide" and hopes that by leaving such content up for a certain period of time, other users may be able to intervene.

However, many media professionals are concerned by what they see as a hasty and misguided approach.

"Facebook has no obligation to give a platform to someone who wants to kill themself any more than the keepers of the Golden Gate Bridge have no obligation to make it easier to jump to your death," Emmy Award-winning journalist Al Tompkins, currently a senior faculty at the Poynter Institute, told Motherboard.

"Allowing death to become a social spectacle does nothing to further thoughtful conversation about suicide or to point to alternative solutions to life problems," he said.

For his part, Aidan White, director of the Ethical Journalism Network, called Facebook's policy "utterly self-serving."

"People who are disturbed and self-harming need support and help, and promoting their distress helps no one," he told Motherboard. "Facebook are besotted by a business model that makes no distinction between the quality of information or the nature of the content that is uploaded."

Earlier this month, Facebook CEO Mark Zuckerberg announced the social media giant would be adding 3,000 people to a moderation task force in charge of regulating sensitive content. The decision was largely fueled by outrage over a series of viral videos showcasing users harming themselves or others.

Susan McGregor, assistant director of the Tow Center for Digital Journalism at Columbia University, told Motherboard that the circulation of violent content, including self-harm footage, is nothing new to the media industry. Images of self-immolating Buddhist monks were widely distributed in the '60s, for example. But social media platforms don't benefit from the support and training, developed over decades, that traditional newsrooms have, McGregor said.

"All of a sudden these Facebook moderators, who have a couple of weeks of training and who are supposed to spend most of their time reviewing a lot of very disturbing content, have to make judgment calls about [content] and there's no indication they have the training or expertise to do so or the support to deal with it," she said.

Ultimately, McGregor said, it is only a matter of time before we see how livestreamed self-harm affects users.

"The only way to go about this responsibly is to go slowly and make sure you understand what the ramifications of these new possibilities may be," she says.

"But no one in the tech world wants to hear, 'Go slowly.'"
