There's No Safe Way to Keep Child Porn and Murder Off Facebook

Facebook is hiring 3,000 more people to do what experts say is one of the most psychologically traumatizing jobs in tech.

Responding to widespread outrage after a Facebook Live murder went viral, Mark Zuckerberg announced that the company is hiring 3,000 additional people to moderate live video and other content on the social network. That will bring the total to 7,500 people around the world whose job is to make sure Facebook users aren't subjected to child porn, regular porn, suicide, and murder as they scroll through their news feed.


It is well documented that these jobs are, psychologically speaking, among the worst in tech. In his reporting for Wired and in a recent documentary, The Moderators, Adrian Chen has detailed the long hours, low pay, and occasional horror of scrubbing Facebook and other social media platforms of dick pics, graphic images of sexual assault, child exploitation, and violence.

"Content moderation is a necessary but limited solution," Kat Lo, a PhD student at the University of California, Irvine who is studying online communities and moderation, told me. "The fact that Facebook is adding 3,000 more content moderators is great for Facebook consumers but the fact is you're subjecting a lot more people to something we've seen as very traumatic."

Many of the jobs are outsourced to contractors, and according to Chen's reporting and a bombshell story published in December by Germany's Süddeutsche Zeitung (SZ), the people doing the work do not feel adequately supported or trained.

Here's what some of Facebook's content reviewers told SZ:

"I've seen things that made me seriously question my faith in humanity. Things like torture and bestiality."

"Since I saw child pornography videos, I may as well have become a nun. I can't even handle the idea of sex any more. I haven't been intimate with my partner for over a year. I start shaking the moment he touches me."

Content moderation is the invisible hand that shapes what we see on the internet. Even in its most mundane form, Facebook content moderation is a dystopic job—an underclass of semiskilled workers scrolling endlessly through thousands of selfies, vacation photos, and links in search of the small percentage of posts that are illegal or unsavory, so that the public at large doesn't have to see them. When asked why they do it, most moderators utter some variation of "someone has to do it."


Right now, there simply is no psychologically "safe" way to keep Facebook clean.


Facebook has moved toward using artificial intelligence tools to assist human moderators. According to a spokesperson, the company uses AI to make sure that if 1,000 people report a pornographic image, only one reviewer actually has to see it. It also uses machine learning to sort posts into categories that are then funneled to experts in the relevant subject matter (examples of categories in which one can be an expert: child safety and Arabic language).
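Facebook hasn't said how this pipeline is actually built, but the two ideas the spokesperson describes, collapsing duplicate reports into a single review task and routing flagged posts to specialist queues, can be sketched roughly as follows. Everything here (the function names, queue names, and the stand-in classifier) is hypothetical, not Facebook's code:

```python
from collections import defaultdict

# Hypothetical sketch: collapse many user reports about the same post into
# one review task, and route that task to a specialist queue based on an
# (assumed) classifier label.

SPECIALIST_QUEUES = {
    "child_safety": "child_safety_team",
    "arabic_language": "arabic_language_team",
}
DEFAULT_QUEUE = "general_review_team"

def build_review_tasks(reports, classify):
    """reports: iterable of (post_id, reporting_user_id) pairs.
    classify: function post_id -> category label (stand-in for an ML model).
    Returns one task per reported post, no matter how many reports it got."""
    reports_by_post = defaultdict(set)
    for post_id, reporter in reports:
        reports_by_post[post_id].add(reporter)

    tasks = []
    for post_id, reporters in reports_by_post.items():
        category = classify(post_id)
        tasks.append({
            "post_id": post_id,
            "report_count": len(reporters),
            "queue": SPECIALIST_QUEUES.get(category, DEFAULT_QUEUE),
        })
    return tasks

if __name__ == "__main__":
    # 1,000 reports about the same post still produce a single review task.
    fake_reports = [("post_42", f"user_{i}") for i in range(1000)]
    tasks = build_review_tasks(fake_reports, classify=lambda _: "child_safety")
    print(tasks)  # one task, report_count=1000, routed to child_safety_team
```

The point of a setup like this is exactly what the spokesperson claims: however many people report the same image, only one person in the relevant queue ever has to look at it.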

Image recognition software can automatically remove previously deleted images if they are reposted by other users, and Facebook also uses databases of known child porn to flag and remove posts as they are uploaded.
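Facebook hasn't published the details of these systems, but the underlying idea, matching new uploads against fingerprints of images that were already removed, is simple enough to sketch. The toy version below uses an exact SHA-256 hash as a stand-in; production systems typically rely on perceptual hashing (Microsoft's PhotoDNA is the best-known example) so that resized or re-encoded copies still match, which an exact hash cannot do:

```python
import hashlib

# Minimal sketch of hash-based blocking of known-bad images. Real systems use
# perceptual hashes that survive resizing and re-encoding; the exact SHA-256
# match here is a simplification for illustration only.

KNOWN_BAD_HASHES = set()  # in practice, populated from a shared database of known images

def fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

def register_removed_image(image_bytes: bytes) -> None:
    """Remember an image that moderators have already removed."""
    KNOWN_BAD_HASHES.add(fingerprint(image_bytes))

def should_block_upload(image_bytes: bytes) -> bool:
    """Check a new upload against the known-bad set before it is published."""
    return fingerprint(image_bytes) in KNOWN_BAD_HASHES

if __name__ == "__main__":
    removed = b"...bytes of an image moderators deleted..."
    register_removed_image(removed)
    print(should_block_upload(removed))         # True: an identical re-upload is caught
    print(should_block_upload(removed + b"!"))  # False: any change defeats an exact hash
```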

These are good, proactive steps, but the fact remains that for now—and probably for the foreseeable future—humans will have to remain in the loop. They will need to catch false positives—the removal of, say, a woman breastfeeding, menstrual art, or war atrocities—as well as make calls on content that toes the line of whatever Facebook's community standards happen to be at that particular moment.

"As a job category, they need far more support—you would expect things like PTSD counseling—and there aren't any companies that are really doing that"

While Facebook is right to invest in AI to make content moderation less horrible, it has utterly failed to be transparent about the working conditions and psychological toll the job takes on the people who do it.

A Facebook spokesperson admitted that the work can be difficult and said Facebook makes psychological support and wellness professionals available to everyone on its content moderation team. The company said that it makes these services available to its contractors as well.

But Facebook hasn't been specific about what these resources are or how many people take advantage of them. While it is happy to perform psychological experiments on its users, it hasn't published any research about the mental effects of scrolling through the worst humanity has to offer on a daily basis.

"As a job category, they need far more support—you would expect things like PTSD counseling—and there aren't any companies that are really doing that," Lo said. "Facebook needs to be more visible and transparent about how they're helping these moderators and helping them maintain their mental health in the face of a job that's quite clearly traumatizing."