
Real Humans Will Review the Nudes You Send Facebook as Part of Its Anti-Revenge Porn Program

Facebook is testing a new feature in Australia to combat revenge porn on the platform. This is how it works.
Image: Shutterstock

Will Mark Zuckerberg one day witness a blurred photo of my nude body? I asked myself this question after finding out how Facebook designed a new pilot program it's testing in Australia to curb the spread of revenge porn. We now have more clarity from Facebook about how it will work.

As part of the program, users are being asked to upload nude photos of themselves to the social network voluntarily. The process works like this: First a user files a report with the Australian eSafety Commissioner’s office. Then, they upload an explicit picture of themselves to Facebook Messenger (they can do this by starting a conversation with themselves) and flag it as a “non-consensual intimate image” for the social network.

A member of Facebook’s community operations team then manually confirms that the image is in violation of the company’s policies. The images will be blurred out and only accessible to a specially trained team, according to an email from a Facebook spokesperson.

The remainder of the process is then automated. Facebook builds what is referred to as a “hash” of the image, meaning it creates a unique fingerprint of the file. If another user tries to upload the image to Facebook or Instagram, Facebook will check it against its stored hashes and stop it from being distributed if it is marked as revenge porn.
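Facebook has not said exactly what kind of hash it uses, so the sketch below only illustrates the general technique with a simple perceptual “average hash” in Python; the function names and parameters are illustrative, not Facebook’s.

from PIL import Image

def average_hash(path, size=8):
    # Shrink to a small grayscale thumbnail and record, for each pixel,
    # whether it is brighter than the thumbnail's average brightness.
    # The result is a 64-bit fingerprint of the image's overall appearance.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = "".join("1" if p > avg else "0" for p in pixels)
    return int(bits, 2)

def matches_known_hashes(path, known_hashes, max_distance=5):
    # Compare a new upload's fingerprint against stored fingerprints.
    # A small Hamming distance (few differing bits) suggests the same image.
    h = average_hash(path)
    return any(bin(h ^ known).count("1") <= max_distance for known in known_hashes)

In a scheme like this, only the fingerprints need to be stored: each new upload is hashed, checked against the saved list, and blocked if it matches a reported image.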

In the original article I wrote explaining the new anti-revenge porn feature, I said that Facebook was not storing the photos, only the hashes. Answers to a series of follow-up questions I sent to Facebook revealed that the social network is indeed storing the blurred-out images for a period of time. It’s not clear how long that period is.

The spokesperson said that while they are being stored, the naked photos will only be accessible to a small group of people. After Facebook discards the images, it will only retain the hashes. The spokesperson said that Facebook is not able to reconstruct images based on the hashes it keeps; a hash is a small fingerprint of a file rather than a copy of it, so it does not hold enough information to recover the original photo.

Facebook is likely including human reviewers in the process in order to prevent legitimate images from being inadvertently tagged as revenge porn. As recent studies have shown, image-recognition algorithms are still extremely easy to spoof.

Got a tip? You can contact this reporter securely on Signal at +1 201-316-6981, or by email at louise.matsakis@vice.com
