Facebook Is Launching a New Machine Learning Algorithm to Combat Revenge Porn

Facebook told Motherboard that this tool is trained on "previously confirmed non-consensual intimate images."
Image: A woman scrolling Facebook Messenger on her phone. Via Shutterstock

Facebook announced on Friday that it will roll out new technology for detecting non-consensually shared intimate images, known as revenge porn.

A machine learning system will detect “near nude” images and videos shared on the platform without the depicted person’s permission. It launches in conjunction with a pilot program that provides resources to victims of revenge porn.

Previously, Facebook and Instagram relied on user reports to detect revenge porn on their platforms. Photo-matching technology attempted to keep reported images from being shared again, but the process largely depended on people reporting videos and photos when they saw them. Now, Facebook hopes to put AI to work in the detection process.
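Facebook hasn’t said how its photo matching works, but perceptual hashing is a common approach to this kind of re-upload detection: a compact fingerprint that survives resizing and recompression lets near-duplicates of a reported image be caught at upload time. Here is a minimal sketch in Python using the open-source imagehash library; the threshold value and the in-memory blocklist are illustrative assumptions, not details of Facebook’s system.

    # Illustrative sketch of hash-based photo matching; not Facebook's system.
    # Requires: pip install pillow imagehash
    from PIL import Image
    import imagehash

    # Hypothetical in-memory set of hashes from previously reported images.
    # A production system would use a persistent, shared store.
    blocked_hashes = set()

    # Perceptual hashes of near-duplicates differ by a few bits, so matching
    # uses a Hamming-distance threshold rather than exact equality.
    HAMMING_THRESHOLD = 5  # illustrative value

    def report_image(path):
        # Hash a confirmed non-consensual image so re-uploads can be caught.
        blocked_hashes.add(imagehash.phash(Image.open(path)))

    def is_blocked(path):
        # Check an incoming upload against the blocklist.
        candidate = imagehash.phash(Image.open(path))
        # Subtracting two ImageHash objects yields their Hamming distance.
        return any(candidate - h <= HAMMING_THRESHOLD for h in blocked_hashes)

A real deployment would also need an index, such as a BK-tree, to avoid scanning every stored hash on each upload.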

The announcement states:

This means we can find this content before anyone reports it, which is important for two reasons: often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared.

According to the announcement, a “specially-trained” member of Facebook’s Community Operations team will review the content the technology flags, decide whether it violates Facebook’s community standards, and, if so, disable the account that posted it. An appeals process will be in place in case the technology or a moderator makes a mistake.

I asked Facebook for details on how the algorithm works: “We use technology to determine two main things: whether an image or video contains nudity or near nudity and whether the image or video was shared in a vengeful manner,” a spokesperson told me in an email. “Our technology has been trained on previously confirmed non-consensual intimate images to recognize language patterns and keywords that would indicate whether an image or video was shared without consent.”
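The spokesperson’s description suggests two signals combined: an image model scoring nudity, and a text model scoring the accompanying language for signs the post is vengeful. A minimal sketch of how that combination might look, in Python; the keyword list, thresholds, and scoring function are toy placeholders, not Facebook’s models.

    # Illustrative two-signal check; not Facebook's pipeline. Assumes an
    # upstream image classifier has already produced nudity_score, and uses
    # a toy keyword heuristic where Facebook describes a trained text model.
    import re

    VENGEFUL_KEYWORDS = {"revenge", "ex", "leaked", "exposed"}  # toy examples
    NUDITY_THRESHOLD = 0.8  # illustrative
    TEXT_THRESHOLD = 0.5    # illustrative

    def vengeful_language_score(text):
        # Fraction of toy keywords present in the caption or message.
        words = set(re.findall(r"[a-z]+", text.lower()))
        return len(words & VENGEFUL_KEYWORDS) / len(VENGEFUL_KEYWORDS)

    def flag_for_review(nudity_score, caption):
        # Route a post to a human reviewer only when both signals fire:
        # the image looks near-nude AND the text reads as vengeful.
        return (nudity_score >= NUDITY_THRESHOLD
                and vengeful_language_score(caption) >= TEXT_THRESHOLD)

    print(flag_for_review(0.93, "revenge on my ex, exposed"))  # True

As in the announcement, a flag would route the post to a human reviewer rather than removing it automatically.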

Alongside the detection technology, Facebook is launching a section within its Safety Center to help revenge porn victims access support resources, as well as tools for reporting and prevention.

Facebook is still asking users to submit their own nudes to help detect non-consensually shared imagery in the future. On the resource hub page, a section called NCII Pilot outlines how Facebook is partnering with prevention and harm reduction organizations to report and hash those submitted images, so that they can’t be shared again. These organizations include the National Network to End Domestic Violence, the Cyber Civil Rights Initiative, the UK-based Revenge Porn Helpline, and the India-based Centre for Social Research.

Brooklyn-based attorney Carrie Goldberg, whose law firm specializes in revenge porn, told Motherboard that they’ve helped people victimized by revenge porn “remove tens of thousands of images from the internet,” many of them from Facebook.

“We’ve found that social media is one of the main places the images propagate and are shared. So it’s excellent news that Facebook is using its technical sophistication to do good for its users, and frankly, to address a problem that it’s partly responsible for causing,” Goldberg said. “I’d like to think that this is more than just a PR stunt, given the timing of Facebook being so irresponsible with user privacy. The involvement of guardians of online privacy like the Cyber Civil Rights Initiative indicates Facebook’s efforts are genuine.”

The fact that Facebook is partnering with organizations that have worked on revenge porn for years is promising, but it is certainly no guarantee that such images will never appear on the platform again.

“It’s always a relief to see huge social media platforms like Facebook take image based abuse seriously, and rolling out programs and practices to protect its users,” Katelyn Bowden, founder of BADASS, a group of activists against revenge porn, told Motherboard. “I’m hopeful that this pilot will work how it’s intended and prevent images from being shared without consent, but I’m hesitant in putting too much faith in Facebook’s ability to adapt to a form of abuse that is constantly changing methods and platforms.”

Joseph Cox contributed reporting.