


Leaked Documents Show Facebook’s Struggles With Dick Pics

Motherboard has obtained training material for Facebook moderators which shows the social media giant's shifting policy on "unsolicited adult nude genitalia imagery sharing," or dick pics.
Image: Sam Cole

This piece is part of an ongoing Motherboard series on Facebook's content moderation strategies. You can read the rest of the coverage here.

Facebook’s moderators are grappling with the challenges of revenge porn, sextortion, and people sharing unsolicited dick pics with users, according to newly leaked documents obtained by Motherboard. The company’s moderators have recently been told to stop punishing people who complain about receiving dick pics, which shows how Facebook’s own policies around nudity are constantly evolving.


Motherboard has obtained a selection of training materials for Facebook moderators, as well as a copy of an extended training video, which offer more context on how Facebook trains its moderators to fight revenge porn and sextortion. Some of the material overlaps with what The Guardian published in 2017, but the material provided to Motherboard is more recent and includes different information. Motherboard confirmed that some people mentioned in the training video and materials are employees of Facebook, suggesting that the materials are genuine. Facebook did not dispute specific, non-public details included in other parts of the cache when asked about them.

Revenge porn is a global problem across tech platforms and social networks, with people sharing intimate and explicit photos of ex-girlfriends, partners, and others without their consent. And sextortion, where people harass targets either for money or for more nude photos, is a huge issue, particularly on Facebook.

“‘Sextortion’ is a gigantic problem on Facebook—we’ve seen it grow into a volume business that entraps and extorts people every single day,” Carrie Goldberg, an attorney specializing in sexual harassment and revenge porn cases, told Motherboard in an email.

Caption: A section of the training materials obtained by Motherboard. Image: Motherboard.

In the video obtained by Motherboard, the trainer acknowledges that Facebook has been disabling the accounts of victims who post about receiving unwanted dick pics, or “unsolicited adult nude genitalia imagery sharing” in Facebook’s own parlance. In several examples given during the training, Facebook users posted screenshots of unsolicited dick pics on their walls to complain about being sent them.


In the past, Facebook moderators may have flagged these images as revenge porn and pushed the material to a superior for review (called an “escalation” in Facebook’s materials), who then may have disabled the victim’s account. In one publicly reported case, Facebook deactivated the account of a woman who posted censored screenshots of her interaction with a man who sent her a dick pic; she had responded to the man with a barrage of other dick pics she found online.

But, according to the video, Facebook is changing its policy, telling moderators to simply delete the images for violating Facebook’s nudity policy rather than penalizing the user who posted them. The training material obtained by Motherboard also acknowledges that Facebook will need to balance this approach to protect male victims of non-consensual intimate image sharing, and make sure they aren’t impacted by the change in policy either.

Showing how complex the rules for Facebook moderators can be, though, the victim has to explicitly condemn the dick pic in their own post for Facebook to take that action, according to the video. Otherwise, moderators may see the post as another piece of revenge porn.

Got a tip? You can contact this reporter securely on Signal on +44 20 8133 5190, OTR chat on jfcox@jabber.ccc.de, or email joseph.cox@vice.com.

Indeed, although Facebook’s training materials say that distinguishing between financially driven sextortion and revenge porn should be easy, the trainer acknowledges that some situations can be far less clear. This is the sort of ambiguity that Facebook is grappling with, and the company has been criticized recently for miscategorizing nudity. For example, it has not only banned dick pic victims in certain cases but also flagged a historical photo of the so-called Napalm Girl as a piece of pornography.

When it comes to flagging something as revenge porn, the vengeful context of the post can be as crucial as the image itself, according to the video and training material.

“Interesting that they are focused on motivation—our stance is that it doesn’t really matter. The crime of nonconsensual pornography occurs when an offender distributes an image without consent, we don’t think (and several state laws reflect this) that the offender’s intent matters,” Goldberg, the attorney, said.

For revenge porn, moderators check that the image contains some nudity or a sexual pose, and confirm the lack of consent by checking whether the post has a vengeful context, as the slides previously published by The Guardian show. But in the newly leaked slides, moderators are now also told to check whether the face of the person in the photo matches that of the person reporting the image, as well as their names.
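The documents describe these as checks carried out by human reviewers. As a rough illustration only, and not Facebook’s actual tooling, the reported criteria could be sketched as a simple decision function in Python; all of the function and field names here are hypothetical.

# Illustrative sketch only: a simplified rendering of the reported review
# criteria for non-consensual intimate imagery. All names are hypothetical;
# Facebook's real review tooling is not public.

from dataclasses import dataclass

@dataclass
class Report:
    contains_nudity_or_sexual_pose: bool   # first check described in the slides
    has_vengeful_context: bool             # proxy for lack of consent
    reporter_face_matches_subject: bool    # newly leaked check: compare faces
    reporter_name_matches_subject: bool    # newly leaked check: compare names
    post_condemns_image: bool              # victim complaining about an unsolicited image

def review(report: Report) -> str:
    # Under the updated policy, a victim who explicitly condemns an unsolicited
    # image is not penalized; the image is simply removed under the nudity policy.
    if report.post_condemns_image:
        return "delete image under nudity policy; do not penalize the poster"
    if (report.contains_nudity_or_sexual_pose
            and report.has_vengeful_context
            and report.reporter_face_matches_subject
            and report.reporter_name_matches_subject):
        return "escalate as suspected revenge porn"
    return "route to standard review"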

“We use photo-matching technologies to help prevent further sharing of images, and have a dedicated team of content reviewers who are trained to review and remove them,” Facebook told Motherboard in a statement.
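Facebook has not published how its photo-matching works. As a hedged illustration of the general technique rather than of Facebook’s system, matching re-uploads of a known image is often done with perceptual hashes; the sketch below assumes the open-source Pillow and imagehash Python libraries and a hypothetical distance threshold.

# Minimal sketch of perceptual-hash photo matching. This shows the general
# technique only; Facebook's actual photo-matching system is not public.

from PIL import Image
import imagehash

# Hamming-distance threshold (in bits) below which two hashes are treated as
# the same picture despite resizing, re-encoding, or small edits. The value
# here is an assumption for illustration.
MATCH_THRESHOLD = 5

def is_known_image(candidate_path, banned_hashes):
    candidate_hash = imagehash.phash(Image.open(candidate_path))
    # imagehash overloads subtraction to return the bit-wise Hamming distance.
    return any(candidate_hash - banned < MATCH_THRESHOLD for banned in banned_hashes)

# Usage: hash an image already removed as non-consensual, then screen new uploads.
banned = [imagehash.phash(Image.open("removed_image.jpg"))]
print(is_known_image("new_upload.jpg", banned))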