
Fake Porn Makers Are Worried About Accidentally Making Child Porn

Images of celebrities as minors are showing up in datasets used in making AI-generated fake porn.
Image: Screenshot from a face dataset of Emma Watson.

Hiding in a .zip file among thousands of otherwise mundane images of Emma Watson’s face are a handful of photos of her as a child.

This collection of images, or faceset, is used to train a machine learning algorithm to make a deepfake: a fake porn video that swaps Watson’s face onto a porn performer’s body, to make it look like she’s having sex on video. If someone uses the faceset containing images of Watson as a child to make a deepfake, a minor’s face has, in part, been used to create a nonconsensual porn video.


The people making deepfakes and trading these facesets are worried about this. They write disclaimers warning that younger celebrities’ facesets might contain photos of them as minors. Some are deleting whole sets, such as one of Elle Fanning, until they can be sure the sets contain no images of the subject as a minor.

“I deleted all posts with Elle Fanning because it's impossible to prove that she was 18 years old in the old faceset,” user Anton wrote on one deepfakes forum. “It's better to be safe than sorry.”

In a deepfakes Discord server that was recently deleted by the platform, users discussed the possibility that facesets they were using to create fake porn videos contained images of celebrities as minors.

The images in one faceset are close-ups of Watson's face, which The Sun previously traced to scenes in the early Harry Potter movies, when Watson was around 10 years old. Motherboard also found several other facesets of Watson and Fanning that contained images of them as minors.

We don’t know how faces of minors end up in these datasets, but it's possible facial recognition software like Microsoft’s open-source FaceTracker and FaceDetector, which scan through videos and capture stills of a target's face, grabbed them by mistake. One of the Watson facesets, for example, also includes a close-up of actor Matthew Lewis as Neville Longbottom, showing that these facesets contain other types of images that shouldn't be there.
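To illustrate why stray images slip in: a face detector that sweeps through a video frame by frame will crop every face it finds, with no notion of whose face it is or how old the person is. Below is a minimal sketch of that kind of extraction pass, written with OpenCV’s bundled Haar-cascade detector rather than the specific Microsoft tools named above; the file paths and parameters are illustrative assumptions, not anyone’s actual pipeline.

```python
# Minimal sketch of an automated face-extraction pass over a video.
# Uses OpenCV's bundled Haar-cascade detector; paths and parameters
# are hypothetical, for illustration only.
import os
import cv2

VIDEO_PATH = "source_clip.mp4"   # hypothetical input video
OUTPUT_DIR = "faceset"           # hypothetical output folder
os.makedirs(OUTPUT_DIR, exist_ok=True)

# Pre-trained frontal-face detector shipped with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(VIDEO_PATH)
frame_idx = 0
saved = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Detect every face in the frame. The detector has no concept of
    # identity or age, so any face that appears gets cropped and saved.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        crop = frame[y:y + h, x:x + w]
        out_path = os.path.join(OUTPUT_DIR, f"face_{frame_idx}_{saved}.png")
        cv2.imwrite(out_path, crop)
        saved += 1
    frame_idx += 1
cap.release()
```

In a pass like this, a flashback shot of the subject as a child, or a cut to a different actor entirely, produces crops that land in the dataset alongside everything else unless someone reviews the output by hand.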


In 2003, Congress passed the PROTECT Act, which classified computer-generated child porn as obscene. In other words, one can be brought to court on an obscenity charge, not a child porn charge, for creating or possessing computer-generated child porn, where no actual children were involved in creating the pornography. A first-time offender convicted of producing child pornography faces up to 30 years in prison, while an obscenity charge carries a punishment of at least five years and up to a maximum of 20 years in prison.

“The courts have ruled that if you take the face of a child and put it on the body of an adult that is not protected speech,” Hany Farid, a professor of computer science at Dartmouth College, told me in a phone conversation. “Not because of the harm to the child but because the child’s interests are at stake.”

Page Pate, an Atlanta, Georgia-based attorney, told me in an email that in the US, whether someone can be charged based on a few images of a minor within these facesets depends on whether the result actually appears to be a minor engaged in sexually explicit conduct, per the legal definition of child pornography.

“But it probably wouldn't cover an image where the face is that of a child and the body was clearly that of an adult,” Pate said. “It's really a case by case determination, and it's somewhat subjective. But unless the body looks like a child's body, I doubt this would be prosecuted in the US.”

Sadly, the issue of virtual child porn isn’t new, but deepfakes, which crowdsource and automate the process of creating fake videos, make that issue more complicated.

“People have been pasting faces onto things for a long time, I don’t think that this being more technologically complicated would change that analysis,” Charles Duan, associate director of tech and innovation policy at the R Street Institute, a think tank, told me on the phone. “It’s still just as bad.”