
We Are Truly Fucked: Everyone Is Making AI-Generated Fake Porn Now

A user-friendly application has resulted in an explosion of convincing face-swap porn.
Daisy Ridley's face in a porn performer's video, made using FakeApp.

In December, Motherboard discovered a redditor named 'deepfakes' quietly enjoying his hobby: swapping celebrities’ faces onto porn performers’ bodies. He made several convincing porn videos of celebrities—including Gal Gadot, Maisie Williams, and Taylor Swift—using a machine learning algorithm, his home computer, publicly available videos, and some spare time.

Since we first wrote about deepfakes, the practice of producing AI-assisted fake porn has exploded. More people are creating fake celebrity porn using machine learning, and the results have become increasingly convincing. Another redditor even created an app specifically designed to allow users without a computer science background to create AI-assisted fake porn. All the tools one needs to make these videos are free, readily available, and accompanied by instructions that walk novices through the process.


These are developments that we and the experts we spoke to warned about in our original article, and they have arrived with terrifying speed.

Read more: People Are Using AI to Create Fake Porn of Their Friends and Classmates

Two months ago, shortly after Motherboard published its story, deepfakes created a subreddit named after himself and dedicated to his practice. In that short time, it has already amassed more than 15,000 subscribers. Within the community, “deepfake” itself has become a noun for the kind of neural-network-generated fake video its namesake pioneered.

Another user, called 'deepfakeapp,' created FakeApp, a user-friendly application that allows anyone to recreate these videos with their own datasets. The app is based on deepfakes' algorithm, but deepfakeapp built it without help from the original deepfakes. I messaged deepfakes, but he didn’t respond to a request for comment on the newfound popularity of his creation.

Deepfakeapp told me in a Reddit direct message that his goal in creating FakeApp was to make deepfakes’ technology available to people without a technical background or programming experience.

“I think the current version of the app is a good start, but I hope to streamline it even more in the coming days and weeks,” he said. “Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button.”


In early January, shortly after Motherboard’s first deepfakes story broke, I called Peter Eckersley, chief computer scientist for the Electronic Frontier Foundation, to talk about the implications of this technology for society at large. “I think we’re on the cusp of this technology being really easy and widespread,” he told me, adding that deepfakes were still pretty difficult to make at the time. “You can make fake videos with neural networks today, but people will be able to tell that you’ve done that if you look closely, and some of the techniques involved remain pretty advanced. That’s not going to stay true for more than a year or two.”

In fact, that barely stayed true for two months. We counted dozens of users who are experimenting with AI-assisted fake porn, some of whom have created incredibly convincing videos.

Redditor UnobtrusiveBot put Jessica Alba’s face on porn performer Melanie Rios’ body using FakeApp. “Super quick one - just learning how to retrain my model. Around 5ish hours - decent for what it is,” they wrote in a comment.

Redditor nuttynutter6969 used FakeApp to put Daisy Ridley’s face on another porn performer:

Fakes posted in the subreddit have already been passed off as real on other websites; a deepfake of Emma Watson taking a shower was reuploaded by CelebJihad—a celebrity porn site that regularly posts hacked celebrity nudes—with a description claiming the “never-before-seen video above is from my private collection, and appears to feature Emma Watson fully nude and flaunting her naked sex organs while showering with another girl.”


Other redditors have trained models on video from celebrities’ public Instagram stories and used them to transfer faces onto nude Snapchats posted by amateurs: “I lucked out that this amateur does similar silly dancing moves and facial expressions as Chloe sometimes does in her instagram stories,” the creator of a deepfake of actress Chloe Bennet wrote.

Most of the posts in r/deepfakes so far are porn, but some users are also creating videos that show the far-reaching implications of a technology that allows anyone with sufficient raw footage to convincingly place any face in any video. A user named Z3ROCOOL22 combined footage of Hitler with Argentina’s president Mauricio Macri:

According to deepfakeapp, anyone who can download and run FakeApp can create one of these videos with only one or two high-quality videos of the faces they want to fake. The subreddit’s wiki states that FakeApp is “a community-developed desktop app to run the deepfakes algorithm without installing Python, Tensorflow, etc.,” and that all one needs to run it is a “good GPU [graphics processing unit, the kind that high-end 3D video games require] with CUDA support [NVIDIA’s parallel computing platform and programming model].” Users who don't have the proper GPU can also rent cloud GPUs through services like Google Cloud Platform. Running the entire process, from data extraction to frame-by-frame conversion of one face onto another, takes about eight to 12 hours if done correctly. Other people have reported spending much longer, sometimes with disastrous results.
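The wiki doesn’t go deeper than “the deepfakes algorithm,” but the approach widely described for these face swaps is a shared-encoder, two-decoder autoencoder: a single encoder learns to compress both faces, each decoder learns to reconstruct only one of them, and running face A through face B’s decoder produces the swap. The sketch below is only an illustration of that idea under assumed settings (Keras, 64-pixel face crops, random arrays standing in for extracted frames); it is not FakeApp’s or deepfakes’ actual code.

```python
# A minimal sketch of the shared-encoder / two-decoder autoencoder idea
# behind these face swaps. All sizes, names, and training settings here
# are illustrative assumptions, not FakeApp's real implementation.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

IMG = 64  # assumed resolution of the aligned face crops

def build_encoder():
    # One encoder shared by both faces, so it learns a common representation.
    inp = keras.Input(shape=(IMG, IMG, 3))
    x = layers.Conv2D(64, 5, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(128, 5, strides=2, padding="same", activation="relu")(x)
    z = layers.Dense(512, activation="relu")(layers.Flatten()(x))
    return keras.Model(inp, z, name="shared_encoder")

def build_decoder(name):
    # One decoder per identity; each learns to reconstruct only its own face.
    z = keras.Input(shape=(512,))
    x = layers.Dense(16 * 16 * 128, activation="relu")(z)
    x = layers.Reshape((16, 16, 128))(x)
    x = layers.Conv2DTranspose(64, 5, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid")(x)
    return keras.Model(z, out, name=name)

encoder = build_encoder()
decoder_a = build_decoder("decoder_face_a")  # e.g. the performer in the target video
decoder_b = build_decoder("decoder_face_b")  # e.g. the celebrity being swapped in

auto_a = keras.Model(encoder.input, decoder_a(encoder.output))
auto_b = keras.Model(encoder.input, decoder_b(encoder.output))
auto_a.compile(optimizer="adam", loss="mae")
auto_b.compile(optimizer="adam", loss="mae")

# Stand-ins for the face crops extracted frame by frame from the two videos.
faces_a = np.random.rand(32, IMG, IMG, 3).astype("float32")
faces_b = np.random.rand(32, IMG, IMG, 3).astype("float32")

# Training alternates between the two identities; real runs take hours on a CUDA GPU.
for _ in range(2):
    auto_a.fit(faces_a, faces_a, epochs=1, verbose=0)
    auto_b.fit(faces_b, faces_b, epochs=1, verbose=0)

# The swap itself: encode face A's frames, then decode them with face B's decoder.
swapped_frames = decoder_b.predict(encoder.predict(faces_a))
```

Detecting and aligning faces in the source footage, and pasting the converted faces back into each frame of the target video, are separate steps covered by the wiki’s eight-to-12-hour estimate; they are omitted from this sketch.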


“In honor of all who have tried and failed to make this work - my fucked-up video,” user MrDrPresidentNotSure wrote of their extremely horrifying botched face swap:

“When I get this in convert, it means I should trained more time?” user yu78156853 wrote of their failed FakeApp attempt:

The Princess Leia face swap from Rogue One by user derpfake stands out to deepfakeapp as particularly good, he told me. The FakeApp version and the Hollywood-produced scene from the actual film are nearly identical, at least in these low-resolution GIFs.

“My hope is that in the next few years machine learning tools like this one become more widely available and give everyday people that don’t necessarily know tech the opportunity to explore and create with the high-tech digital manipulation technology that’s mainly the domain of big-budget SFX companies today,” deepfakeapp told me.

“Top is original footage from Rogue One with a strange CGI Carrie Fisher. Movie budget: $200m,” derpfake wrote of his creation. “Bottom is a 20 minute fake that could have been done in essentially the same way with a visually similar actress. My budget: $0 and some Fleetwood Mac tunes.”

An incredibly easy-to-use application for DIY fake videos—of sex and revenge porn, but also political speeches and whatever else you want—that moves and improves at this pace could have society-changing impacts on the ways we consume media. The combination of powerful, open-source neural network research, our rapidly eroding ability to discern truth from fake news, and the way we spread news through social media has set us up for serious consequences.

“Socially and culturally, this is exploitative but quite survivable,” Jay Owens, digital media analyst and research director at audience intelligence platform Pulsar, told me in an email. “Viral videos and celebrity media already operate on a plane of pure entertainment—but this'll only get sexier and meme-ier and lulzier and ever-more unreal.”

Deborah Johnson, Professor Emeritus of Applied Ethics at the University of Virginia’s School of Engineering, told me there’s no doubt this technology will get so good that it will be impossible to tell the difference between an AI-generated face swap and the real thing.

“You could argue that what’s new is the degree to which it can be done, or the believability, we’re getting to the point where we can’t distinguish what’s real—but then, we didn’t before,” she said. “What is new is the fact that it’s now available to everybody, or will be… It’s destabilizing. The whole business of trust and reliability is undermined by this stuff.”