Personalized Fake Porn Videos Are Now for Sale on Reddit

Selling AI-generated fake videos is a good way to get sued.

Feb 6 2018, 3:00pm

Screenshot from thedeepfakesocietyxxx.com / Shutterstock / Composition: Samantha Cole

Until last week, people in Reddit's deepfakes community, which creates fake porn videos of celebrities using a machine learning algorithm, had been content to post their work for free, framing it as a hobby. But increasingly, they're taking the opportunity to make a buck off of nonconsenting women's likenesses by selling face-swapped fake porn creations for cryptocurrency.

In the weeks since we first reported on it, the r/deepfakes subreddit—home base for AI-generated fake porn videos, mostly of nonconsenting celebrities—has exploded to more than 85,000 subscribers. Users have spun several subreddits out from the original, like r/deepfakesNSFW, r/FakeApp, and r/VideoFakes.

Read more: Everyone Is Making AI-Generated Fake Porn Now

One of those subreddits, r/deepfakeservice, is dedicated to commissioning deepfake videos from other users. The pinned rules post includes guidelines for formatting requests and service offers: for requests, sellers ask for a description of the video, a price, the material they need to work with (images of the celebrity whose face will be swapped in), and how much time the job will take. Where there's demand, there are people waiting to turn a profit.

Here’s how one user outlined their offer:

“Require at least a 2 min source video,” one user wrote in a recent post on r/deepfakeservice. “Porn actress/actor can be suggested or I will choose. Models must be 18+ 72 hour turnaround time. Bitcoin donations only. PM for request.”

The subreddit has been up for about a week and has over 200 subscribers and a handful of requests. It raises the question: If trading fake porn videos for free exists in a legal gray area as we’ve reported, does putting a price tag on these videos change the game?

Bringing commercial use into the deepfakes practice opens the creator up to a lawsuit on the basis of right of publicity laws, which describe the right of an individual to control the commercial use of their name, likeness, or any other unequivocal aspect of their identity, legal experts told me.

Read more: People Are Using AI to Create Fake Porn of Their Friends and Classmates

“The videos are probably wrongful under the law whether or not money is exchanged,” Charles Duan, associate director of tech and innovation policy at the think tank R Street Institute, told me. “But what's important is that the commercial exchange creates a focal point for tracing and hopefully stopping this activity. It might be easy to be anonymous on the internet, but it's a lot harder when you want to be paid.”

Deepfake makers' creations infringe on the copyrights of both the porn performers and the celebrities whose faces are taken from interviews or copyrighted publicity photographs. Duan told me that in cases of copyright violation, one of the best ways to prove wrongdoing is through financial payment processors, such as credit card companies or banks. “Financial institutions are generally single points of failure for people trying to hide improper activity,” he said.

For the two groups of people who are currently most targeted by deepfakes—celebrities and porn performers—copyright laws could provide some protection, after the images are posted. Like celebrities who’ve copyrighted the use of their images, porn performers whose videos are used in deepfakes have slightly more control over their own likenesses than private citizens. Porn sites often have a link for filing a report under the Digital Millennium Copyright Act (DMCA), a 1998 law that escalated penalties for distributing copyrighted work on the internet without permission. That starts the process of a takedown notice.

“The makers of the video would have a perfectly legitimate copyright property case against the uploader, and they would be able to take advantage of the DMCA to have the website take the vid down,” Duan said. “Chances are, a similar practice would work as well for these sorts of videos... Not on behalf of the victim [whose face is being used] but the maker of the video. Which is the weird thing about that whole situation.”

David Greene, Civil Liberties Director at the Electronic Frontier Foundation, told me on the phone that buying and selling deepfakes, like everything else about them, may be clearly unsavory behavior, but not necessarily illegal.

“I want to separate something that’s probably a dumb legal idea from something that’s just a socially bad thing to do,” Greene said. “If you’re doing it to harass somebody, it’s certainly a bad idea legally and socially.”

Even so, for most of us, there’s not a lot to be done. The deepfakes community is moving from celebrities to less famous people like YouTubers and live streamers—and some still post ideas or requests for making videos from people they know in real life. The copyright protections that cover performers won’t save the rest of us.