
Targets of Fake Porn Are at the Mercy of Big Platforms

With little legal recourse and no technological solution in sight, people who end up in deepfakes can only turn to platforms like Reddit for help, and Reddit isn't responding.

Samantha Cole

Illustration: Seth Laupus

It’s been eight years since television news host Ana Kasparian first found an image of her face Photoshopped onto a pornographic image online.

Since then, she’s been the target of endless online harassment—much of it involving editing photos of her into hardcore porn that didn’t happen, sometimes used as blackmail. Tuesday, a digitally altered video of her was posted on the subreddit r/deepfakes, showing what looks like Kasparian masturbating.

The fake video was created with a machine learning algorithm that's able to take Kasparian's face from online videos and believably place her likeness on the body of a porn performer.

Since Motherboard first reported on the deepfakes community, many of our readers have expressed concern about the implications of this technology, which allows anyone with consumer-level PC hardware, access to photos of their target, and a willingness to experiment with machine learning applications to create fake porn. One question has come up repeatedly: What can we do in the face of this strange new AI-porn world, when all of our faces are already scattered across the internet?

Read more: AI-Assisted Fake Porn Is Here and We’re All Fucked

There isn’t much legal recourse for victims of a deepfake—this is, in part, why the genre took off explosively: people realized that making them is probably not illegal. But there are a few actions individuals can take to protect themselves. And as a society, we will have to grapple with the fallout of the democratization of a powerful technology that may fundamentally change the nature of visual media and how we interpret truth.

As a host on the news commentary show The Young Turks, Kasparian is a particularly vulnerable target. The deepfake algorithm needs hundreds of clear images of a person's face from multiple angles in order to successfully swap it into a porn movie, and there are countless hours of video of Kasparian online. “At first I was really concerned,” she told me in a phone interview. “Then I realized the more I tried to do something about it, the less likely it was that it would stop. I think people love to see that something gets under your skin.”

Celebrities, porn performers, newscasters, politicians, popular YouTubers, Twitch streamers, and social media celebrities have become early targets of the deepfake community.

But it’s not only them: We’ve also seen people talk about trying to make fake porn of their friends, acquaintances, and colleagues by scraping their social media accounts for images of their face.

It’s unreasonable to expect anyone to delete or hide their photos from the internet, and it’s asking a lot of people to delete their online presences to avoid falling victim to creeps. But if you’re really worried about deepfakes, this might be your only option at the moment.

"If you are running a business model that accommodates vengeful assholes in humiliating other people, the website should have to pay for it and the vermin running the website should be exposed.”

Parker Higgins, a digital rights activist at Freedom of the Press Foundation, told me in an email that the only “surefire” way to guarantee your images won’t be used without your permission is to deprive people of access to them. But even if you want to, making all social media accounts private is unfeasible for many of us, who may have already put hundreds or thousands of photos on social networks—once those photos are in the wild, it’s almost impossible to rein them back in.

“For most people, that ship has sailed, or they're just unwilling to keep their face under wraps because of what bad actors might do with it,” Higgins said. “But in this case, because a lot of images are necessary, you don't have to be a complete ghost to avoid this kind of fake video.”

Limited Recourse

If you do spot your own face in fake porn, there’s not much you can do, but experts on revenge porn do have some suggestions.

“It was only a matter of time before bad actors found a workaround for revenge porn laws,” Carrie Goldberg, a Brooklyn-based attorney whose firm focuses on representing victims of online harassment, sexual assault, and blackmail, told me in an email.

It’s essential for targets to respond with “lightning speed,” Goldberg said: Take screenshots or download the video, call an attorney and the police, and then begin the process of diligent removal, a series of steps Goldberg suggests taking for each platform the images are found on. “Acting during those first few hours of awareness can suppress the attack.”

Legally speaking, Goldberg said there may be limited options for people who have found themselves the subject of a deepfake: “There are civil laws that apply—intentional infliction of emotional distress and defamation.”

Charles Duan, associate director of tech and innovation policy at the think tank R Street Institute, told me on the phone that in certain situations, you may be able to sue the person posting the videos to have them stop the activity or pay you damages for any harm caused by it. This could fall under the privacy torts of false light and defamation.

But generally, it can be difficult to prove intentional harm from videos like this; a deepfake made with publicly available media and posted without context can arguably be seen as a creative endeavor or art (regardless of how creepy it might be).

It’s deeply unfair that so much of the responsibility of recourse and prevention is on the user. Social platforms can moderate communities that become toxic, but ultimately they choose to cover their own legal obligations and not much more. When you sign up to use almost any online platform, you’re agreeing to some variation of a Terms of Service agreement that absolves the platform of responsibility. It’s a “use at your own risk” situation for most sites.

Some sites have taken a firm stand against deepfakes. Chat app Discord shut down at least two servers dedicated to deepfakes and banned users from the site beginning around January 26. Gfycat followed suit, and has begun removing GIFs of deepfake videos.

But other platforms have been completely silent, and generally, online platforms have been awful about removing abusive, harassing, and harmful content from their sites.

When Kasparian first started being harassed, she said she attempted to contact the websites where images were posted to take them down. “Platforms are very slow to take things down,” she told me. “It’s my experience that often they won’t even do it. There’s no response, no reaction. You feel helpless.”

Read more: People Are Using AI to Create Fake Porn of Their Friends and Classmates

On Reddit, the r/deepfakes subreddit now has more than 70,000 subscribers and continues to grow rapidly. It’s also spawned several offshoot subreddits for posting results and trading tutorials. In his annual “State of the Snoo-nion” announcement post, Reddit co-founder Steve Huffman wrote that he’s “particularly proud” of the work the company’s Community, Trust & Safety, and Anti-Evil teams are doing, especially in “catching issues before they become issues.” It has been 11 days since Motherboard sent its first request for comment to Reddit on r/deepfakes. We have sent Reddit six emails requesting comment over that time period, and we have not yet heard back.

While targets of harassment are left hanging, platforms have legal protections that exempt them from penalties. The main law protecting platforms is called the Communications Decency Act, passed in 1996 as one of the first attempts to regulate porn on the internet. The Electronic Frontier Foundation calls CDA Section 230 one of the most important laws protecting internet speech, but the law also gives broad protection to platforms that host revenge porn, harassment, and abuse.

"No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” the law states, meaning platforms are immune from liability for the conduct of their users and are able to profit off of their behavior and posts.

"You get mad at somebody, put their face on a porno and ruin their life.”

“If fake AI porn proliferates without any consequence to the creator AND the website host, then we’re doing a huge disservice to victims of this new weapon of harassment,” Goldberg said. “Websites don’t deserve to be given these special treatments. If you are running a business model that accommodates vengeful assholes in humiliating other people, the website should have to pay for it and the vermin running the website should be exposed.”

Given the overall unresponsiveness of some platforms and the gross inability of the law to protect victims, the only solace that Kasparian has left is to just not give a fuck. “I'm a stubborn bitch,” she wrote in an email prior to our phone call. “I won't be silenced, intimidated or triggered by their nonsense. Plus, I don't find porn or human sexuality something to be ashamed of. If seeing fake pornographic images of me makes their lives a little better, then so be it.”

But of course not all victims will feel the same, and it’s very likely that deepfakes will end up hurting people. I talked to Dahlia Dee, a porn performer who’s outspoken about piracy and privacy issues, about what it’s like to have your images run rampant without your control:

“I spend my whole life putting myself out there and putting my naked body out there and making sure it has my own face on it,” she told me in a phone conversation. “It doesn’t concern me, as much as it concerns me that it’d happen to my neighbor. You get mad at somebody, put their face on a porno and ruin their life.”

Dee told me she hopes for a future where we have total control over our own images—our faces, our bodies, the things we create. “Will that happen? Probably not.”

There are a lot of reasons that, ultimately, the law may not be able to protect us. Revenge porn laws that have been passed so far have been a mixed bag; they have to balance protecting victims with the First Amendment (and many revenge porn bills that have been proposed are so overly broad that they would have far-reaching ramifications beyond their original intentions).

That certainly doesn’t mean we shouldn’t try to make them the best, most comprehensive policies we can—but we’ll need to have nuanced, in-depth conversations with a lot of different stakeholders to build them. So far, that hasn’t happened. Once again, our laws lag behind as we hurtle toward dystopia.