AI-Generated Fake Porn Makers Have Been Kicked Off Their Favorite Host

Reddit is still silent.

Samantha Cole

The most popular hosting site for “deepfakes” is now actively removing fake porn gifs from its site.

“Deepfakes” are fake porn videos generated using a machine learning algorithm, allowing people to swap the faces of celebrities—or their friends and acquaintances—onto the bodies of porn performers. The format has gone viral over the last few weeks as the videos have become easier to make.

On Tuesday, users in the r/deepfakes subreddit started noticing that some of the gifs uploaded to image hosting platform Gfycat were removed. Gfycat is similar to Imgur and several other sites in that it is a popular place to host media that is posted to Reddit (and elsewhere on the web). A spokesperson for Gfycat confirmed to Motherboard that it finds deepfakes “objectionable” and is deleting them from the site.

"Our terms of service allow us to remove content that we find objectionable. We are actively removing this content,” a spokesperson for Gfycat told me in an email.

Read more: People Are Using AI to Create Fake Porn of Their Friends and Classmates

Uploads older than a day seem to be nuked from the site and have been replaced with an error message. “Seems like a targeted purge, so people should avoid using the website,” one user in the subreddit wrote. “Anyone who posted a link before should try and rehost as well.”

Gfycat’s terms of service don’t explicitly prohibit adult content (the site has been widely used to host porn on Reddit for years), but they do address intellectual property laws, stating that “any content that (i) infringes any intellectual property or other proprietary rights of any party” is subject to removal. Anything Gfycat deems “unlawful, harmful, threatening, abusive, harassing, tortious, excessively violent, defamatory, vulgar, obscene, libelous, invasive of another’s privacy, hateful, racially, ethnically or otherwise objectionable” is subject to removal, as well.

This ambiguity around adult content—in addition to the facts that Gfycat is free, allows uploads of unlimited size, and makes downloading easy—made Gfycat an ideal place for Redditors to host deepfakes. The community is now seeking alternative platforms to host the videos.

“It started to get bad press and they took them down. It’s not worth bad press to them,” another r/deepfakes user wrote in a comment thread about the takedowns. “They might even spin it as a brave website purges evil group of porn people who infested the website. ignoring of course that I can find a million porn scenes, movies clips, and woman shooting pokeballs out their vaginas on their precious site. But deep fakes are the evil thing, this week.”

Read more: We Are Truly Fucked: Everyone Is Making AI-Generated Fake Porn Now

Though the legality of deepfakes has been widely discussed in the weeks since we reported on their existence, platforms are of course free to police the content that appears on their websites.

If you’re keeping score at home, chat platform Discord has started banning deepfakes-focused servers for violating the site’s community guidelines: "We view solicitation or creation of this content as a violation of our policy against non-consensual pornography, which is clearly prohibited in our terms of service and community guidelines,” Discord said in a statement. “Non-consensual pornography warrants an instant shut down on the servers whenever we identify it, as well as permanent ban on the users. We have investigated these servers and shut them down immediately."

Gfycat has now joined Discord in this thinking. PornHub is still full of deepfakes, and Reddit—the site that is the central hub of deepfake creation and sharing—still has not responded to our requests for comment about r/deepfakes. Meanwhile, the deepfakes community has found plenty of other hosting services to use besides Gfycat.