
Imgur Is Cracking Down on NSFW Content Because It Wants to Be Instagram

Is the site's crackdown on NSFW content an attempt to appease Apple?
Screengrab: Imgur.com

Imgur's burgeoning photo-sharing community is tearing itself apart over a crackdown on not-safe-for-work content. Meanwhile, the image hosting site is still the go-to for Reddit's amateur pornographers to upload their pics. What gives?

Imgur was unceremoniously founded in 2009 by one guy, Alan Schaaf. Because of its speed, unobtrusive nature, and Reddit integration, Imgur quickly became the most popular place to host memes, images, and, yes, porn that would later be posted on Reddit. The site has also managed to sprout its own community of people who bypass Reddit altogether, discovering photos through an upvoting system very similar to Reddit's.


Much of that community got very pissed off earlier this week, because Imgur began to more aggressively moderate and delete what it deemed to be NSFW or inappropriate content and comments. Much of the site's front page at the moment is made up of memes and protest photos referencing the crackdown, and Sarah Schaaf, Imgur's community manager (and Alan's sister), received death threats as a result. Imgur quickly scrambled to explain the changes to its users.

This was not a shift in rules—the site's no-NSFW-content rule has existed for some time—but it was a change in enforcement. But in an interesting coincidence of timing, the company launched an Android app Tuesday. Its iPhone counterpart is currently in Apple review hell, which may be one of the reasons why the company is quickly trying to clean up its front page without deleting the thousands (millions?) of NSFW photos it hosts for Reddit. Update: Imgur says its iPhone app has recently been approved by Apple: "There isn't any correlation between our app launches and anything happening on Imgur. As you know, the Apple review cycle can sometimes just take a while," said Michelle Masek, Imgur's head of communications.

"We've always had rules against obscene and NSFW content and comments, and these guidelines have been in place for years," Masek told me. "The difference here is that the Imgur team has grown a lot this year, and we're now better equipped to respond when Imgurians flag content as inappropriate to us."


Masek said the site is "entirely self-policing" and that the company is only deleting content that is flagged by users. "Ultimately, the community decides what's inappropriate and flags it to Imgur's attention," she added.

That makes sense, but if the company has a rule against NSFW content, then why doesn't it go to the very, very easy-to-find porn sections of the site and delete them en masse? As I mentioned, Imgur serves as the de facto host for most porn subreddits, and the site gets a massive amount of traffic (and has a huge user base) from the corresponding subsections of Imgur. Here's a list of the top NSFW subreddits from fall of last year, each of which has its own Imgur section:

Obviously, completely destroying these sections would cause lots of problems for the site. And so it's allowing those pictures to remain indefinitely. Imgur says it has divided itself into two services (which, as a user, are close to indistinguishable unless you are specifically interested in the distinction): "image hosting" and "community."

"The image hosting side of things is governed by our terms of service, which doesn't prohibit NSFW. That service is used primarily for hosting content and sharing it in other places, though we do make it easy to see all of that content in one place on Imgur if you seek it out," Masek said. "When users navigate directly to imgur.com/r/gonewild they are seeking out and opting-in to the mature content. This is a very different behavior from those browsing the public gallery and engaging with the community."


That explanation makes sense, and I suppose the company is well within its rights to make such a distinction. When uploading a photo, users are asked to select whether they want it to go into the "gallery" or not; the gallery is where Imgur enforces some of the strictest moderation rules of any major social media site.

In its community, the site takes a pretty broad view of what constitutes NSFW content:

It also bans hate speech and abusive content (including racism, sexism, and personal attacks). It's early days for this latest crackdown, but those rules are much stricter than Reddit's, and it shows that Imgur may be attempting to become something like the all-inclusive community that Reddit is striving to be.

I have a somewhat different theory: Imgur may be positioning itself as a nerdier competitor, or at least an alternative, to Instagram. The service doesn't offer anything resembling Instagram's filters, but, with its upvoting system, it does offer the potential for photos to succeed on their merits rather than on existing star power or popularity, which could be enticing to some photographers.

The company also just launched an Android app that's as much focused on uploading pictures to the site as it is on exploring them. In fact, the cleanup may be directly tied to the launch of the app: the company's iPhone app is currently "tied up by Apple" (according to another representative for Imgur). Apple is notoriously protective of its App Store, and it makes sense that an overabundance of NSFW content on an Imgur app home page could be the holdup. Update: Imgur's iPhone app has been approved.

I asked Masek whether she thought the cleanup was necessary for the company's long-term growth potential, and she said that NSFW content simply doesn't align with the company's values.

"It's not about our growth potential, it's about our vision for the community," she said. "We want Imgur to be a force for good on the Internet where people can feel welcome and wanted, and we think the community rules we established years ago ensure that's the case."