
Reddit CEO 'Aware' of Community That Makes Memes of Dead Children

A subreddit full of videos of people burning to death and memes of mutilated, dead children evaded a crackdown on violent content; Reddit says it's now "under review."

Reddit has rarely been proactive about banning toxic communities from its platform; throughout the years, the company’s administrators have said, for example, “we’re a free speech site with very few exceptions,” “we stand for free speech,” and “we uphold the ideal of free speech on Reddit as much as possible.”

In recent years, under significant pressure from the press and some of its users, Reddit has started to ban communities like /r/FatPeopleHate, /r/jailbait, /r/picsofdeadkids, and, most recently, /r/DeepFakes, which was used to create nonconsensual porn of celebrities. But Reddit is still slow to respond to toxic content on the site: While services like Pornhub and Gfycat began banning AI-generated nonconsensual porn within a few days of being alerted to its existence, Reddit took more than two weeks to ban DeepFakes after we initially asked about the community.


On Monday, a Redditor asked Reddit CEO Steve Huffman whether the company was considering banning /r/nomorals, a community that posts “images of dead babies/corpses and harming animals.” Huffman responded that the company is “aware” of the subreddit, which currently has nearly 19,000 subscribers and was “trending” in late January.

“We are aware, and this community is under review,” Huffman responded.

“The original creator of the sub nuked it about two months ago and deleted all the content,” he wrote. “It’s now back up and running, which is why we’re getting new reports.”

That may be the case, but NoMorals has seen considerable activity over the last two months, with a huge spike in users; it is currently among the top 5,000 subreddits on the site. In this case, I’m not sure how much “review” could possibly be necessary before deciding that a community like NoMorals doesn’t belong on one of the most popular websites on Earth. A cursory look at the front page shows people laughing at others being burned to death, alongside memes of mutilated people and animals.

“Post what ever you want … just make sure it’s funny,” the subreddit’s sidebar reads. “It could be a smashed guy, a bloated lady, a dead cat with a pop tart, anything. It’s half the fun, you see a funny title, you match it with the picture, woo wee.”

In December, Reddit said it would be more proactive about banning content that “glorifies” violence; it banned communities such as “picsofdeadkids,” “DylannRoofInnocent,” “SexWithDogs,” “picsofcaninevaginas,” “horsecock,” “killthejews,” and “selfharmpics.”

NoMorals was created in 2012, but didn’t become popular until shortly after Reddit banned other violent communities in December.

NoMorals and the banned communities are different in tone from /r/WatchPeopleDie, a controversial subreddit with 319,000 subscribers that is exactly what it sounds like. Unlike NoMorals and the recently banned subreddits, WatchPeopleDie doesn’t revel in death and violence; people in that community have said it exists to help people realize “just how fragile human life truly is.”

Whether that’s believable, or reason enough for the community to continue to exist, is a question Reddit will have to grapple with. But it seems the company could start by enforcing its own rules, which prohibit “content that encourages, glorifies, incites, or calls for violence or physical harm against an individual or a group of people; likewise, do not post content that glorifies or encourages the abuse of animals.”