The new rules sound pretty reasonable, but their deployment is baffling some redditors.
Reddit is going through some growing pains right now. The community that built its following by being open, democratic, and largely anonymous has more recently decided maybe it shouldn't be ground zero for some of the most deplorable content online. So it's trying to change that, but it's not an easy process.
The latest step in this transformation was the introduction of a lengthier, more specific set of rules in the site's content policy. Though still pretty straightforward, the new guidelines go further to root out and prevent hateful, disruptive, or dangerous content from surfacing on the site and making it unsafe for redditors.
But in the post announcing these changes, many redditors voiced concern and confusion over the new rules and how they will be enforced. One rule in particular, the decision to "quarantine" subreddits deemed particularly offensive to the average person, drew a lot of questions. Quarantined subreddits won't be banned, but a wall will be put in place so that you have to explicitly opt in to view that kind of content. The rest of Reddit will be able to avoid it more easily.
That seems like a reasonable balance between protecting the community at large from being bombarded with offensive content and maintaining Reddit's open, free-speech reputation. But not all offending subreddits will be quarantined; some will be banned flat out, like a handful of aggressively racist subreddits that were banned when the new policy rolled out. The moderator of those subs posted an AMA defending the forums and calling out other subreddits. He or she is also, notably, the moderator of no fewer than nine quarantined subreddits and two NSFW subreddits.
But even that mod can't figure out how Reddit is deciding which subreddits to ban, which to quarantine, and which to leave alone. Why is r/coontown banned while r/blackpeoplehate is merely quarantined? It's a tricky, subjective process.
Reddit spokesperson Ashley Dawkins said bans are for particularly disruptive subreddits:
"We found ourselves spending a disproportionate amount of time dealing with CT instead of working on Reddit itself," Dawkins said in an email. "So this was a necessary shift for our team to maintain a healthy and successful platform."
Some redditors have been collecting a list of all the quarantined and banned subreddits they can find, since Reddit hasn't published a complete list. Some subreddits have pretty self-explanatory names that reveal why they were quarantined or banned, while others are mysteries (a complete list can be found at the bottom of this post).
But there are still some subreddits that raise questions about how these decisions are being made. Take r/chokeabitch, often listed as one of the most offensive subreddits on the site. While other subreddits that promoted violence against women, like r/beatingwomen and r/rapingwomen, have been banned flat out, r/chokeabitch is still alive, merely closed off (and, as a notice on its landing page explicitly claims, not active). So why didn't it make the cut to be removed entirely?
R/theredpill is another glaring outlier, long identified as a hotbed of misogynistic and hateful posts. Thirty gruelling seconds on the subreddit turns up charming gems like this advice about long-distance relationships: "Never follow the girl. Instant loss of frame. You want her to chase you/move to where you are. If she won't, cut her loose," and this comment about women's contributions to technology: "some have truly contributed things, but are far spread out compared to men. Biology/Reality strikes yet again!" But since the comments are mostly contained within the subreddit, it remains open and untouched.
Meanwhile, subreddits like r/burningkids (which posts graphic images of children with burn wounds) and r/quranimals (which posts hateful content directed at people of the Muslim faith) are still active and completely open. Granted, they're small, unpopular subreddits that don't tend to bleed out into the general pool of content, but what threshold do they need to cross before they are deemed offensive?
Then there are less obvious examples. Many redditors cried foul over r/shitredditsays, which has been identified as the most toxic forum on the site. It mostly criticizes misogynistic, homophobic, or even just right-wing comments elsewhere on Reddit, but redditors have claimed that active users of the subreddit are abusive, hunting down users whose comments get pulled onto SRS's front page and harassing them through private messages. Are SRS redditors harassing others on the site, or do people who post shitty comments just not like being called out? It's not easy to tease apart, but either way it's one of the most contentious subreddits around.
If you look hard enough, you can always find something on Reddit to rattle your jimmies, but the new rules aren't intended to protect every user from ever getting upset. According to site admins, they're simply trying to smoke out the most egregious content on the site and prevent people from getting doxxed, threatened, and harassed. The system isn't perfect, and some subreddits might slip through the cracks, but at least Reddit is trying. And if you really don't like it, you can always go to Voat.