Tech

Web Hosting Companies Shut Down a Series of Neo-Nazi Websites

Major online neo-Nazi meeting grounds are being nuked by hosting sites that want nothing to do with connections to militant white nationalists. Meanwhile, several of them have regrouped on Twitter.
Image: Yasin Ozturk/Anadolu Agency/Getty Images

In recent weeks a number of web hosting services have shut down several major online neo-Nazi meeting grounds, Motherboard has learned. The moves predate Facebook’s recent ban on white nationalism, but some of the groups behind the shuttered sites continue to operate openly on Twitter.

One of the websites of Atomwaffen Division—a violent neo-Nazi extremist group linked to five killings—went down earlier this month. Bluehost, which once hosted the Atomwaffen Division site, told Motherboard the site was “deactivated for violating Bluehost's Terms of Service,” but wouldn’t expand further.


Previously, before Bluehost took it down, the same Atomwaffen Division site used the products of DDoS mitigation giant Cloudflare, which has come under fire in the past for protecting ISIS and neo-Nazi websites.

Cloudflare told Motherboard it declined to comment on users of its products and said it was committed to making a more secure Internet for all.

Bluehost also shut down the website for Radio Wehrwolf, a podcast popular and widely shared among militant white nationalists, while Zencast—a podcast hosting service—told Motherboard it barred the white nationalist site from using its platform to stream the show for violating its terms of service.

“Our service is a place for expression and we encourage free speech and it is fine to express unpopular points of view, but we do not tolerate hate speech,” said a spokesperson for Zencast who asked not to be named for fear of retribution from the neo-Nazis associated with the shutdown. “Our service is not a place for engaging in any harassing, bullying, or threatening behavior, nor is it a place to incite others to engage in these activities.”

In perhaps the biggest takedown of all, Fascist Forge, a Facebook-like site for online Nazis that spanned everything from shitposting to the exchange of weapons manuals, disappeared for a second time after violating the terms of service for Hostinger International, its web hosting company. Hostinger International was notified of its affiliation with violent white nationalism by the Counter Extremism Project, an anti-extremism non-profit network.


“Hostinger acted with haste—and very rightly so—in suspending Fascist Forge, an online forum that promotes neo-Nazi violence and radicalises recruits," said David Ibsen, the CEP executive director in a company release posted online.

All of the takedowns occurred in or around the last month. Some members of the far-right believe that the US government is behind the takedown of these websites, but Motherboard was unable to confirm this.

Weeks before a white nationalist killed 50 people in a mosque in Christchurch, New Zealand, one well-known Gab user connected to Radio Wehrwolf posted, “there’s been a large crackdown on many [neo-Nazi] media outlets” pointing to the disappearance of the Atomwaffen Division site. Another Gab poster opined “how long has [the Atomwaffen Division site] been down? WTF!! Fascist Forge, now this, damn it!”

Some of the shutdowns were attributed by the far right to companies attempting to distance themselves from an online community linked to the Christchurch terror suspect. But with the exception of Radio Wehrwolf, these site takedowns predate the attacks in New Zealand, leading some to believe authorities are putting serious efforts into dismantling the wider online neo-Nazi ecosystem.

The FBI declined comment for this story.

But while web hosts have seemingly stepped up their efforts to take white nationalists and neo-Nazis offline, they continue to be active on social media, including Twitter, while many episodes of Radio Wehrwolf continue to be available on YouTube. YouTube did not immediately provide a comment for this story.


Motherboard has viewed the continued, completely undisturbed Twitter activities of users connected to The Base (an infamous international far-right network exposed by VICE) and Atomwaffen Division, even as some of those users clamoured for an “acceleration” of race tensions and openly celebrated the Christchurch attacks.

For example, the most well-known suspected Atomwaffen Division Twitter account, though set to private, continued to operate before and after those terror attacks, while one Base recruiter urged followers to join the secret network.

After Motherboard reached out for comment for this story, Twitter suspended two neo-Nazi Twitter accounts. Other Twitter users connected to those accounts have already noticed the suspensions and are protesting by encouraging other white nationalists to meet and organize in person “while you still can.”

According to Twitter, the company expanded its plan to combat online extremism in 2017, outlining tenets against affiliation with violent extremist groups and general hateful conduct.

“You may not promote violence against or directly attack or threaten other people on the basis of race, ethnicity, national origin, sexual orientation, gender, gender identity, religious affiliation, age, disability, or serious disease,” read the rules outlined by Twitter. “We also do not allow accounts whose primary purpose is inciting harm towards others on the basis of these categories.”


The broader conversation about how social media radicalizes militant white nationalists was highlighted once again in the aftermath of the Christchurch terror attacks.

The terrorist suspect widely believed to be responsible for the Christchurch attacks broadcast the attack in real time on Facebook Live and littered his manifesto—posted on 8chan and spread widely over Twitter—with numerous references to white nationalist internet-speak. His indoctrination within a wider online world of neo-Nazism became apparent as soon as news of the attack broke.

This week, Facebook announced that it would finally ban white nationalism and white separatism on the site. Facebook has pledged to use against white nationalists the same tactics and machine learning tools it employed to successfully dismantle the activities of ISIS, Al Qaeda, and other terrorist groups on its platform.

Thus far, Twitter hasn’t done the same. While Facebook has said it will ban content that says, for example, “I am a proud white nationalist,” Twitter’s hateful conduct policy is not as specific, and the company declined to give specifics about white nationalism and white separatism when asked by Motherboard.

Previously, when The Base accounts were flagged to Twitter by VICE in 2018, several accounts connected to the group were taken offline but reappeared months later under similar aliases. The tedious process of eliminating those accounts is not unlike the difficulty of identifying and suspending ISIS accounts in the wake of that group's online boom in 2014. Even so, Twitter has yet to present a coherent strategy to combat white nationalism on its platform.

Amarnath Amarasingam, a terrorism expert and senior research fellow at the Institute for Strategic Dialogue, carefully tracked the rise of ISIS online and understood the massive importance of Twitter in its development as an international terror group.

“Twitter was fundamentally important for ISIS fighters and supporters from 2013 onwards. They shared content, reached out to each other, and created a vibrant online community that thrived for many years,” said Amarasingam.

But many of the policy changes to combat ISIS on Twitter only happened after high profile attacks by the terror group. While Facebook has very publicly banned white nationalism, Twitter has yet to do the same even in the wake of attacks like Christchurch.

“Twitter finally took a real stance on kicking ISIS off its platform in 2015, and it had a massive impact. The lessons learned from that effort should and could be applied more forcefully to tackling white supremacist groups,” Amarasingam said. “These kinds of growing pains around policy choices seem to exist for a while before these social media companies take a real stance. Usually after attacks happen and they feel the pressure from governments.”