Tech

YouTube Decides to Leave Neo-Nazi Propaganda Online

Neo-Nazi propaganda, podcasts, and audiobooks continue to exist on YouTube. Even when alerted to them, the platform demonetized the videos and removed some features, but did not delete them.
Image: Shutterstock

In the aftermath of a neo-Nazi terrorist killing 50 people at mosques in Christchurch, New Zealand, tech companies ranging from web hosts to social media giants have banned or shut down militant far-right content online.

But even in the face of those horrific terror attacks, YouTube continues to be a bastion of white nationalist militancy. Over the last few days, Motherboard has viewed white nationalist and neo-Nazi propaganda videos on the site that have either gone undetected by YouTube, been allowed to stay up by the platform, or been newly uploaded.

When Motherboard showed specific examples to YouTube, the company told us that it demonetized the videos, placed them behind a content warning, removed features such as likes and comments, and pulled them from recommendations, but ultimately decided to leave them online. The videos remain easily accessible via search.

The spread of neo-Nazi and white nationalist propaganda on YouTube is a years-long problem. While YouTube took swift action against the Islamic State in the wake of a series of high-profile beheadings and terrorist attacks, it has repeatedly failed to remove neo-Nazi and white nationalist content at scale. A Motherboard investigation in 2018, for example, documented a vast trove of neo-Nazi content shared on YouTube. A Bloomberg investigation published Tuesday showed that YouTube chased "engagement" at the expense of the health of its platform by allowing toxic videos to stay online.

Over the years, YouTube has disabled and deleted some neo-Nazi channels, but a lot of content still lives online. For example, in March 2018, YouTube disabled hate group Atomwaffen Division’s central channel where the group posted videos calling for a “race war” and took down another pro-Atomwaffen channel in January of this year.

But everything from the propaganda of neo-Nazi organizations and well-known white nationalist podcasts to the genocidal writings of the far right continues to live undisturbed on the popular video platform, Motherboard has found.

Several well-known Atomwaffen propaganda videos, showing masked men with weapons giving the “sieg heil” salute and urging potential members to join, were still available on YouTube as of the publication of this article; one was uploaded within the last 24 hours. These are reuploads of videos that YouTube has previously taken down, indicating that the company isn’t always automatically detecting them.

Ironically, in one video Atomwaffen asks for its content to be shared on social media.

“Spread and mirror our work on various social media: YouTube, Facebook, Twitter, etc.” reads a message in “Join your local Nazis!”, a well-known Atomwaffen Division propaganda video.

Last month, web hosting service Bluehost shut down a prominent Atomwaffen website, which means that YouTube remains one of the only places online where the Siege manifesto—something of a bible for Atomwaffen, written by influential neo-Nazi figure James Mason in the eighties—can be easily found as an audiobook.

In Siege, Mason describes how to undertake leaderless resistance, targeted killings of American politicians, and violent atrocities against minorities, ideas that Atomwaffen adopted as its central ideology. The book has drawn comparisons to The Turner Diaries, another white nationalist work, which is cited as a direct influence on Oklahoma City bomber Timothy McVeigh, whose attack was the largest domestic terror incident in US history before 9/11.

Meanwhile, Bluehost and Zencast banned the neo-Nazi podcast Radio Wehrwolf, but several of its episodes remain available on YouTube, including an interview with Mason. Following those takedowns, Radio Wehrwolf used YouTube to rally its followers, disseminating links to episodes and promising to bring back the podcast.

“On a little hiatus there with the website. We're gonna get that back up and running soon,” said one of the hosts of Radio Wehrwolf in a more than two-hour-long YouTube video, denigrating the “bullshit terms of service” violations that took him down. In the same show, the host calls for the death of a member of the Southern Poverty Law Center, a watchdog that tracks far-right extremism.

YouTube asked Motherboard to forward links to several neo-Nazi videos. The company confirmed that copies of the Radio Wehrwolf show and of Siege are still streaming on its platform. Unlike Facebook, the company continues to allow the white nationalist podcast to use its services. YouTube told Motherboard that it didn’t take down the content it was sent, but that it demonetized the videos, removed comments and likes, and placed them behind a warning message.

Despite the overwhelming evidence of militant neo-Nazi content on its platform, YouTube says it’s taking the issue seriously. In an interview with the New York Times last week, Neal Mohan, YouTube’s chief product officer, said the company is trying to police white nationalism, but that, unlike ISIS propaganda, it’s harder to detect and classify as hate speech.

“In the case of something like this,” said Mohan referring to white nationalist content, “the challenges are harder because the line, as you can imagine, is sometimes blurry between what clearly might be hate speech versus what might be political speech that we might find distasteful and disagree with, but nonetheless is coming from, you know, candidates that are in elections and the like.”

A YouTube spokesperson told Motherboard the site doesn’t tolerate hate speech or the promotion of violence, but stopped short of an outright ban on white nationalism, a step Facebook recently pledged to take.

“Hate speech and content that promotes violence have no place on YouTube,” the spokesperson said. The streaming giant added that it was investing in human moderators and AI tools to remove hateful content that violates its terms of service.

“Over the last few years we have heavily invested in human review teams and smart technology that helps us quickly detect, review, and remove this type of content,” they added. “We remove millions of videos that violate our policies every quarter, the majority of which are first flagged by our automated systems.”

But the Counter Extremism Project—an anti-extremism non-profit network based in the US—has highlighted YouTube’s continued refusal to remove Siege from its online library.

“I think it’s because they’re not taking the threat seriously,” said CEP researcher Joshua Fisher-Birch in an interview with Motherboard on why white nationalist content continues proliferating on YouTube.

Fisher-Birch said he’s logged recent uploads of Siege and the Turner Diaries that have thousands of views, while several new uploads of Atomwaffen propaganda appear almost daily.

Motherboard recently saw an active Siege audiobook channel on YouTube clocking thousands of streams.

The CEP researcher believes the difference between YouTube’s pronounced, well-documented banning of ISIS content and its handling of white nationalist material comes down to public image and its business model.

“With ISIS and ISIS content YouTube was getting a lot of bad press,” said Fisher-Birch. “It was really becoming an issue with advertisers pulling out because they were worried about their brands being associated with ISIS content. And there hasn’t been the same level of public or corporate sponsorship outrage about this kind of content.”

After beheading videos and ISIS statements repeatedly appeared first on YouTube in 2014, the company took a harsh and deliberate stance against the terror organization, pouring vast resources into the problem.

Along with Facebook, Microsoft, and Twitter, YouTube is a founding member of the Global Internet Forum to Counter Terrorism, an initiative formed specifically to thwart terror groups' use of social media to recruit and prosper online. But the collective has recently come under fire over its handling of terrorism born of white nationalism.

On the GIFCT website, YouTube maintains that 98 percent of the videos it pulls offline are "flagged by machine-learning algorithms," which helps its "human reviewers remove nearly five times as many videos than they were previously."

But an official YouTube Twitter account claims it’s harder to detect white nationalist videos than content from jihadist terror groups.

“Many violent extremist groups, like ISIS, use common footage and imagery. These can also be signals for our detection systems that help us to remove content at scale,” it said in a tweet thread.