Caption: Atomwaffen members. Image: YouTube

YouTube Is Full of Easy-to-Find Neo-Nazi Propaganda

In a software-aided investigation, Motherboard found that while YouTube has cracked down on pro-ISIS material, the video giant leaves neo-Nazi propaganda online for months and sometimes years at a time.

“Where will you be when the race war begins?” the propaganda for Atomwaffen Division, an American neo-Nazi group linked to several murders in the US, says. “Join your local Nazis.” The phrases were included in a video that YouTube removed along with the rest of Atomwaffen’s channel last month.

But that removal was closer to a one-off than an extended campaign: YouTube is still hosting exact copies of that video and others from Atomwaffen. Through a software-aided investigation, Motherboard has found that while YouTube has managed to clamp down on Islamic extremists uploading propaganda, the video giant is still awash with videos supporting violent and established neo-Nazi organizations, even when, in some cases, users have reported the offending videos. Clips of neo-Nazi propaganda operations, hate-filled speeches, and extremists pushing for direct action have remained on the site for weeks, months, or years at a time.

“Their rhetoric and calls for violence is of the most extreme nature,” Joanna Mendelson, senior investigative researcher at the Anti-Defamation League’s (ADL) Center on Extremism, told Motherboard, referring to Atomwaffen.

Examples of neo-Nazi videos on YouTube include:

  • A video celebrating the one-year anniversary of the neo-Nazi website Iron March says “Gas the K***s, race war now!” and has been left online since 2012.
  • A “call to action” from the Nordic Resistance Movement, an established neo-Nazi group in Sweden, Finland, and Norway, which Finland has moved to outlaw. The video includes narration from the late Robert Jay Mathews, an American neo-Nazi, lays out plans for “total Aryan victory” after breaking “the chains of Jewish thought,” and adds that they will go “high above the mud of yellow, black and brown,” coupled with images of punching a person of color.
  • A UK anti-refugee protest and speech, which includes a participant holding a flag from National Action, a neo-Nazi group the UK government banned as a terrorist organization last year. Other videos also include National Action banners.
  • Several exact copies of Atomwaffen videos, including a clip calling for a “white revolution,” one showing Atomwaffen supporters distributing propaganda posters around a US university, and several encouraging viewers to join local neo-Nazi groups. These were uploaded on the same day YouTube banned the original Atomwaffen channel in February.

Many, if not all, of these videos arguably fall under YouTube’s own policy on hate speech, which “refers to content that promotes violence against or has the primary purpose of inciting hatred against individuals or groups based on certain attributes,” including race or ethnic origin, religion, and sexual orientation, according to the policy.

Motherboard built a tool to monitor YouTube and make a record of when the platform removed certain videos, and limited the clips to propaganda for established neo-Nazi and far-right terrorist organizations like Atomwaffen, rather than people in the so-called “alt-right.” Most of the videos were discovered through simple YouTube searches of relevant organizations’ names, or sometimes through the “recommended videos” sidebar after Motherboard had built up a browsing history of neo-Nazi material.
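Motherboard has not published the tool itself, but the general approach, repeatedly checking whether a list of video IDs still resolves and logging any change, is simple to sketch. Below is a minimal illustration in Python, not the actual tool; the video IDs, file name, and use of YouTube's public oEmbed endpoint are assumptions made for the example.

```python
# Minimal sketch (not Motherboard's actual tool): periodically check whether a
# list of YouTube video IDs still resolves and log the result, so removals can
# be timestamped. Uses YouTube's public oEmbed endpoint, so no API key is needed.
# The video IDs and file name below are placeholders.
import csv
import time
from datetime import datetime, timezone

import requests

WATCHLIST = ["VIDEO_ID_1", "VIDEO_ID_2"]  # replace with the IDs being monitored
LOGFILE = "video_status_log.csv"
POLL_INTERVAL_SECONDS = 60 * 60  # check once an hour

def is_video_up(video_id: str) -> bool:
    """Return True if the video still resolves publicly via oEmbed."""
    resp = requests.get(
        "https://www.youtube.com/oembed",
        params={"url": f"https://www.youtube.com/watch?v={video_id}", "format": "json"},
        timeout=10,
    )
    # 200 means the video is public; 404/403/401 usually mean it was removed,
    # made private, or had embedding disabled.
    return resp.status_code == 200

def poll_once() -> None:
    now = datetime.now(timezone.utc).isoformat()
    with open(LOGFILE, "a", newline="") as f:
        writer = csv.writer(f)
        for video_id in WATCHLIST:
            writer.writerow([now, video_id, "up" if is_video_up(video_id) else "down"])

if __name__ == "__main__":
    while True:
        poll_once()
        time.sleep(POLL_INTERVAL_SECONDS)
```

A checker like this only records when a video stops resolving; it cannot say whether the clip was taken down by YouTube, deleted by the uploader, or simply made private.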

For the sake of comparison, over a week-long period Motherboard also tracked pro-ISIS videos uploaded by the group’s supporters and then distributed through a network of Telegram channels. Typically, YouTube removed these Islamic extremist videos in a matter of hours, including those that did not contain images of violence but were instead speeches or other content that was not directly violent.

“There’s no question that some social media and video sharing platforms including YouTube have made tremendous strides in recent months not only in the amount of terror related videos pulled but the swiftness in which videos have been removed,” Raphael Gluck from research group JihadoScope told Motherboard in an email.

Got a tip? You can contact this reporter securely on Signal on +44 20 8133 5190, OTR chat on jfcox@jabber.ccc.de, or email joseph.cox@vice.com.

But YouTube is playing catch-up with neo-Nazi material. The platform removed only two of the videos Motherboard was monitoring: two identical clips of a speech from UK terrorist organization National Action.

Whereas pro-ISIS uploaders often use various techniques to bypass YouTube’s abuse detection systems, such as removing audio from clips or encasing videos in a black border, some of the neo-Nazi videos, including ones online at the time of writing, appear to be exact copies of other previously removed clips. YouTube is also aware of many of the neo-Nazi clips, because users have already flagged them as offensive, and some features such as comments, likes, or recommended videos have been disabled on them.

“The following content has been identified by the YouTube community as inappropriate or offensive to some audiences,” a banner on one of the mirrored Atomwaffen videos, as well as many others, reads.

Last year, YouTube announced it was going to use machine learning to identify terrorism-related content by looking for similarities with previously flagged material. In a more recent blog post from December, YouTube CEO Susan Wojcicki said 98 percent of videos removed for violent extremism are flagged by the platform’s machine-learning algorithms.
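YouTube has not detailed how that matching works. As a rough illustration of the general technique of comparing new uploads against previously flagged footage, and emphatically not YouTube's actual system, the sketch below computes perceptual hashes of sampled frames with the open-source OpenCV and imagehash libraries; the file names and thresholds are placeholders.

```python
# Illustrative only: one common way to spot re-uploads of known footage is to
# compare perceptual hashes of sampled frames. This is NOT YouTube's system,
# just a sketch of the general technique using OpenCV and the imagehash library.
# File names are placeholders.
import cv2
import imagehash
from PIL import Image

def frame_hashes(video_path: str, every_n_frames: int = 30):
    """Sample frames from a video and return their perceptual hashes."""
    hashes = []
    cap = cv2.VideoCapture(video_path)
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % every_n_frames == 0:
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # OpenCV decodes to BGR
            hashes.append(imagehash.phash(Image.fromarray(rgb)))
        index += 1
    cap.release()
    return hashes

def looks_like_reupload(candidate_path: str, known_clip_path: str, max_distance: int = 8) -> bool:
    """Return True if most sampled frames of the candidate closely match the known clip."""
    known = frame_hashes(known_clip_path)
    candidate = frame_hashes(candidate_path)
    if not candidate or not known:
        return False
    # Subtracting two ImageHash objects gives their Hamming distance
    matches = sum(1 for h in candidate if any(h - k <= max_distance for k in known))
    return matches / len(candidate) > 0.5

# Example with placeholder file names:
# print(looks_like_reupload("new_upload.mp4", "previously_flagged_clip.mp4"))
```

Tricks like stripping audio or adding a black border, which pro-ISIS uploaders reportedly use, are exactly the kinds of transformations this sort of frame-level matching has to tolerate, which is one reason the problem is harder than comparing exact file hashes.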

Caption: A still taken from a propaganda video for Atomwaffen Division. Image: YouTube

YouTube confirmed to Motherboard that the video giant is applying these new techniques to hate-speech related content. (When asked other questions, YouTube pointed to its December blog post.) Using machine learning to, say, identify neo-Nazi group logos on flags or banners could be fruitful, but it appears YouTube is not there yet, or at least not at the level of takedowns it has achieved with Islamic extremist content.

YouTube’s terms of service may be fine on paper, “but if they’re not enforced, that becomes the problem,” said ADL’s Mendelson.

One researcher has already shown that an algorithmic approach to identifying far-right imagery can be successful. Ex-NSA hacker Emily Crose built NEMESIS, a program that automatically detects hate symbols in social media posts, imagery, and video. In one example, NEMESIS picked out far-right symbols on shields carried during a march.
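NEMESIS's internals are not public, so the following is only a rough sketch of what automated symbol detection can look like: an object-detection model run over sampled video frames. The checkpoint file, class labels, and confidence threshold here are hypothetical placeholders, not NEMESIS's.

```python
# Illustrative sketch only, not NEMESIS itself: run an object-detection model
# over sampled video frames and report any hate symbols it recognizes.
# The checkpoint file "symbol_detector.pt" and the class list are hypothetical.
import cv2
import torch
import torchvision
from torchvision.transforms.functional import to_tensor

CLASSES = ["background", "swastika", "black_sun", "group_logo"]  # hypothetical labels

def load_detector(checkpoint_path: str = "symbol_detector.pt"):
    # Faster R-CNN with a detection head sized to the hypothetical class list
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(num_classes=len(CLASSES))
    model.load_state_dict(torch.load(checkpoint_path, map_location="cpu"))
    model.eval()
    return model

def detect_symbols(model, video_path: str, every_n_frames: int = 30, threshold: float = 0.7):
    """Return (frame_index, label, score) tuples for detections above the threshold."""
    detections = []
    cap = cv2.VideoCapture(video_path)
    index = 0
    with torch.no_grad():
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if index % every_n_frames == 0:
                rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # OpenCV decodes to BGR
                output = model([to_tensor(rgb)])[0]
                for label, score in zip(output["labels"], output["scores"]):
                    if float(score) >= threshold:
                        detections.append((index, CLASSES[int(label)], float(score)))
            index += 1
    cap.release()
    return detections
```

Training such a detector requires labeled examples of each symbol in context, which is why Crose's point below about European groups' imagery matters: a model can only flag what it has been shown.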

“I think we were also able to demonstrate that it’s possible for a company to create a system that does what NEMESIS does, and do it at scale,” Crose, who also has a background in Reddit moderation and so understands the challenge of managing a sprawling online community, told Motherboard in a phone call.

At Motherboard’s request, Crose processed a small selection of videos through NEMESIS. While two of the videos, including one from the Nordic Resistance Movement, didn’t raise any alarms (Crose said her team has not focused on image data for European groups, presumably a hurdle YouTube would need to overcome as well), NEMESIS detected neo-Nazi imagery such as swastikas and other symbols in the Iron March videos saying “Gas the K***s, race war now!”

Below is an example of NEMESIS detecting far-right symbolism from that video:

Clearly, the difficult task isn’t picking out particular imagery. Instead, “the hard part is actually joining that up with a sort-of context in order to make a judgement on whether the image that you’re looking at is being used for a white supremacist purpose or not,” Crose said.

That may be why far-right content, in some instances, poses more of a challenge for YouTube: some videos, although offensive, may be legitimate acts of protected speech, and a human will need to make the final call on whether a video should be removed. In its December blog post, YouTube’s Wojcicki said the plan is to increase the total number of people across Google working to address content that might violate its policies to over 10,000 this year.

That said, the Atomwaffen clips and many of the other videos Motherboard monitored are unambiguous examples of hate speech, in some cases calling for the murder of particular ethnic groups.

“The real problem doesn’t become an issue of technology; the real question is one of will,” Crose said.

Update: After the publication of this article, YouTube removed a number of the videos Motherboard monitored, including the mirrors of Atomwaffen propaganda.