
YouTube Regularly Leaves Islamic State Videos Up for Days, Weeks

According to a Motherboard analysis of over 150 videos shared in pro-Islamic State groups, while YouTube is quick to remove offending clips in some cases, the social network is still making serious errors in policing extremist content.
A screenshot of a video, still online at the time of publication, showing the Tehran attackers giving an interview before the attack. Image: YouTube

Slumped on the floor wedged between a filing cabinet and a large wooden desk, a man lies motionless, blood oozing from his torso. One gunman, perhaps the killer, walks past the body carrying a rifle.

This visceral footage was taken by Islamic State terrorists during their attack on Tehran's Parliament building earlier this month. Pro-Islamic State social media accounts disseminated the video as quickly as possible. YouTube, where many copies of the clips were hosted, started to remove them. Some disappeared within 24 hours. But at least one video remained on YouTube for over a week, available for anyone to watch.

That clip is just one included in a Motherboard analysis of over 150 YouTube videos recently shared by Islamic State supporters. The analysis found that YouTube's approach to policing extremist content is, although sometimes swift, scattered and often ineffective, raising serious questions about whether the social network has done enough to stop the proliferation of pro-Islamic State videos. YouTube knows there is a problem: On Sunday, the company announced several updates to how it will try to curb extremist material.

In all, Motherboard kept tabs on 164 unique YouTube URLs which were shared in pro-Islamic State media channels, noted the videos' content, and recorded how long YouTube took to remove the clips or take other action. Our analysis found:

  • A video including an alleged audio message from an Islamic State spokesperson calling for attacks during Ramadan is still online, a week after being uploaded.
  • Several battlefield and street battle clips branded by the Islamic State's de facto news agency Amaq are still online, two weeks after being uploaded.
  • A gruesome video of the Tehran attack remained on YouTube for nine days.

The findings come as the UK government says social media sites, including YouTube, need to do more to remove extremist content from their networks, or potentially face being fined.

"These videos, which may contain graphic imagery and urge for attacks in the West, often remain for months at a time," Rita Katz, the director and co-founder of terrorism analyst firm SITE Intelligence Group, told Motherboard.

Hours, Days, Weeks

Over a two-week period, Motherboard monitored channels on Telegram—a social network and messaging app popular with Islamic State supporters—used to spread extremist propaganda. These included channels that shared material from Amaq; outlets such as Nashir, which republish Islamic State announcements; and a sea of other supportive accounts that regularly post similar extremist videos. The sample included clips of Islamic State fighters in street combat in the Philippines; members destroying Christian religious sites; and instructions on how to commit attacks with knives and vehicles. Several long, half-hour propaganda videos with ideological speeches were also shared.

To be clear, YouTube did remove the vast majority of the analyzed videos. But many stayed accessible for hours, days, or longer. Of the clips YouTube did remove, around half were deleted within 24 hours; the rest stayed online for between two and 15 days.

Got a tip? You can contact this reporter securely on Signal at +44 20 8133 5190, or email joseph.cox@vice.com

In SITE's own experience of tracking clips, "We did notice that official ISIS videos are often removed from YouTube faster—at times even instantly. Videos by ISIS' Amaq News Agency, however, remain for much longer," Katz said. "The same goes for pro-ISIS media groups."

"These videos, which may contain graphic imagery and urge for attacks in the West, often remain for months at a time."

When deleted, the videos are replaced with a variety of messages. In the majority of cases, they are removed for "violating YouTube's Terms of Service," while only a handful were removed for specifically violating YouTube's policy on violent or graphic content. YouTube also deleted entire user accounts, and all of their videos with them.

In some cases, YouTube decided not to remove the video entirely, but instead put it behind a "content warning," meaning users had to confirm they wished to view content that "may be inappropriate" for some viewers. YouTube put this warning on a series of videos showing an alleged white phosphorus attack by the US-led coalition on the Syrian city of Raqqa.

How the Islamic State Avoids YouTube's Censors

The 150-plus videos Motherboard tracked also show the various tactics Islamic State supporters use to try to bypass YouTube's censors. In one case last week, Telegram channels simultaneously pushed a flurry of videos that consisted of a still image of Arabic text combined with apparent audio of Abi al-Hassan al-Muhajer—the Islamic State's official spokesperson—calling for more attacks during the holy month of Ramadan. According to Motherboard's analysis, YouTube rapidly removed the main wave of recordings, but a few videos still broke through: Some included a different still image (the same Arabic text but this time with a white border), while another just presented a plain black screen with the speech laid on top. At the time of writing, the latter video is still online.

That clip was also 'unlisted,' meaning it can only be viewed with a direct link and not discovered with YouTube's normal search features—another tactic that experts say uploaders use to dodge bans. Some videos included short, chopped-up snippets of material that YouTube has previously removed, allowing the clips to stay up longer.

YouTube also removed the majority of the Tehran attack videos in under 24 hours, but one in particular persisted. It contained the original footage of the attack, but crucially, included no audio whatsoever. That attack video remained on YouTube for nine days.

"They're very clever at adapting," Steve Stalinsky, executive director of the research group MEMRI told Motherboard in a phone call.

"IS supporters commonly re-upload previously removed content onto YouTube and distribute the new links a short time later."

Presumably, removing the audio fooled YouTube's systems for spotting previously flagged videos. YouTube identifies videos via their "hash," a unique cryptographic fingerprint. If one video has already been removed, YouTube can automatically flag all other copies of it, as they will share the same hash. But by slightly tweaking the video, and in turn its hash, Islamic State uploaders are able to bypass YouTube's algorithms.
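As a rough illustration (YouTube's actual matching pipeline is not public), a cryptographic hash such as SHA-256 changes completely when even a single byte of a file changes, so a re-upload with the audio track stripped no longer matches the fingerprint of the removed original. A minimal Python sketch, with byte strings standing in for real video files:

```python
import hashlib

def video_fingerprint(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Stand-ins for real files: an original clip, and the same clip
# with its audio track removed (simulated here by dropping bytes).
original = b"video-frames" + b"audio-track"
audio_removed = b"video-frames"

print(video_fingerprint(original))       # one 64-character digest
print(video_fingerprint(audio_removed))  # a completely different digest
```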

YouTube also has a feature that allows ordinary users to flag a video as "Promotes Terrorism." From here, the video is put into an expedited queue, and a human team will investigate the issue.
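YouTube has not published how that queue works internally. As a minimal sketch of the general idea, an expedited queue can be modeled as a priority queue in which "Promotes Terrorism" reports jump ahead of routine flags; the priority values and flag reasons below are assumptions for illustration only.

```python
import heapq
from dataclasses import dataclass, field

# Assumed priorities for illustration: lower value = reviewed sooner.
PRIORITY = {"Promotes Terrorism": 0, "Violent or Graphic Content": 1, "Other": 2}

@dataclass(order=True)
class Report:
    priority: int
    video_id: str = field(compare=False)
    reason: str = field(compare=False)

def flag(queue: list, video_id: str, reason: str) -> None:
    """File a user report, expedited according to its reason."""
    heapq.heappush(queue, Report(PRIORITY[reason], video_id, reason))

queue = []
flag(queue, "abc123", "Other")
flag(queue, "xyz789", "Promotes Terrorism")

# The human review team pulls the most urgent report first.
report = heapq.heappop(queue)
print(report.video_id, report.reason)  # xyz789 Promotes Terrorism
```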

MEMRI has tracked this mechanism specifically against videos of support for jihadi fighters, which Stalinsky says may inspire potential attackers. The group found that, as of last week, 60 percent of the clips it flagged in 2015 have remained online.

Independent researcher Raphael Gluck also told Motherboard about his own experience using the manual flagging system.

"There are cases where I have reported and the offending URLs are not taken down at all," Gluck said. On Monday, Gluck pointed to several videos, including one of the Tehran attackers, that have been online for a week, and still are available at the time of writing. The channel's name is, simply "Islamic state."

It is worth bearing in mind that what one group sees as extremist content may differ from the social media company's perception, and there may be a legitimate public interest in some footage, such as battlefield images, remaining available to researchers or journalists. Indeed, YouTube also runs a "Trusted Flagger" program, which works with NGOs and experts to more accurately flag videos that should be removed.

"When official IS media releases a new video, the material hosted on YouTube via those links is suspended almost immediately. However, IS supporters commonly re-upload previously removed content onto YouTube and distribute the new links a short time later," Jade Parker, senior researcher associate in VNSA Cybersecurity and Terrorist Use of the Internet at research group TAPSTRI, told Motherboard. "Because those links aren't distributed via official channels, the videos are exposed to a smaller pool of individuals likely to report them. This can give rise to the content remaining on the platform for a longer period of time."

YouTube Steps Up Its Efforts

Although YouTube's counter-measures are clearly not always effective, with some material remaining online, Stalinsky from MEMRI says the company has improved at tackling this sort of content.

"If you look at how it was two years [ago], it's much better [now]," he said. Stalinsky said he has met with YouTube to discuss the problem, and said YouTube was originally "hostile" in meetings, not open to suggestions, and was focused on deleting individual videos rather than entire accounts.

On Sunday YouTube announced improvements to its current methods, as well as some new machine-learning approaches for identifying offending content. Motherboard understands that this will scan videos automatically and look for similarities with previously flagged videos. From here, the system will mark the upload as potentially extremist, and a human analyst will review the video's content and context.
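The details of that system are not public. A minimal sketch of the general idea, assuming each upload is reduced to a 64-bit perceptual fingerprint in which visually similar videos differ in only a few bits (the fingerprints and threshold below are made up for illustration):

```python
def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two 64-bit fingerprints."""
    return bin(a ^ b).count("1")

# Assumed data: fingerprints of previously flagged videos, and a
# bit-difference threshold below which two videos count as "similar."
KNOWN_FLAGGED = {0xDEADBEEFCAFEF00D}
THRESHOLD = 10

def triage_upload(fingerprint: int) -> str:
    """Route near-duplicates of flagged videos to a human analyst."""
    for known in KNOWN_FLAGGED:
        if hamming_distance(fingerprint, known) <= THRESHOLD:
            return "potentially extremist: send to human review"
    return "no match: publish normally"

# A lightly edited re-upload flips only a few fingerprint bits, so it
# still matches—unlike with an exact cryptographic hash.
print(triage_upload(0xDEADBEEFCAFEF00F))  # sent to human review
print(triage_upload(0x123456789ABCDEF0))  # published normally
```

Unlike the exact-hash matching described above, similarity matching of this kind is much harder to defeat with small edits such as stripping the audio or adding a border.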

When asked for comment, a YouTube spokesperson pointed Motherboard to the company's announcement.

"While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now," Kent Walker, the senior vice-president and general counsel of Google, wrote.
