The Camp Fire in California has killed at least 79 people, left 699 people unaccounted for, and displaced more than a thousand residents of Butte County, California. In these circumstances, reliable information can literally be a matter of life and death. But on YouTube, conspiracy theories are thriving.
Currently, when a user starts typing “California fire” into YouTube, the top autocomplete search suggestions are “conspiracy 2018,” “agenda 21,” and “laser beam,” all of which refer to conspiracy theories related to California’s wildfires. Similarly, typing in “California wildfire” leads YouTube to suggest “lasers,” “directed energy weapon,” and “dew,” an acronym for “directed energy weapon.” Searching for “California fire” itself does return straightforward news coverage, which is an improvement over, say, the false flag and crisis actor conspiracies YouTube was surfacing about the Marjory Stoneman Douglas High School mass shooting earlier this year.
Believers of this false California wildfire conspiracy theory think that the US government shoots directed energy weapons, or lasers, from a plane in order to start fires at predetermined targets. The goal of this attack, which is not actually happening, is supposedly to support a bastardized interpretation of “Agenda 21,” a sustainable development plan developed by the United Nations in 1992. This conspiracy theory also gained a little bit of traction on Twitter.
The conspiracy theorists use doctored or out-of-context images to falsely argue that directed energy weapons, or laser beams, caused the wildfires in California—not climate change. By incorrectly claiming that homes were deliberately targeted, these theorists ignore ecological science, which explains that the arrangement of homes and the topography of the land make wildfires prone to destroy some homes and spare others.
But the point isn’t that these conspiracy theorists are wrong. They obviously are. The point is that vloggers have realized they can amass hundreds of thousands of views by advancing false narratives, and YouTube has not adequately stopped these conspiracy theories from blossoming on its platform, despite the fact that many people use it for news. A Pew Research survey found that 38 percent of adults consider YouTube a source of news, and 53 percent consider the site important for helping them understand what’s happening in the world.
YouTube did not directly address misinformation relating to the California wildfires on its platform, but in an email to Motherboard, a YouTube spokesperson noted that the platform has a Breaking News shelf on the homepage, which directs users to credible news organizations in 23 countries.
Images: Screenshots from YouTube by Caroline Haskins.
Misinformation and conspiracy theories are a flagrant, long-term problem on YouTube. Earlier this year, the number one trending video on YouTube falsely suggested that survivors of the Parkland shooting, such as David Hogg, were crisis actors. The platform has also fueled “Flat Earth” conspiracies, which are now so popular that the first-ever international Flat Earth conference was held this year. Most notoriously, YouTube delayed removing Alex Jones from its platform for years, despite his frequent violations of the company’s Community Guidelines. When YouTube finally deplatformed Jones in August, other platforms like Spotify, Apple, Facebook, and eventually Twitter followed suit.
Micah Schaffer, a former community manager at YouTube, said at the Content Moderation & Future of Online Speech Conference in October that the risk of hosting fringe views on YouTube, including hate speech and conspiracy theories, is that these views benefit from normalization simply because they’re hosted on a platform alongside mainstream, popular content.
“People aren’t talking about which conspiracy theorists are being banned from Vimeo,” Schaffer said. “They’re worried about which ones are being banned from Twitter, Facebook, and YouTube, and that’s because of the distribution they provide. It’s because of the legitimacy and the adjacency from the content. They are free riding on that community that was built on content that was specifically not them.”
YouTube did attempt to curb misinformation in March of this year by including a Wikipedia “fact check” at the top of search results for certain keywords, such as “climate change” or “global warming.” But measures like this are easily evaded. Conspiracy theorists commonly don’t use words like “climate change” or “global warming” in their video titles. This means that content creators avoid triggering the Wikipedia fact check, even if they dispute the evidence for climate change in their videos (which they often do). For the record, there is an overwhelming scientific consensus that climate change is real and caused by humans.
Granted, we don’t currently have data on how frequently people actually click on Wikipedia fact checks. It’s also possible that YouTubers avoid fact-checking trigger words by accident. But intentional or not, it’s worth noting that evading fact checks is built into the way conspiracy content creators use YouTube.
YouTube is also friendly to conspiracy theorists in other ways. The platform will recommend “New” videos within the results of a search query. The prioritization of new videos encourages YouTube conspiracy content creators to post multiple videos that expand upon a particular conspiracy theory. It also increases the likelihood that people searching for conspiracy theories will be recommended content that relates that particular conspiracy theory to a current event, such as the California wildfires. In an email to Motherboard, a YouTube spokesperson noted that its search results also consider the title, description, video content, and engagement of videos in its ranking.
Image: Screenshot from YouTube by Caroline Haskins, annotations by Caroline Haskins.
It’s also worth noting that accounts such as ODD Reality and Moe Othman, which have posted conspiracy content about the California wildfires as well as other conspiracy theories, have verification check mark badges from YouTube. According to Google, this mark means that “the channel belongs to an established creator or is the official channel of a brand, business, or organization.”
YouTube claims that badges are subject to removal if accounts engage in “spam, misleading metadata, and scams,” which violate the site’s Community Guidelines. However, the site often relies on user reports in order to respond to these cases. A YouTube spokesperson said that YouTube is committed to being a platform for free speech, and that the company is also committed to enforcing its Community Guidelines.
In order to understand how the misinformation ecosystem on YouTube works, it’s useful to consider Becca Lewis’s “Alternative Influence” report, published by Data & Society. In the report, Lewis explains how users can easily be radicalized to the far-right. This radicalization is made possible through a network of video collaborations between figures from both ends of the political spectrum. For instance, mainstream content creators like Joe Rogan habitually collaborate in videos with far-right figures of the “Intellectual Dark Web,” a community united by racist, misogynist, and anti-LGBTQ hate speech.
Through this network of collaborations, which Lewis calls the “Alternative Influence Network,” users can be directed toward more extreme content by either searching for the people in these videos, or by having these videos recommended by YouTube. The network builds trust and inclusivity by sharing a common brand in which reliability and perceived authenticity legitimize people’s content more so than a basis in fact, or connections to a fact-gathering organization.
The network of YouTube conspiracy theorists works similarly to the network of YouTube political extremists: prominent conspiracy theorists don’t commonly collaborate in videos, but they do share clips and links to other conspiracy theorists’ videos, and they engage with the narratives proposed by other theorists by debating the minute details of a theory. (For instance, some conspiracy theorists debate the specific fictional government projects that would benefit from destroying billions of dollars of private property.)
This video structure can implicitly encourage viewers to seek out alternative perspectives on the same topic. In essence, YouTube conspiracy theorists benefit from the communal structure of their scene, in which viewers engage with a conspiracy theory by watching videos across multiple channels.
It’s worth noting that directed energy weapon conspiracy theories are far from new. A small group of fringe individuals have alleged that tragedies such as 9/11 were perpetrated by the US government using directed energy weapons. This is obviously not true. The terror attacks were demonstrably organized by Al-Qaeda.
The problem of conspiracy theorists in society far predates platforms like YouTube. However, platforms such as YouTube provide new fuel and new routes through which these conspiracy theories can spread. In other words, conspiracy theories are the disease, but YouTube is a whole new breed of carrier.