
Can Wikipedia Solve YouTube's Conspiracy Theory Problem?

The Wikimedia Foundation wasn’t given any warning before Susan Wojcicki announced at SXSW that YouTube plans to use its content to debunk fake news.

YouTube has a conspiracy theory problem, and the platform is trying to fix it by outsourcing the work to Wikipedia.

Tuesday evening, YouTube CEO Susan Wojcicki announced that YouTube will begin displaying links to Wikipedia articles within conspiracy theory videos, in an effort to debunk them.

People within the Wikimedia Foundation, the nonprofit organization that operates Wikipedia, are unclear on how, exactly, this will work, and what will happen once YouTube trolls begin targeting Wikipedia articles to game this new system. This is in part because YouTube gave the Wikimedia Foundation no notice that this was happening.


“YouTube is outsourcing responsibility for the truth.”

It's concerning that YouTube, a company that generates billions of dollars in revenue a year, decided to outsource one of its biggest problems to a nonprofit organization. But research shows that Wikipedia does have an effective method for combating hoaxes and conspiracy theories. It's not a complete solution, but if implemented properly, it could at least point people who watch videos about the Earth being flat toward better, factual information.

YouTube didn't even tell Wikipedia about its plans

“We are always happy to see people, companies, and organizations recognize Wikipedia’s value as a repository of free knowledge,” the Wikimedia Foundation said in a statement emailed to Motherboard. “In this case, neither Wikipedia nor the Wikimedia Foundation are part of a formal partnership with YouTube. We were not given advance notice of this announcement.”

Katherine Maher, executive director of the Wikimedia Foundation, tweeted after the announcement that this was done independently of Wikimedia, meaning that YouTube neither consulted nor briefed the organization before the SXSW announcement.

“We’re always exploring new ways to battle misinformation on YouTube,” a spokesperson for YouTube told Motherboard. “At SXSW, we announced plans to show additional information cues, including a text box linking to third-party sources around widely accepted events, like the moon landing. These features will be rolling out in the coming months, but beyond that we don’t have any additional information to share at this time.”


Longtime Wikipedia editor and community member Liam Wyatt, the world’s first “Wikipedian in Residence” at the British Museum and founder of the GLAM-Wiki movement, told me that he, too, first heard of the announcement from Wired’s coverage of the panel.

“YouTube is outsourcing responsibility for the truth,” Wyatt told me in a phone interview. “Their job is not to be responsible for truth, that’s not their mission. But they’re abrogating their responsibility to society by saying there is some organization that can answer for that, and we don't have to deal with it.”

Wyatt said it’s great that YouTube recognizes the contributions Wikimedia’s volunteer-run effort has made to information-sharing, but called it “kind of ironic” that a multibillion-dollar company is pointing to Wikipedia’s donation-driven, crowdsourced community as the best means of dealing with hoaxes and conspiracies.

With YouTube essentially springing this announcement on Wikipedia’s cadre of volunteer editors and community members, it’s also worth questioning whether Wikipedia is ready for what could come next: trolls and conspiracy theorists flooding the platform to try to influence what’s displayed on YouTube.

Wikipedia is much better than most platforms at fighting misinformation

Wikipedia’s pretty good at catching bad actors on its site. A 2016 study by University of Maryland and Stanford University researchers explored how hoaxes, disinformation, and “bullshit” (a scientific term, in this case) enter Wikipedia, and how the platform handles malevolent edits. They found that 99 percent of attempted hoaxes are caught immediately, within the first minute after they pop up on the site. A small fraction of hoaxes last longer, but the ones that do survive the initial patrol by editors “attract significant attention and a large number of pageviews,” according to the study.

“There are so few of us and so much crap, basically, on the web.”


I asked Bob West, one of the researchers on that study and now an assistant professor at the Swiss Federal Institute of Technology, what he made of YouTube's move. “It’s a great step, they should be commended for doing that,” he said. “It’s not gonna fix the problem completely.”

West told me that there are stupid hoaxers and smart hoaxers: The less-clever ones will set up a new account and create a fake article immediately. A more savvy hoaxer might tinker around Wikipedia for a while, fixing commas and small errors in other articles, before trying to plant a new hoax. The smartest ones will seed other articles with hints for the hoax to come, and then make their big move much later, once they’ve “primed” the system to agree with their new disinformation campaign.

West believes that bringing more humans into the mix—essentially making everyone on the internet a fact-checker—could curb hoaxes before they spread.

“There are so few of us and so much crap, basically, on the web,” he said. “You don’t want [algorithms] to become self-referential… If you take the human out of the loop completely the algorithm reproduces its own actions, and bites its own tail.”

Wyatt feels confident that any sudden influx of trolls coming from YouTube won’t crush the site or any specific pages. “If it was a controversial topic before, then it would already have people monitoring it, and editing restrictions from drive-by vandalism in place,” he said in a Google Hangouts chat. “If YouTube suddenly sent lots of conspiracy theory people towards one obscure article—that would be a temporary problem for us. But if they're sending people to a variety of well-established and well-monitored articles, no problem.”