Could Wikipedia Help Crowdsource Politics?
While 'Wikipolitics' could help politicians connect with public opinion, there's no such thing as consensus on the world's largest encyclopedia.
Wikimania 2014. Image: Wikimedia/Henry Kellner
Imagine if government policy were created using Wikipedia as a source, with politicians monitoring the discussions and edits made to pages, and accepting the resulting page as the voice of the people, speaking in online solidarity. It's an exciting prospect: crowd-sourced opinion finally playing a role in a new regime of "Wikipolitics."
This concept was put forward by Carl Miller, co-founder and research director of the Centre for the Analysis of Social Media (CASM) at UK think-tank Demos, in his presentation at Wikimania 2014, three days of talks on the world's biggest encyclopaedia held last weekend at London's Barbican Centre. He presented Wikipedia as the missing link between knowledge and power.
But the idea of a Wikipolitics might also be disastrous. Even though Wikipedia offers an alternative form of ostensibly democratic media, it relies on sourcing from traditional news outlets for credibility. And despite its impression of neutrality, it is as prone to bias as the people who edit it. Given this, plus the general confusion around Wikipedia editing (Is it local? Is it global? Is it done by paid editors and bots?), presenting the site to politicians as a font of grassroots insight might not be the wisest option.
Indeed, I found it surprising how many of the Wikimania panels voiced doubt over the system, addressing topics like paid editing, barriers to editing, and the biases behind article creation. The program presented a coldly logical take on the limits of crowd-sourced knowledge; speakers queried the site's trust in procedures over people, and the excessive lists of rules which put new editors off (one of which confusingly informs users, "If a rule prevents you from improving or maintaining Wikipedia, ignore it").
It was perhaps only to be expected, then, that the culminating succession of main stage talks appeared to contradict each other in some ways.
Positioning Wikipedia as an outlet for disenfranchised voters, Miller's talk made a salient point: People now trust politicians less than estate agents and bankers, and notably trust Wikipedia more than the news. The ability to edit a worldwide encyclopaedia is undeniably empowering, allowing the average internet user to play a role in the writing of history.
But who might that "average internet user" actually be, and could a global medium really effect change on a local level?
One problem is that "Wikipolitics," as a different concept, already exists: it's the term Wikipedians use for the hierarchy of editors granted blocking and deletion rights. That speaks to a more general confusion about the significance of Wikipedia: the view that it is an objective reflection of public opinion.
Miller accorded Wikipedia the power of consensus over and over ("Consensus is truth; consensus is fact") but never quite explained how that consensus would be measured. Would each Wikipedia page be allowed a limited number of edits before consensus was declared? Would users be issued with web IDs, or geo-tagged to ensure their views reached local politicians? And exactly how would this consensus, once observed, be incorporated into political discussion and action?
Miller pitched Wikipedia as "magical alchemy," a public portal that could illuminate the ambiguous "dark and murky space" that politicians move in. But some of his ideas were challenged by the next speaker to take the stage: researcher, activist and entrepreneur Heather Ford, who spoke on Wikipedia as a source for breaking news.
Though Ford spoke of Wikipedia's role in breaking stories such as the 2011 Egyptian revolution, she also highlighted how the site is "like everything, a product, and it mirrors the biases of the components which produce it." Wikipedia famously bans original research, relying on mainstream media coverage to confirm that events have really happened. What happens when a government controls local media in a country whose citizens are less accustomed to getting their news online? Meaning and truth end up lost amid conjecture between editors on the other side of the world.
Ford's talk also raised issues with the prospect of a "global media," one by no means omniscient or without its own prejudice. The spare, brutally functional design of a wiki gives the impression of objectivity, but Wikipedia is very much the sum of its parts.
Editorship is biased, famously by gender but also by education (according to Wikipedia's own stats, around 23 percent of editors have completed third-level education, while 26 percent are current undergraduates). Perhaps most interestingly, 13 percent of Wikipedia editors are aged 17 or under, and so not yet legally entitled to vote. By anchoring its definitions in outside links, Wikipedia risks being smothered by its own filter bubble. So why, of all the sources presented by the internet age, would we want politicians to use Wikipedia for insight?
Away from the main stage and earlier in the program, a talk given by Dariusz Jemielniak, associate professor of management at Kozminski University in Poland, explored one possible solution. Author of the first ethnography of Wikipedia's users, Jemielniak considered the idea of introducing "expert opinion" sections to Wikipedia, a kind of "verified tweets" status for editors. In a previous piece for the Daily Dot, Jemielniak also brought up German Wikipedia's introduction of a company registry system, where corporations are allowed to make edits as long as their identity is flagged.
The idea of puncturing Wikipedia's objective tone with clearly marked opinions is a radical one, but if a real consensus is ever to be reached, it will necessitate Wikipedia coming to terms with uncomfortable truths. In Jemielniak's words, "I honestly believe an expert could not win a debate on a top-level Wikipedia page," so why not at least distinguish the expert from the novice, and let readers make up their own minds about whom to believe?
As it currently stands, Wikipedia cannot be read as "truth" or "consensus," but rather as an aggregator and a gauge of debate. On Wikipedia, consensus rarely stays that way for long, and both Ford and Jemielniak stressed that for all its sense of community, the site is ultimately built upon distrust between its members.
Its strength lies in its constant evolution: Behind the scenes in the page edits, Wikipedians consistently question what the site presents as truth. And that's exactly what its readers should do too.