Europe Needs to Save Itself From Internet Upload Filters
A draft law currently making its way through the EU lawmaking process would impose filters that detect copyright infringement.
by Julia Reda
Nov 30 2017, 5:00pm
Julia Reda is a German member of the European Parliament. She’s part of the Pirate Party Germany, The Greens-European Free Alliance, and president of the Young Pirates of Europe.
The internet has radically democratized creativity. What previously required special recording equipment, contracts with distributors, and a major financial investment is now accessible to everyone for free: The opportunity to reach a global audience with your work.
While corporate broadcasters have seized on the internet's advantages as a distribution medium for their content, the much more fundamental impact on our society is that it has allowed everyone to become a broadcaster themselves on an equal footing. But that equal footing is now at risk.
For the past year, a fight has been brewing in Brussels. A draft law currently making its way through the EU lawmaking process would force all internet platforms where you can upload media to install so-called upload filters. Their job: To surveil all your contributions on behalf of big media companies, trying to detect copyright infringement and stop it before it ever sees the light of day.
Every video you upload of your cat doing that adorable thing that it does would then by law need to be automatically checked against Taylor Swift's and Justin Bieber's body of work, to make sure your cat wasn't illegally meowing a copyrighted melody.
YouTube already has such a filter. It was put in place several years ago in an effort to appease record labels. As almost any YouTuber can tell you, “Content ID” has been a constant source of pain for individual creators. Regularly, perfectly legal uploads get taken down or “demonetized” (depriving creators of advertising income). Creators then need to go through an arduous process to fight for their rights and get their videos reinstated.
In the court of the upload filter, uploaders are guilty until proven innocent. Corporate giants provide the “Wanted” list of media that filters must look out for. Platforms blindly trust this input, as well as the technology that scans your uploads looking for any signs of it. What hasn't cleared the filters can't be seen. In effect, your freedom of expression has become subject to corporate approval.
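The block-by-default matching described above can be sketched in a few lines of Python. This is a deliberately crude illustration, not Content ID's actual algorithm: exact hashing stands in for the fuzzy acoustic and video fingerprints real systems use, and all names and data are made up. The key point it demonstrates is that the filter only asks "does this match the reference list?", never "is this use legal?":

```python
import hashlib

def fingerprint(chunk: bytes) -> str:
    """Reduce a chunk of media to a short identifier.
    (Real filters use fuzzy perceptual fingerprints, not exact hashes.)"""
    return hashlib.sha256(chunk).hexdigest()[:16]

# The "Wanted" list: reference fingerprints supplied by rightsholders.
# The platform trusts this input blindly.
REFERENCE_DB = {fingerprint(b"copyrighted-melody-snippet")}

def filter_upload(upload: bytes, chunk_size: int = 32) -> str:
    """Scan an upload chunk by chunk; block if ANY chunk matches."""
    for i in range(0, len(upload), chunk_size):
        if fingerprint(upload[i:i + chunk_size]) in REFERENCE_DB:
            # Blocked before publication -- the uploader must dispute
            # the takedown afterwards to prove the use was legal.
            return "BLOCKED"
    return "PUBLISHED"
```

Note that this logic would block a legal five-second quotation or parody exactly as readily as wholesale piracy: the match test carries no notion of context, exceptions, or limitations.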
Even if copyright infringement were as widespread and as harmful as media giants would like to make us believe—and it's not—that measure would be completely disproportionate. In addition, there are two glaringly obvious pitfalls: Filters can't tell what's legal from what's illegal. And filters fail.
It's legal to make parodies, to quote works, to use copyrighted content in educational contexts and so on. These exceptions and limitations to copyright enable the essential human right of cultural participation. But filtering technology can't tell whether your use of, for example, a few seconds of music is legal under an exception, or just an illegal copy.
When in doubt, block it: That will be the principle of upload filters. Platforms will configure their filters to err on the side of taking down your uploads. They're more afraid of media companies' armies of lawyers than of you and me claiming our human rights. In this way, upload filters tip the balance of power all the way to big business.
If you thought I was joking earlier about the cat meowing a song, well, reality is stranger than fiction: YouTube’s filter once claimed that a 12-second recording of a purring cat contained works copyrighted by a major record label. And that's after Google had spent $60 million on the filter's development—now imagine how often cheaper filters built by smaller companies will get it wrong!
In fact, small platforms may have no other choice but to outsource their upload filtering to Google—then we'll truly have created one big centralized “censorship machine” that undermines everything the web once stood for.
European internet platforms and apps that can afford neither the licensing nor the development costs for upload filters will need to shut down. It will become even harder for EU companies to compete with the dominant US platforms, and innovation will be discouraged.
Unfair, prone to malfunction, a restriction of human rights, harmful for European innovation: Upload filters are a bad idea in so many ways. Human rights organisations and independent academics have been urging politicians to drop these plans for months now. But whether their warnings are being heard is unclear.
Last week, the Civil Liberties Committee of the European Parliament voted to remove automated upload filters from the copyright reform law. That's good news. But other committees had previously come out in their favour, and the most important decision is yet to come: The Legal Affairs Committee, in charge of this law, is scheduled to vote in January 2018. Whether it supports or opposes upload filters will come down to every single member's vote. Meanwhile, several national governments of EU member states are staunchly defending these filters in the Council, which holds just as much sway in the lawmaking process as the Parliament.
Upload filters would chip away at the internet's role as a public space for everyone. The internet would increasingly come to resemble cable TV, where it's up to a few big companies to decide what goes on air.
It's time to speak up against these plans—while we can still do so unfiltered.