
Will Europe’s ‘Right to Be Forgotten’ Actually Hurt the Web? We Asked an Expert

"I think the ruling is quite devastating."
Image: Robert Scoble/Flickr

Original post from Motherboard's Dutch office.

Last week, the European Court of Justice ruled that Google must, in some cases, delete search results about people who find them embarrassing—the "right to be forgotten." It unleashed a tempest on the web, with folks all but rationing food and warm clothes in anticipation of a Googleless world.

But are we really heading towards the end of digital times? To find out, I talked to Joris van Hoboken, a Dutch expert in information law working at New York University, about the right to be forgotten and freedom of speech.

Motherboard: Are you worried about the court ruling?

Van Hoboken: I don’t think this was a good decision. If you take a look at the specifics of this case, something strange is going on.

The pivotal question is the role and responsibilities that are attributed to Google, or any other search engine for that matter. Search engines give a sharp edge to what is public on the internet, and the fact is that a lot of information is publicly available. A consequence of being able to search very accurately is that it’s easy to compile a profile of anyone.

In this legal case, the question is: If it’s known that the source is legal, should the search engine be held responsible for deleting the link from its results? I have always answered this question negatively; I don’t think that’s sensible at all. Although that doesn’t mean that a search engine is a neutral intermediary, because they’re obviously not.

So you agree with the court that Google has some responsibilities towards these kinds of requests?

Yes, I agree, and I understand the idea of wanting to be able to regulate search engines more. In the current environment you could solve most of the problem by making information unavailable through Google searches. When an illegal link is not indexed by a search engine, it practically doesn’t exist.

Of course you have to be careful with this, and should always try to tackle a problem at the source, but a search engine does have some kind of derivative responsibility. As long as there’s nothing wrong with the source, though, a search engine should not be expected to delete anything.

The fact that that’s going to happen now is a disaster, according to some people.

Indeed. By making Google filter something because someone doesn’t like the disclosure of public information, you’re compromising the integrity of search engines. Moreover, you’re also restricting the freedom of speech in a couple of crucial ways.

We’re talking about freedom of speech in its broadest form, including freedom of communication and the right to inform yourself about other people. I think the European Court did not strike the right balance between the importance of privacy and the importance of freedom of information.

"The people newspapers write about often don’t agree with that. It’s a really big problem if the law tries to prevent those kinds of problems."

On the other hand, people are very worried about the violation of privacy, especially by technological development and companies like Google.

The web is full of legal issues. And that’s a direct consequence of the fact that people can publish things themselves without having the information checked by an editorial office. Quite a bit of what’s published on social media, and especially pictures, is just plainly illegal in Europe. Small and large violations of privacy are commonplace.

An important question is what role the law can play in striking that balance. I think it’s dangerous to take a shortcut by making Google responsible for these kinds of things. I think it’s inevitable that there will always be sharp edges on the internet. Nonetheless, I think it would be fine to check whether there are some specific cases in which a search engine should delete results.

Such as?

These conditions already exist; in special cases involving children, for example, results are already being altered or anonymized. Under existing laws, citizens of the European Union already have quite a lot of options to have search results removed.

In the US, that’s different. Over there, it’s almost impossible to have things removed from the internet, even if a site wants to publish your mugshots regardless of whether you’re in prison or not. A company could offer to take them off if you pay them $500. In the US this is a lot more intense, which explains why some people there are applauding this verdict.

So the context is also very important. But to place the responsibility squarely with Google is not smart. I also think it’s not smart to decide this on a European level. In Scandinavia it’s totally normal that your tax data are public, but ask an Italian what he makes and he’ll go nuts. They should let only the Spaniards decide what to do with this matter.

Should we expect that this ruling will result in a dramatic change in the way the internet and search engines work?

I wouldn’t say this ruling will cause a huge drama, but it’s a push in the wrong direction. It places a skewed focus on the role of search engines, and moreover it’s important to acknowledge that search engines deserve some kind of protection because they fulfill a very special function.

The only problem is that there’s only one dominant party on the European web. And, ironically, now that all the pressure is put on Google to filter stuff, they gain even more power. If Google is always the one that has to decide what is or isn’t public information, and accessible through their search engine, then Google becomes even more the editor. I think there are already enough legitimate concerns about Google’s cultural influence, but this ruling doesn’t take away any of those. It actually increases them.

So the aggressive reactions are legit.

I think that many people already feel that it’s kind of fishy when government agencies can decide to make stuff less findable. But many people will also react positively, especially under the pretext of the right to be forgotten.

Don’t they have a point?

I think the debate around the right to be forgotten harbors a very moralistic tone, and that it also has a disciplinary effect on society, one that I do not agree with. That doesn’t mean that there are no cases in which it’s just to delete things, only that the gist of the debate revolves around a fear of vulnerability with respect to visibility.

I think that’s the wrong attitude to have. Conversations in bars between people who are worried about how they’re going to find a job when there are embarrassing pictures of them on the internet are saturated with that disciplinary effect. That also leads to risk-averse behaviour in young people, even though when you are young you should be able to do things, including stupid or embarrassing things.

But currently, those kinds of pictures do remain available and retrievable indefinitely.

Everyone makes mistakes; that’s just normal. I think that forgiveness toward old information is a lot more important than the right to be forgotten.

"It’s also partly a problem that’s being projected even though it doesn’t exist."

It’s crucial to have the right to be vulnerable, to show things about yourself without it being abused. But it’s also partly a problem that’s being projected even though it doesn’t exist. People somehow think that if something ends up on the internet, it will be there forever. That’s just complete nonsense. The digital environment is extremely dynamic and suffers from severe amnesia.

I also don’t think it’s right to propose that there shouldn’t be any conflicts about what information is published. The people newspapers write about often don’t agree with that. It’s a really big problem if the law tries to prevent those kinds of conflicts. That shouldn’t happen.

Do you think the ruling is dumb?

I think the ruling is quite devastating in how the Court interprets freedom of speech. They give the impression that they don’t want to discuss that, as if they assume that everything is fine, and that is very troubling.

It’s a discussion we definitely should have, and this ruling has a catalyzing role in it, but pretending to solve a problem by making search engines filter information is not smart.