Heavy Censorship in Pakistan Shows Why the Next Arab Spring Won't Be on Facebook
Image: Joelle L/Flickr


Facebook is deleting posts that governments don’t like, even when it may not be legally obligated to do so.

If you tried to access the Facebook page of one of Pakistan's most popular bands, Laal, between June 5 and 7, you were out of luck: Facebook blocked the band's page within the country after receiving a censorship request from the government.

Because it was such a high-profile case of censorship (Laal has more than 400,000 fans on Facebook), the event got international attention. What has flown relatively under the radar, however, is the fact that Facebook censorship in Pakistan has increased astronomically over the last 18 months.


During the second half of 2013, Facebook censored or removed 162 posts from Pakistan, according to Facebook's official transparency report, which outlines government requests for censorship and user data around the world.

What's even more interesting is that, in the first six months of this year, Facebook removed 1,773 posts, roughly a 1,000 percent increase.

A Close Relationship With Government

There have been reports of growing media censorship in Pakistan, which ranks a dismal 158 out of 180 on the World Press Freedom Index. The country has recently seen a rash of anti-government protests as well as sustained tension between the prime minister and military leaders. Both forces have the motivation and means to seek suppression of information in the media.

Facebook says the jump in takedowns was due to an increase in government requests to remove content. "The increase reflects the fact that the Pakistan Telecommunication Authority sent significantly more reports of illegal content over this period," a spokesperson told me.


The company told me that it hasn't changed the way it evaluates reports of illegal content from the government, and that Pakistan's Inter Ministerial Committee for the Evaluation of Websites appears to have been asking the Pakistan Telecommunication Authority (PTA) to be in closer touch with Facebook.

There's still no obvious reason for the sudden increase, however. During the same period, Twitter received just 12 takedown requests from Pakistan, according to its official numbers, and complied with none of them.


So why did Facebook see such a massive increase in requests, while Twitter did not?

Part of the answer may be that Twitter has historically pushed back harder against Pakistani censorship. Over the summer, the company reversed a decision to block "blasphemous tweets" after the government failed to provide "additional clarifying information" for the takedown requests.

Meanwhile, Pakistani government officials haven't been shy about noting Facebook's willingness to work with them to remove "undesirable" content.

Last year, a representative of the PTA told the country's high court that the government has "an existing arrangement" with Facebook that makes it easier for the government to get content it doesn't approve of taken down. Facebook declined to address those comments, but the company maintains it only responds to "valid legal requests," and that each one is "checked for legal sufficiency."

What's a 'Valid Legal Request'?

The trouble with all this is that Facebook obscures the reasoning and mechanism behind content removals.

Facebook says it only removes content after "careful legal review," but it's unclear how much proof is required when a government claims a piece of content is "illegal."

We have no idea what a takedown request from the PTA looks like, or what types of content it's trying to censor. Facebook won't talk about it formally or on the record, and it doesn't publish the actual documents it receives from governments, as Google, Vimeo, and Twitter do. Its transparency report also offers few details.


In fact, there is evidence that Facebook may not be legally obligated to remove anything at all.


Last month, AJURIS Advocates & Corporate Counsel, an independent law firm in Pakistan, conducted a legal review that suggested the country's strict penal code, which prohibits things like reciting obscene songs and making disparaging comments about Mohammed or Islam, probably could be applied online.

The group notes, however, that the penal code would almost certainly only apply to the users actually making the posts, not the company making the posts possible. In other words, Facebook may be censoring content it has no legal obligation to remove.

"What is required in the present circumstances is a declaration from a competent judicial forum in Pakistan… as to whether [the government] is duly authorized to request social media websites such as Facebook and Twitter to block access to pages," the review noted.

Put another way, neither Facebook nor Twitter has pushed back on a content takedown request to the point of establishing a legal precedent on whether social media companies actually have to remove anything.

The review also suggests that companies that provide a service, even Pakistani companies, are not liable for the actions of people who are "not subject to the direction or control of the network service provider." This would appear to protect Facebook from any legal liability if it decided to leave content up, the review notes.


Sana Saleem, director of Bolo Bhi, the Pakistani civil liberties advocacy group that commissioned the legal review, says that, at the moment, Facebook is certainly removing content without court orders: Pakistan's blasphemy laws don't specifically mention online content, and the question of whether existing law applies to the internet has not yet been tested in court.

What you're left with is an opaque system: Are government officials calling Facebook and telling them to remove content? Are low-level local government workers sending Facebook an email and a link to a law, and demanding content comes down? Is there a paper trail at all? The company won't say.

"In the case of Laal, who sing songs from a socialist perspective and write revolutionary poetry, we asked why they took it down and Facebook cited 'anti-state content.' Well, we don't know what laws were cited, what specific instance in the penal code," Saleem said. "It's extremely compliant with the government. They are over complying."

What's Being Lost

So what was in those 1,773 posts that were taken down?

"While we are not yet aware of the scale of this banning and filtering exercise agreement between Facebook and Pakistani Government authorities, we feel that some of the known blocked pages are essential for the promotion of peace, harmony and alternate narratives in the country," BytesforAll, a Pakistani group that fights for civil liberties and free speech, wrote in response to the government representative's comments about having an "existing arrangement" with the company.


Saleem says Facebook has removed content that is critical of the Taliban, which suggests that the company may be taking down content somewhat indiscriminately or could be in contact with people from local governments or the military. "It makes absolutely no sense that the government is even requesting it," she said.

Facebook has also removed a page that listed people who had been killed by terrorist organizations or who appeared on those organizations' lists of targets, she said.

She says that her organization has had recent formal meetings with Facebook in Silicon Valley about being more discerning with what it takes down.

"I don't think they're a company that's ready to push back against the government, but we're seeing what other options they can offer," she said. "Can they not take down every link the government sends? Can they push back on pages that are against the Taliban? We're in constant conversation with them, but the conversation is never 'How can we not take down content or take it down as minimally as possible,' it's 'How can we find a way to do it without angering the government?'"

Why Facebook Won't Push Back

It makes sense that Facebook would want to err on the side of caution when it removes content. Pakistan represents a market of 182 million people, 17.2 million of whom use Facebook.

Considering that the company's largest expansion is occurring outside of North America (and the fact that many people are still getting internet access for the first time in Pakistan), the country offers the potential for plenty of growth.


Though Facebook may not be legally liable for what its users post, if it pisses off the wrong government official, it could be blocked in Pakistan altogether. There's nothing stopping the Pakistani government from completely blocking access to Facebook, something it did for several hours back in 2010. YouTube, meanwhile, has been blocked in the country for almost two years, a fate that Facebook clearly wants to avoid.

"It seems more important to them that their platforms remain available than it is for them to protect free speech," Jillian York, the Electronic Frontier Foundation's director for international freedom of expression, told me. "I don't think they're thinking about what lines they're drawing. They might want to think about taking the long view and take the risk of being blocked."

In a blog post earlier this month, York suggested that Facebook is "complicit" in government censorship, and said that "when a company does not have [a formal office] in a given country and is thus not subject to its censorious laws, we believe that it can and should refuse government censorship requests."


This isn't only a problem in Pakistan, of course, though that's where Facebook censorship increased the most over the last six months. (Government requests for user information are another question altogether, and are particularly dangerous considering that the penalty for posting "blasphemous" content in certain countries is severe.) Facebook censored nearly 5,000 posts in India, 1,893 in Turkey, and a handful in Germany, Russia, France, Australia, Israel, and several other countries.


Serhat Koç, a lawyer in Turkey who fights free speech cases, told me that "Facebook is very close to government" in his country.

"Government officials can call and ask for groups to be closed on Facebook," he said. "We haven't trusted Facebook for four to five years in Turkey. It's an entertainment platform, not one for using freedom of speech rights. I tell my clients: If you write something contrary to the government's opinion on Facebook, you can be tracked easily."

A Protest Tool, or a Toy?

That transition, from a platform for free expression to an entertainment platform, is really the crux of this. Facebook pitches itself as a place where people can speak out against their oppressive governments and organize protests. At a conference in September, Facebook COO Sheryl Sandberg said Facebook played an important role in the Arab Spring revolutions, and that it would continue to do so in the future.

"If you live [under] an oppressive government, you have to choose what you post," Sandberg said, suggesting that activists can set up a Pages account (similar to accounts that businesses have—where people are "liked" instead of "friended") instead of a personal profile. "You can set up a page and say bad things anonymously about your government on that page… A lot of what happened during the Arab Spring was done on these pages. People who want to do it, can."

But if oppressive governments have the ear of Facebook, is the next Arab Spring going to be organized on the social network? Probably not, York says.

"To some extent, Facebook and Twitter pitch themselves as these saviors of the Pakistani people," York said. "But if they don't listen to the needs of their users around the world, they'll move somewhere that does."

So, what happened in Pakistan over the first six months of 2014 that led to a tenfold increase in Facebook takedowns?

Maybe it doesn't really matter. That Pakistan wants to censor content is no surprise; the thrust of the problem is that Facebook appeared willing to let itself get pushed around by a government notorious for censoring its people. Facebook's acquiescence may have emboldened the PTA. To speculate a bit, Twitter's obstinacy (the company has complied with no Pakistani takedown requests) may have convinced censors that there is no point in even asking it to take down content.

"It could be that someone, or a group of users, went crazy on Facebook and just posted hundreds of things," Meg Ambrose, a Georgetown University professor who studies internet governance told me. "It's a question for Facebook."

Well, the company won't publish details about what, exactly, it's removing. So, how transparent is its transparency report, really?