A Visit to Facebook's Recently Opened Center for Deleting Content
Our tour of one of Germany's new content moderation centers gave us a look at Facebook’s content moderation—and what it means for the people who have to enforce its deletion rules.
Exterior shot of Facebook's new content deletion center in Essen, Germany. The building is located in a commercial area west of the city center. Image: Max Hoppenstedt/Motherboard
A version of this article originally appeared on Motherboard Germany.
For the first time, Facebook granted journalists access to its new center in Essen, Germany, for deleting content from its platform. In the five-story building, more than 400 employees are already deleting comments, photos, and videos that break Facebook’s rules.
During this process, employees sometimes have to review disturbing videos and photos. A large portion of the content they see is either hate speech, spam, or content created by fake accounts, a Facebook spokesperson told me. These employees are essentially taking out the trash created by Facebook's 2 billion monthly active users from across the globe. The content that isn’t allowed on Facebook—according to the platform’s own rules—needs to be reviewed and deleted by the content moderators. This important work is complex and exhausting.
The first employees in Essen started working there in October. At the end of 2017, a total of 500 employees were working in the office building, located in an industrial park a few minutes' walk from the Essen town center. The center is now the second facility of its kind in Germany; the other is located in West Berlin.
What it looks like from the inside
As part of the visit, Facebook provided a glimpse into one of the employee work spaces. According to company employees, this workspace is representative of all the others in the building. The “Floor,” as the employees there call these workspaces, isn't much different from a standard open-plan office. The areas the journalists were allowed to photograph show roughly two dozen employees sitting at their workstations—each with one computer screen.
Facebook didn't allow us to talk to the employees or look at the details on their computer screens during the 30-minute visit. While journalists were there, employees didn't actually review any user-created content. According to a Facebook spokesperson, this was due to security reasons.
Immediately next to the work space, there’s a meeting and hang-out room. Other than that, the facility closely resembles the first German center for deleting content run by Arvato in Berlin.
Who is responsible for the center?
The center is run by the company Competence Call Center (CCC). The employees aren’t hired by Facebook directly, but by CCC after undergoing a multi-layered recruiting process. CCC is among the leading companies in the field of community management—PayPal and eBay, for example, are among its customers—and runs numerous sites both in Germany and internationally.
For the Facebook content deletion center, CCC rented its own office building in Essen. Currently, a number of the building's floors are still occupied by a previous tenant. But that tenant is vacating the building at the end of the year, at which point all the floors will be dedicated to the social network, providing CCC's employees a total office area of 10,000 square meters, plus a specially created “Relaxation Zone” of another 1,000 square meters.
Hate and violence: What content is reviewed?
All content that goes through the center was first reported by Facebook users. CCC’s employees don’t proactively go searching for content that breaches Facebook’s rules. Until now, only hate speech, spam, fake accounts, and explicit photos and videos have been reviewed and deleted in Essen.
What falls into the category of hate speech is defined by Facebook’s internal rules—also known as the Community Standards. We've reported extensively about what types of content are banned by Facebook—how these rules are enforced is tricky to get right, and is the subject of much controversy.
In this case, Facebook is responding to the German Network Enforcement Act (Netzwerkdurchsetzungsgesetz) aimed at hate speech and fake news and which came into effect in October. The regulation, also known colloquially as the “Facebook law,” is intended to force social networks to be more systematic about deleting criminally relevant content. In obvious cases, networks have to react within 24 hours, with violations leading to hefty fines. That said, the law is widely contested. Critics fear that it will lead to the excessive deletion of content, which in extreme cases could infringe upon freedom of speech.
How Facebook intends to prevent wishy-washy deletions
During the press visit, Facebook spokespeople continually emphasized one goal: quality assurance. To them, it’s important to prevent a situation where one employee deletes a post that another employee wouldn't have deleted. A difficult task. After all, content continually crops up that’s a matter of interpretation.
To ensure that all CCC employees interpret and apply the deletion rules in the same way, the company uses a sort of “four-eyes principle”: While the review of content is always carried out by only one employee, “a statistically significant number of reports are reviewed by a second employee at random,” according to a Facebook spokesperson who was present. Every CCC employee is expected to carry out a large number of such four-eye reviews of their colleagues’ decisions each week.
The content that isn't reviewed in Essen
Particularly drastic and sensitive content such as terror propaganda, extreme depictions of violence, or child pornography has yet to be reviewed at Essen, a Facebook spokesperson confirmed to Motherboard. According to a Süddeutsche Zeitung report, this kind of content is processed by the Facebook deletion center in Berlin. CCC employees don't review Facebook's live videos either, and the company told us that for now there's no plan to expand the scope of the content that will be reviewed at the Essen deletion center.
The rules used to determine what does and doesn't get deleted
The rules that the employees in Essen follow were created by a special Facebook department—the so-called Policy Team. Facebook’s Community Operations Team is responsible for enforcing the rules and in turn commissions companies like CCC or Arvato—which have their own review processes and mechanisms in place.
The rules are continually adapted to current social developments. Facebook employees who spoke to us during the visit pointed to the Syrian refugee crisis, for example. Refugees were added to Facebook’s guidelines as a group of people particularly in need of protection—which was not previously the case. The change was made so that hate speech against refugees could be reviewed and deleted.
To make sure Facebook and CCC are on the same page, 10 CCC employees were sent to a six-week training program at Facebook’s European Community Operations headquarters in Dublin. Additionally, during the past few months, Facebook employees were on site in Essen for a total of eight weeks. They plan to return to Essen every month for a couple of weeks to, among other things, review the current implementation of the deletion guidelines.
What do employees earn and what are the working conditions?
The employees’ starting gross rate is €10.50 an hour, and employees in higher positions start out at €15 an hour. There’s extra pay for working nights or on Saturdays. The salaries are above the minimum wage, which in North Rhine-Westphalia is currently €9.10 an hour. All the employees there are currently in full-time positions and have contracts for more than 40 hours a week. Because the facility is occupied around the clock, people work in shifts.
Because some of the content can tend to “be more intense and graphic,” as CCC manager Ulf Herbrechter describes it, there are four psychologists on site in Essen. This was a reaction to the critical discussions that took place at the end of last year in response to Facebook’s first deletion center. Journalists from the Süddeutsche Zeitung reported on the high pressure and the secrecy demands placed on the employees at Arvato’s deletion center in Berlin. A number of Arvato employees, however, pushed back against the criticism. “Everyone I know here is proud of the work they’re doing,” one 38-year-old Arvato employee told WDR, for example. “We save lives here.” This was a claim we also heard during our visit to Essen.
Facebook didn’t allow us to speak with the employees during our visit about concrete cases they were working on.
How much content gets deleted in Germany?
Facebook hasn’t provided more concrete numbers on this. In the summer of 2017, however, Richard Allan, Facebook’s European Vice President for Public Policy, claimed that somewhere around 15,000 posts were deleted in just one month due to hate speech originating in Germany. There is, however, significantly more content that is reported—Facebook spoke of numbers in the billions. But that figure isn’t limited to hate speech alone; it includes all content that could breach Facebook’s Community Standards. Facebook couldn’t say what portion of that content came from Germany.
It’s not just the sheer mass of deletions and reports, but also a glance at the processes on site, that reveals how complex a challenge content moderation is, even for Facebook. After our visit, a number of questions remained unanswered. Facebook is trying, particularly in Germany, to be transparent, but the specifics of the exact rules that the employees use still remain a secret.
One thing is very clear: Content moderators, such as those in Essen and Berlin, will be in increasing demand in the future. Mark Zuckerberg recently announced that Facebook intended to increase the number of content moderators worldwide from 10,000 to 20,000. Companies such as Arvato and CCC could therefore likely soon receive more contracts.