Image: Evelyn Hockstein/For The Washington Post via Getty Images

These Are Facebook’s Policies for Moderating White Supremacy and Hate

As hate speech continues to be a top issue for social media platforms, we are publishing an extended selection of training material showing how Facebook sees the issue of white supremacy and hate more generally.

May 29 2018, 5:36pm


This piece is part of an ongoing Motherboard series on Facebook's content moderation strategies. You can read the rest of the coverage here.

Over the past few weeks, Motherboard has been reporting on recent training material for Facebook moderators—the workers tasked with keeping terrorism, sexual abuse, and other offending content off of the platform. Some material showed how Facebook had something of an internal reckoning around American hate speech in particular after the events of Charlottesville, in which a white supremacist killed a counter-protester with a vehicle.

Now, to provide more context around how Facebook decides what sort of hate-related content is allowed on its platform, Motherboard is publishing an extended selection of recent training materials. These include additional details on how Facebook differentiates between white supremacy, separatism, and nationalism—classifications that some experts say are really the same thing—as well as hate figures and hate speech more generally. The Guardian previously published documents related to hate speech moderation, some of which had a particular focus on immigrants.

Facebook has published a skeleton of its policies around hate speech, but these leaked documents include details at a much more granular level. To know whether Facebook is following its own policies, or even what Facebook considers an offending piece of hate speech, more information is required than what Facebook has published itself. To be clear, however, the documents do not necessarily present a full picture of how a moderator should handle every instance of hate speech: in videos obtained by Motherboard, trainers reiterated certain points when asked questions by the audience.

Rather than publishing the original materials themselves, Motherboard has reconstructed the content of the slides to remove identifying details. The language remains intact. Additional context added by Motherboard is in square brackets.

"Our policies against organised hate groups and individuals are longstanding and explicit—we don't allow these groups to maintain a presence on Facebook because we don't want to be a platform for hate. Using a combination of technology and people we work aggressively to root out extremist content and hate organisations from our platform," Facebook previously told Motherboard in a statement.

Got a tip? You can contact this reporter securely on Signal on +44 20 8133 5190, OTR chat on jfcox@jabber.ccc.de, or email joseph.cox@vice.com.
