Facebook trains its moderators to recognize when emojis violate the social network's policies as well, including those on sexual solicitation, hate speech, and bullying.
A single emoji can make all the difference between an innocent joke and a hateful message. With that in mind, Facebook trains its moderators in how a slew of different emojis can violate any number of the social network giant’s content policies, including poops for hate speech, vomit faces for bullying, and crowns for sexual solicitation, according to internal Facebook documents obtained by Motherboard.
The documents show the constantly evolving arena of content moderation, with phrases, fictional characters, or in this case, emojis, potentially taking on new meanings that platforms then have to tackle.
“How to action on Emojis,” a document obtained by Motherboard reads. “Use additional context to determine if the emoji is used in a violating manner,” it adds, before including a table detailing how different emojis may be used in an offensive or troubling context.
The list itself is fairly expansive, covering emojis that may be indicators of praise, support, or promotion—positions that Facebook bans for things like celebrating crime, for example—such as the heart-eyes emoji, the 100 emoji, or clapping.
Other emojis, including a middle finger, water pistol (after platforms and devices moved away from a more realistic handgun), and knife, can fall under calls for action and Facebook's "credible violence" policy. National flags can violate Facebook's hate speech policies if used in a certain way, as can pictures of animals in a "dehumanizing comparison," according to the document. This also includes the poop emoji, which can sometimes qualify as hate speech.
With this table, it seems Facebook has somewhat officially weighed in that, yes, eggplant and peach are sexual emojis: both are included in the section for “sexualised text,” according to the document obtained by Motherboard.
The document explains that the table is not meant to be exhaustive, either, likely meaning moderators are expected to use their own judgment or knowledge of the particular areas they cover to flag any additional violating emojis. (In training videos obtained by Motherboard that discuss content moderation more generally, mods are told to exercise awareness of their own market, such as a specific geographic region.) Facebook also recommends that moderators submit an emoji to be considered for addition to the list by contacting the Content Policy department, according to the document.
Facebook has introduced similar sorts of approaches for particular images. The company recently broke with its own policy of allowing fictional characters when it banned certain hateful depictions of Pepe the Frog.