When Algorithms Are Sexist
Image: Jeff Carson/Flickr


We’re sexist to the robots, and so they’re sexist right back at us.

Dr. Louise Selby is a paediatrician. She's also a woman. According to the computer system at the gym she uses, that's not possible.

Selby told the Cambridge News that she found herself locked out of the women's changing rooms at PureGym in Cambridge, even though she had an access code. It turned out that she couldn't enter because she had given her title as "Dr"—and so was automatically registered as male.


In response to a tweet from Selby, the company wrote that it was "a glitch in our system," and Selby later tweeted that she had received an apology.

It's not the only algorithmic "glitch" to result in sexist outputs. Algorithms are not always neutral. They're built by humans, and used by humans, and our biases rub off on the technology. Code can discriminate.

In the case of PureGym, it looks like the sexist "glitch" was part of the system's design; perhaps whoever put it together assumed that being a doctor and being a woman were mutually exclusive. (I reached out to ask PureGym more about the systems they use, but they had not responded by the time of publication.)

"Algorithms are designed by humans and humans, whether male or female, can be sexist, whether consciously or subconsciously."

Other algorithms can make similar stereotyped assumptions about your gender based on your apparently sex-specific interests. In 2012, Jill Pantozzi, editor-in-chief of The Mary Sue, reported that Google thought she and some of her female coworkers were men. According to the "ads preferences" pages for their profiles—which infer demographic information such as age and gender in order to target ads—they were 25 to 34 years old and male.

"Based on the websites you've visited, we think you're interested in topics that mostly interest men," the pages explained. The women's reported interests included categories such as "computers & electronics."​


Even "neutral" algorithms—not designed to discriminate—can make sweeping generalisations based on gender. It may be true that the majority of people searching for "computers & electronics" on Google are male, but that doesn't make the output any less offensive for the many women who share those interests.

And overtones of sexism in an algorithm are not always a "glitch." Programmer and blogger Tom Morris found blatantly sexist and racist slurs in the actual code of projects on the open-source community GitHub.

One common algorithm is often explained with a sexist example: the "stable matching," or, as the explainer often goes, "stable marriage" problem. This problem involves matching each element of one set to a partner in another set, with a matching considered "stable" if no two elements from opposite sets would both prefer each other over their assigned partners.

A solution is offered by the Gale-Shapley algorithm, and is frequently illustrated in terms of marriage proposals: n men rank n women in order of preference and each proposes to the woman he likes best. She provisionally accepts, and each unengaged man keeps proposing to the highest-ranked woman he hasn't yet approached; if a woman likes a new suitor more than her current fiancé, she swaps.

By the end, each couple is matched up so that no man and woman both prefer each other over their actual matches.

Glencora Borradaile, a computer scientist at Oregon State University, wrote a 2011 blog post condemning the sexist nature of this example, which is often used to teach students. First up, the model is obviously very heteronormative (only men can marry women and vice versa) and old-fashioned (only men can do the proposing).


What's more, Borradaile explained to me over email, is that the solution always "results in one side getting the shaft." Guess which side that is. As the initiators approach their top match first, they have an advantage. Borradaile said that when the marriage example is used, "the way that every treatment I have ever seen present it, the men get their top choice and the women get their bottom choice."

Of course, the algorithm isn't usually used to actually match men and women for marriage (at least I hope it never has been); it's often applied to match medical students to hospital placements. But it demonstrates how algorithms that effectively solve problems can nevertheless have inadvertently discriminatory results. When I asked Borradaile if she thought algorithms could be sexist, she agreed.

"Sure. Algorithms are designed by humans and humans, whether male or female, can be sexist, whether consciously or subconsciously, and so the algorithms humans design can have sexist implications, whether intended or not," she said.

She'd like to see stable matching taught with a gender-less example, like matching bees to flowers or candidates to jobs.

Sometimes discrimination doesn't come from the initial assumptions of a problem or the sexist slip of a designer; it's picked up by the algorithm thanks to the data it learns from.

The most well-known example is Google's search autocomplete. Here, sexist overtones can be so extreme that in 2013 the UN ran a campaign using real suggested search terms to highlight gender inequality. One example shows suggestions to complete the phrase "women shouldn't": "have rights;" "vote;" "work;" "box."

Advertisement

These suggestions are based on factors including popular searches by other people and your own past searches; Google adds that they are "not statements by other people or Google about the terms" and offers a tool to report offensive suggestions.

The algorithm itself didn't start off as sexist. Its outputs reflect the underlying sexist tone of our own searches. You can see the same in things like Reddit's algorithm, which makes more popular content more visible—sexist and racist crap that ends up on the front page is just a reflection of users' own voting data.
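To make that mechanism concrete, here is a toy sketch, with an invented query log and no resemblance to Google's or Reddit's actual systems: a completer that ranks endings purely by how often they appear in past searches will surface whatever bias dominates its input.

```python
# A toy autocomplete (invented data, not any real search engine's system):
# suggestions are just the most frequent endings seen in past queries,
# so a biased query log produces biased suggestions.
from collections import Counter

past_queries = [
    "women should be respected",
    "women should have equal pay",
    "women should stay home",   # imagined majority of biased searches
    "women should stay home",
    "women should stay home",
]

def suggest(prefix, log, k=3):
    endings = Counter(q[len(prefix):].strip() for q in log if q.startswith(prefix))
    return [ending for ending, _ in endings.most_common(k)]

print(suggest("women should", past_queries))
# ['stay home', 'be respected', 'have equal pay'] -- the most repeated
# (biased) ending ranks first, purely as a reflection of the input data.
```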

I asked Borradaile if there was any way to fix this, and she said that "so long as you are basing results that is derived from a discriminatory pool of data, I don't think there is much hope."

"Perhaps some sort of crowd-sourced 'flagging as inappropriate' system could work, but that will only get at explicit discrimination and unlikely to affect the underlying structural discrimination," she added.

As long as we're sexist, we risk making our technology the same. We're sexist to the robots, and so they're sexist right back at us.

xx is a column about occurrences in the world of tech, science, and the internet that have to do with women. It covers the good, the bad, and the otherwise interesting gender developments in the Motherboard world.