

    Image: aastock/Shutterstock

    We're Sexist Toward Robots

    Written by Victoria Turk

    Last month, Toshiba unveiled a prototype robot capable of sign language. But the first thing you noticed likely wasn’t its 43 actuators or even its apparently “warm-hearted” personality. The first thing you probably clocked: It’s a girl.

    Called Aiko, the robot has an uncannily realistic face with all the markers we associate with femininity in humans: long hair, a smooth visage, rosy lips. In case you’re in any doubt, it’s dressed in a frilly pink blouse. In Toshiba’s video announcement, this robot is not referred to as an “it.” It’s a “she.”

    Designers have a habit of making robots male or female, even when they’re nowhere near as life-like as Aiko. Popular mini-humanoid Nao is described as “him” by its makers Aldebaran Robotics. A Dalek-style security robot I met had the female name Linda. Boston Dynamics describes its Atlas robot only as “humanoid,” but let’s be real: it’s a man. Even disembodied artificial intelligence can be gendered. Apple’s Siri is presented as male or female depending on which voice you choose.

    And if designers don’t include explicit or implicit gender cues in their robots, users still have a habit of perceiving them as male or female. People give their Roombas gendered pet names. Just as we have a massive proclivity to anthropomorphize the objects around us, so we like to call them he or she—tendencies that aren’t unrelated.

    But what’s weirder than our insistence on assigning gender to non-sentient machines is that we then sometimes treat them differently as a result. We’re sexist to robots. It would be funny in its absurdity, if it didn’t so harshly reflect the prejudices already ingrained in human society, and risk entrenching them even further.

    Take for instance a study published last year that asked participants to interact with a robot security guard, a stereotypically male occupation in the human world. Half of the participants met a robot that was given the typically male name “John,” and the other half met a robot with the typically female name “Joan.” John had a male text-to-speech voice, and Joan a female voice, but otherwise the robot remained identical. After doing some security tasks, like detecting an intruder on CCTV, the participants rated the robots.

    They rated John higher than Joan. He was considered more useful and more acceptable as a security bot than his female twin.

    It’s a curious experiment, and the results are rage-inducingly reminiscent of past findings that have shown a similar gender bias when we judge male and female humans.

    What’s weirder than our insistence on assigning gender to non-sentient machines is that we then sometimes treat them differently as a result

    For example, a 2012 study by researchers at Yale University asked science professors to judge student applications for a position as lab manager (again, a stereotypically male role). Each participant received exactly the same application materials, but some were submitted under a typically male name, and some under a typically female name. The professors rated the female applicant as less competent and less hireable, and also offered her less career mentoring and a significantly lower starting salary.

    These attitudes are pervasive. In that instance, the researchers noted that “female and male faculty were equally likely to exhibit bias against the female student.” In regard to the robot study above, Benedict Tay Tiong Chee, a researcher at Nanyang Technological University in Singapore and one of the authors, confirmed in an email that they also found no real difference between how male and female participants judged the male and female security robots. (Some other studies have found that men and women respond differently to male and female robots.)

    Tay Tiong Chee said they actually expected to observe a difference, as they believed men had more rigid stereotypes than women. “However, the results did not show any strong significant difference between male and female participants in their responses toward stereotypical and non-stereotypical robots,” he said.

    Not all robots are built for stereotypically male domains like John/Joan the security guard. In fact, some of the most obvious environments where robots could be used are associated with “women’s work,” like caring for the sick or elderly, or doing domestic chores.

    Friederike Eyssel, a professor of social psychology at Germany’s Bielefeld University, has done a lot of research into how people interact with robots. In one study, she and her co-author Frank Hegel looked at the effect that facial gender cues had on people’s perception of robots in both typically male and female roles.

    Instead of changing the robot’s voice or name, they used visual cues to suggest its gender. The Flobi robot head, developed at Bielefeld, has a modular design that lets you change its face. In their study, the researchers simply gave Flobi longer hair and plumper lips to make for a female robot and shorter hair and straight lips for a male robot. Otherwise, it was exactly the same.

    Participants were shown Flobi in one of its forms and asked to judge what kind of traits they would attribute to the robot and the extent to which they would use it for different applications.

    Surprise surprise, their judgments fit with the gender stereotypes we see in human interactions. The participants ascribed more male traits related to “agency,” like assertiveness and dominance, to the male version of Flobi, and more female traits related to “communion,” like friendliness and affection, to the female Flobi. (That’s not to say, of course, that all men are actually more agentic and women more communal. But these traits have been identified as stereotypically masculine and feminine.)

    Participants also chose more stereotypically male applications for the male bot, like transporting stuff, repairing things, and guarding a house, and more female applications for the female bot, like childcare, elderly care, and tutoring. The authors pre-tested these applications to check they were associated as male and female, respectively.

    Perceived suitability for gender stereotypical tasks as a function of robot type. Image: Eyssel and Hegel

    Eyssel summed up the results of the work over a Skype call. “As a function of robot appearance, people attribute more gender stereotypical personality traits but also more suitability to gender stereotypical tasks to the respective robots,” she said, and noted that the results are comparable to what’s already observed in human relations.

    I asked her about the implications of this. “To me, I took it as an indication of the fact that really these gender stereotypes are quite deeply ingrained in us,” she said. “We even use our rule-of-thumb characteristic judgment when taking into account robots. We would not think twice, but rather we would make analogous judgements and even gender stereotype robots in that way.”

    So why do we even give a robot a gender in the first place? It’s an inanimate object, a hunk of metal and plastic and electronics. It needs no sex-specific organs. It has neither X nor Y chromosomes.

    Female Flobi. Image: Eyssel and Hegel

    The researchers I spoke to suggested that we might be inclined to assign a gender to robots because it makes them seem more human to us.

    “Maybe it has to do with our tendency to seek or to associate with something that is like us, more humanlike,” said Eyssel. “Since gender is one of the major social categories, it’s something that we intuitively choose.”

    Tay Tiong Chee made a similar suggestion. “I think the main purpose to do so is to leverage on our social knowledge,” he said. “If we apply the gender appropriately it will be transferring a good amount of knowledge we have already known and acquired through social learning.”

    Male Flobi. Image: Eyssel and Hegel

    Gender is a major social cue—one of the first things you notice about a human. And research has shown that when we think robots are more humanlike, we like them more.

    That’s perhaps why we gravitate to robots that are humanoid in the first place; those visual cues could have a similar humanising effect. Gender in robots isn’t always shown physically, but it still suggests the bot shares qualities with our social group.

    Gender is not the only social category we identify in humans, and it appears we apply other stereotypes to robots as well. In another study, Eyssel and her colleague Dieta Kuchenbrandt found that applying social categories like age and ethnicity to a robot can also affect our evaluation of the bot. For their part, Tay Tiong Chee’s team found that personality traits, such as whether a robot is seen as extroverted or introverted, affected its perceived suitability for different occupations even more than gender.

    And exactly how we respond to gender in robots isn’t clear-cut; while the studies above suggested interesting trends, questions remain, and more research is needed. A study led by Kuchenbrandt this year, for instance, found a discrepancy in how people judged male and female robots’ abilities to do stereotypically male and female tasks.

    Regardless of the apparent gender of the robot (which was a Nao humanoid robot with a male or female voice, named Nero and Nera respectively), the participants had more trouble following the robot’s instructions to sort items in a sewing kit than in a tool box.

    Nao robots. Image: Axel Voitier/Flickr

    Kuchenbrandt admitted this could be an issue with the methodology—perhaps people are generally more familiar with a tool box than a sewing kit—but suggested other gender judgments could be at play. “Maybe robots generally are perceived a bit more male,” she speculated. “Maybe we also want to interact with a robot in a male environment?”

    But what does it matter if we treat robots differently according to gender cues? A robot can’t get annoyed at sexism or feel hurt by prejudice. Robots don’t have equal rights.

    Understanding how we relate to different robot attributes could in fact be beneficial as we move forward. If humans respond better to a male security robot, why not make it male? If we accept a care robot more when it’s female, why not make it female? Playing into the stereotypes we already hold could help smooth the transition as we come to use robots more in our daily lives.

    “Considering gender stereotypes in earlier stage of social robot deployment could enhance social integration and acceptance by new users,” said Tay Tiong Chee, the researcher on the security robot paper. “This is very important since everyday people may have a hard time accepting what robots can do and building trust with these ‘machines.’ Putting in appropriate gender could help resolve this issue.”

    But that could have an effect beyond our relationship to robots. As Eyssel suggested, our eagerness to stereotype robots based on gender indicates just how fixed these biases are—and if we purposefully manipulate robots’ genders to pander to existing attitudes around what traits and jobs are suitable for men and women, that could cement them even further.

    This research, however, also raises an enticing opportunity to turn the tables and play against existing stereotypes. Eyssel emphasised that her team has not yet tested the robot in a real-world scenario, such as in an elderly care home, but suggested that there may be a chance to unpick some of our attitudes to gender roles.

    “Implementing a counter-stereotypical example, like having a male carebot or a female-looking mechanic’s support in your workshop—maybe that would help us to undermine the existing social stereotypes,” she said.

    This research, however, also raises an enticing opportunity to turn the tables and play against existing stereotypes

    Whether manufacturers would be swayed from making bots that are currently more accepted by the public in favour of a distant gleam of a more equal future is another question.

    An alternative optimistic outlook is that, in any case, the increased use of robots could help break down gender stereotypes over the long term. Tay Tiong Chee suggested that the lack of biological sex in robots, even if there is robot gender, might help to bridge the social gap between male and female.

    “The prevalence of gender stereotypes is mainly shaped by the biological differences between men and women in physical and cognitive capability (putting aside education and social influence),” he said. “The case, however, may not hold for social robots since they will not be limited by that. In this sense, I also believe that robots could be an excellent medium to promote gender equity in the future.”

    That will also depend largely, of course, on decisions we make regarding our future robo-brethren (and sistren).

    Eyssel and Hegel end their 2012 study with a hopeful vision. They note that in the present, their study participants’ judgments of robots are “grounded in rather gender-stereotypical conceptions about social roles.”

    “Perhaps sometime in the future, however, not only will robots be as technically refined as envisioned, but gender prejudice will also be a thing of the past.”

    But as long as we can’t have even simple robot interactions without falling back on tired gender stereotypes, that “sometime” sounds like it’s a way off. 

    xx is a column about occurrences in the world of tech, science, and the internet that have to do with women. It covers the good, the bad, and the otherwise interesting gender developments in the Motherboard world.