A.I.-based virtual assistants with female voices reinforce sexist gender stereotypes, according to a new United Nations report.
Amazon’s Alexa, Apple’s Siri, Microsoft’s Cortana, and other A.I. assistants with default female names and voices send the message that women are “docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK,’” the study by the UN Educational, Scientific, and Cultural Organization claims. “The assistant holds no power of agency beyond what the commander asks of it. It honors commands and responds to queries regardless of their tone or hostility.”
Amazon has said it chose a female voice for Alexa because research suggested customers would find it more sympathetic. “We carried out research and found that a woman’s voice is more ‘sympathetic’ and better received,” said Daniel Rausch, the head of Amazon’s “Smart Home” division.
“Obedient and obliging machines that pretend to be women are entering our homes, cars and offices,” said Saniye Gülser Corat, UNESCO’s director for gender equality. “Their hardwired subservience influences how people speak to female voices and models how women respond to requests and express themselves. To change course, we need to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”
The UN report recommended several steps to combat the alleged bias, including ending the practice of making digital assistants female by default, programming them to push back against sexist or abusive commands rather than respond deferentially, and giving women more opportunities in the tech industry and a seat at the table in developing new technologies.