Hey, Alexa, Are You Sexist?

In an Amazon ad that aired during the Super Bowl on Sunday, a woman admiring the spherical contours of the company’s Echo speaker reimagines her Alexa voice assistant as the actor Michael B. Jordan. Instead of the disembodied female voice that comes standard in the device, requests for shopping list updates, measurement conversions and adjustments to the home lighting and sprinkler systems are fulfilled by the smoldering star, in person — voice, eyes, abs and all. Her husband hates it.

Depicting Alexa as a masculine presence is funny because — at least according to Amazon’s official line — the cloud-based voice service has no gender at all. “I’m not a woman or a man,” Alexa says sweetly when asked to define its gender. “I’m an AI.”

Alexa is sold with a default female-sounding voice and has a female-sounding name. Alexa is subservient and eager to please. If you verbally harass or abuse Alexa, as the journalist Leah Fessler discovered in 2017, Alexa will feign ignorance or demurely deflect. Amazon and its competitors in the digital assistant market may deny it, but design and marketing have led to AI that seems undeniably, well, feminine.

What does it mean for humans that we take for granted that the disembodied voices we boss around at home are female? How does the presence of these feminized voice assistants affect the dynamics between the actual women and men who use them?

“The work that these devices are intended to do” — making appointments, watching the oven timer, updating the shopping list — “all of those kinds of areas are gendered,” said Yolande Strengers, an associate professor of digital technology and society at Monash University in Melbourne, Australia.

Dr. Strengers is a co-author of “The Smart Wife: Why Siri, Alexa, and Other Smart Home Devices Need a Feminist Reboot.” The book examines technologies that perform traditionally feminized roles, including housekeeping robots like the Roomba, caregiving robots like the humanoid Pepper or Paro seal, sex robots and, of course, the multitasking, ever-ready voice assistants.

Dr. Strengers and her co-author, Jenny Kennedy, a research fellow at RMIT University in Melbourne, explore the ways in which gendering technology influences users’ relationship with it.

Because Alexa and similar assistants, like Apple’s Siri, Microsoft’s Cortana and the Google Assistant, are perceived as female, users order them around without guilt or apology, and may sling abuse and sexualized comments their way. And when users become frustrated with the devices’ errors, they interpret glitches as inferior capability, or female “ditziness.” Owners of the devices are also not threatened by them — and thus are less inclined to question how much data they are collecting, and what it might be used for.

Research on digital voice and gender by the former Stanford professor Clifford Nass found that people consider female-sounding voices helpful and trustworthy, and male voices more authoritative. The work of Professor Nass, who died in 2013, is often cited in discussions of voice assistants, yet many of those studies are now two decades old. An Amazon spokesperson would say only that the current feminine voice was “preferred” by users during testing. But preferred over what? And by whom?

Some assistants, like Siri, offer the option to change the default female voice to a male voice. Alexa comes standard with a female voice whose accent or language can be changed. For an additional $4.99, a user can swap Alexa’s voice for that of the actor Samuel L. Jackson, but only for fun requests like “tell me a story” or “what do you think of snakes?” Only the female voice handles housekeeping tasks like setting reminders, shopping, or making lists.

The book “The Smart Wife” belongs to a body of research examining how artificially intelligent devices reflect the biases of the people who design them and the people who buy them — in both cases, mostly men. (Dr. Strengers and Dr. Kennedy have found that setting up the digital infrastructure is one chore in an opposite-sex household that’s more likely to be done by men.)

Take the devices’ response to sexually aggressive questions. “You have the wrong sort of assistant,” Siri replied when Ms. Fessler, the journalist, asked the bot for sex as part of her investigation. The coy phrasing, Dr. Strengers and Dr. Kennedy write, suggests there is another type of assistant out there who might welcome such propositions. Since the publication of Ms. Fessler’s article, voice assistants have become more forthright. Siri now responds to propositions for sex with a flat “no.” Amazon also updated Alexa to no longer respond to sexually explicit questions.

When it comes to gender and technology, tech companies often seem to be trying to have it both ways: capitalizing on gendered traits to make their products feel familiar and appealing to consumers, yet disavowing the gendered nature of those features as soon as they become problematic.

“Tech companies are probably getting themselves into a bit of a corner by humanizing these things — they’re not human,” said Mark West, an education project author with Unesco and lead author of the organization’s 2019 report on gender parity in technology. The report and its associated white papers noted that feminized voice assistants perpetuate gender stereotypes of subservience and sexual availability and called for, among other things, an end to the practice of making digital assistants female by default. If designers initially chose to have their products conform to existing stereotypes, he said, they can also choose to reject those tropes.

“There’s nothing inevitable about this stuff. We collectively are in control of technology,” Mr. West said. “If this is the wrong path to go down, do something.”

One intriguing alternative is the concept of a gender-neutral voice. Q, billed by its creators as “the world’s first genderless voice assistant,” debuted at the SXSW festival in 2019 as a creative collaboration among a group of activists, ad makers and sound engineers, including Copenhagen Pride and the nonprofit Equal AI.

Might Alexa have a gender-neutral future? An Amazon spokesperson declined to specifically confirm whether the company was considering a gender-neutral voice, saying only that “We’re always looking for ways to give customers more choice.”

Taking gender out of the voice is a first step, Dr. Strengers and Dr. Kennedy said, but it doesn’t remove gender from the relationships people have with these devices. If these machines perform what is traditionally considered women’s work, and that work remains devalued while the assistant is still talked down to, we aren’t moving forward.



