Is your smartphone “sexist”? The rise of voice assistant technology—and longstanding gender disparities within the tech industry—means assistants like Siri, Alexa, and Cortana are under a microscope due to claims they promote gender stereotypes and encourage users to treat women as subservient. But does this tech promote gender bias—or just reflect it?
Criticisms of gendered voice assistants arose alongside the much-publicized launches of the proprietary voices, many of them female, that accompany smartphones and other technology. And outcry has only grown as voice assistants have proliferated: 8.4 billion voice assistant devices are projected to be in use worldwide by the end of 2024.
Why the objections? “Obedient and obliging machines that pretend to be women are entering our homes, cars and offices,” said Saniye Gülser Corat, director of gender equality at UNESCO, in a statement accompanying a 2019 UNESCO report on voice assistants and gender.
“Their hardwired subservience influences how people speak to female voices and models how women respond to requests and express themselves.”
The UN agency reports that “with rare exception,” voice assistants are intentionally female—a choice tech companies say is driven by a market that trusts and prefers female voices.
Snap decisions
“Voice is super-complex—it’s very multi-dimensional,” says Naim Zirau, who co-authored a 2021 study on voice assistants, gender, and pitch while working as a researcher at the University of St. Gallen in Switzerland. Factors like age, the listener’s gender and other demographics, and their understanding of the voice assistant’s task all appear to play a part in how we react to a gendered voice.
Zirau’s team developed a voice interface and asked subjects to book a flight or take a financial survey with the help of voices of varying pitch, a trait long associated with sex and gender because hormones typically make male voices lower than female ones. The team found that listeners made instantaneous judgments about the gender of a virtual assistant the moment they heard each voice, and that simply hearing a voice they associated with a particular gender led them to make stereotypical, gendered assumptions about the computerized assistant in question.
Users were more likely to attribute stereotypes like “delicate” and “empathetic” to female-coded voices, while male-coded voices sounded “dominant.” Moreover, participants consistently assigned a gender to a gender-neutral voice, even when given the option to say they were undecided. “People have a very dualistic perception of gender,” says Zirau.
Some researchers say we can judge a speaker’s gender in as little as five seconds, yet our preferences are often contradictory. One 2020 survey, for example, found that though a majority of both men and women said they wanted to hear a female voice on their smart speaker, local accents played a significant role in determining a speaker’s credibility regardless of gender. And other research shows that people’s trust in differently gendered voices varies across social contexts.
‘Women-are-wonderful’ effect
Why is it so compelling to code a computerized voice as male or female? Chris Mayhorn, head of North Carolina State University’s psychology department, says social norms—and a longstanding tendency to anthropomorphize machines—are to blame. “When people hear a voice, they end up almost automatically using social norms,” he explains, including the gender binary.
Mayhorn co-authored a recent study that examined the influence of perceived gender on participants’ perceptions of voice assistants. Overall, participants trusted a female voice more for medication advice and perceived female voices as “more benevolent” than male ones, a common social bias sometimes called the “women-are-wonderful effect.”
It isn’t that people assume computers or voice assistants are actual humans with genders, says Mayhorn. Rather, it’s evidence of how humans unconsciously bring their own cultural biases to the table, treating computers as social beings and seeing them through a gendered lens.
Tech companies’ marketing and user interfaces have reinforced that gendered lens: most popular voice assistants were given feminine-sounding names and speak with voices based on recordings of female voice actors. Though many tech companies have since removed markers like “female” and “male” from their voice options, most popular voice assistants were initially identified as female within their operating systems, referred to as female by company spokespeople, and even programmed to respond to gendered harassment with flirtatious comments.
“For our objectives—building a helpful, supportive, trustworthy assistant—a female voice was the stronger choice,” a Microsoft spokeswoman told the Wall Street Journal in 2017. Other companies have been more evasive about their reasons for choosing female-coded voices to do things like scheduling, handling correspondence, and sending reminders: all tasks that are overwhelmingly coded as female in the home and workplace.
“In the end, what people look for is what increases engagement,” says Zirau, who now works as a senior AI engineer at IBM.
A gender-neutral future?
Could the future hold fewer female-coded voices and more gender-neutral ones? Maybe. Though most voice assistant systems now offer both male- and female-sounding voices, gender-neutral voices remain rare despite multiple proposals for genderless voice concepts. One outlier is Apple, which now labels voices generically instead of by gender and offers a gender-neutral Siri voice recorded by a member of the LGBTQ+ community. That voice isn’t the phone’s default, however, and psychological research on how users react to and interact with seemingly genderless voices is still in its infancy.
What is clear is that intelligent voice assistants, still a novelty just a decade ago, are more popular than ever. In 2022, an estimated 142 million people in the U.S. used voice assistants, and that number is projected to reach 157.1 million, nearly half of the U.S. population, by 2026. The COVID-19 pandemic and the ever-increasing use of smartphones help explain the assistants’ dizzying rise. And Mayhorn says his team’s research shows that older adults are also more engaged with voice assistants than ever before. “A lot of them have more experience with voice assistance than young people in college,” he says. “That tells me we’re starting to see penetration of the technology in households and that people are starting to understand [its] value and worth.”
Given the technology’s seeming saturation (nearly half of all Americans are projected to use a voice assistant by 2026), it’s easy to forget that voice assistants are still relatively new, with the first wide-scale launch occurring just 13 years ago. That means there’s still time for companies and the public to reckon with the potential consequences of gender stereotyping in voice assistants. Increasing gender diversity within the tech space could help, too. But only time will tell if the expectations set by Siri, Alexa, and their female-coded counterparts, fueled by a society that can’t seem to quit the gender binary, can ever be shaken.