The harmful impact of female virtual assistants

Katie Gibbs
4 min read · Jul 25, 2019

Female virtual assistants have a harmful impact on female empowerment. They are the antithesis of gender equality in the technology industry, and we must demand better from the technology providers developing these systems.

There has been a lot of noise in the media recently about female voice assistants reinforcing gender bias, including a UNESCO report warning that they portray women as submissive and eager to please. The gendering of voice assistants has concerned me since they started arriving on the market, and I make a concerted effort to design conversational AI around a ‘bot personality’ rather than a human persona in order to prevent such gender bias.
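To make that idea concrete, here is a minimal sketch of what designing around a ‘bot personality’ rather than a human persona might look like. The names, fields, and responses are my own illustration (not any vendor's API or my production design): the bot has a product-style name, refers to itself as software, and shuts down flirtatious or abusive input rather than playing along.

```python
from dataclasses import dataclass


@dataclass
class BotPersona:
    """A deliberately non-human, non-gendered persona for a conversational assistant."""

    name: str = "Spark"                # hypothetical product-style name, not a woman's name
    self_reference: str = "software"   # the bot describes itself as software, never he/she
    voice: str = "neutral"             # e.g. a genderless synthetic voice such as Q
    discloses_bot_status: bool = True  # always states up front that it is not a person

    def greeting(self) -> str:
        # Frame the assistant as a tool, not a servant with a gender.
        return f"Hi, I'm {self.name}, an automated assistant. How can I help?"

    def respond_to_harassment(self) -> str:
        # Scripted responses refuse flirtatious or abusive input instead of indulging it.
        return "That isn't something I'll respond to. Let's get back to your request."


if __name__ == "__main__":
    persona = BotPersona()
    print(persona.greeting())
    print(persona.respond_to_harassment())
```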

Research firm Canalys estimates that approximately 100 million smart speakers were sold globally in 2018, and Gartner predicts that by 2020 people will have more conversations with voice assistants than with their other half. As the audience for virtual assistants grows, the gender bias built into them reaches more people, so we cannot wait any longer to take action.

If you’re sat there thinking ‘what’s the big deal?’, let’s take a moment to consider what a voice assistant represents. These systems have been designed to do our bidding at a single command, no questions asked. Representing them as female reinforces some dangerous mindsets and behaviours when it comes to the treatment of women.

One of the issues is that the voices are unmistakably female, and there is no clear reason why this design choice was made. Research into the impact of male and female voices has found that the majority of people prefer interacting with someone who has a low-pitched (that is, typically male) voice rather than a high-pitched one. It’s worth noting that the same research concluded that women with higher-pitched voices are perceived as more attractive, which makes you wonder whether the designers of voice assistants thought this would improve user engagement. Researchers are developing a genderless digital voice called Q, but in the meantime there are actions we can take, from naming conventions to tone of voice, to address the gender imbalance being presented today.

Let’s look at some examples. Alexa, Amazon’s digital assistant, shares its name with an epithet of the Greek goddess Hera, symbolising the home and family and essentially representing domesticity. Apple has Siri, which takes its name from a Nordic term meaning “the beautiful woman that leads you to victory”. Not only were these assistants designed (by predominantly male teams) to symbolise femininity, but their tone of voice was originally flirty and coquettish. In November 2016, if you told Siri “I’m naked”, she would respond with “And here I thought you loved me for my mind. Sigh.” Apple redesigned Siri’s tone of voice in June 2017, a leap forward reflected in the new response to the same remark: “That is both inappropriate and irrelevant”. This shows that Apple recognises it made some poor choices when it designed Siri the first time around. But there is still so much more to do.

Yet this extends beyond the market leaders. Most financial services organisations now have a chatbot to respond to menial queries 24/7. One retail banking giant has a chatbot named Amy, presented alongside a photo of an attractive woman, and Deutsche Bank’s Debbie helps market traders. Such service-based bots tend to be female, whereas intelligence-based bots, such as IBM’s Watson, tend to be male. ING demonstrates this gender divide: Marie, its chatbot for retail customers on Facebook Messenger, was given the name, according to Tim Daniels, a programme manager at ING, “because it conjures up an image of someone who is helpful and friendly”, yet ING’s chatbot for corporate customers is called Bill.

These design choices are harmful because they reiterate that a woman’s place in the business world is as an assistant, so we should be pushing back to make these systems gender neutral. There are no proven benefits to giving a virtual assistant a gendered name or associating it with a photo of a woman. In fact, California has recently introduced a ban on bots pretending to be human: they will have to disclose that they are not human. The intention was to prevent bots from having the widespread impact on elections that they did in 2016, but it is also a positive move for the gendering of such bots, because if a chatbot has to clearly identify itself as software, it cannot pretend to be female or male.

In short, the future of virtual assistants should not be gendered: they should not have programmed responses that reinforce harmful gender stereotypes, and they should not be marketed as tools for female servitude. I’m amazed that it’s continued for so long, as it undermines all the efforts the tech industry is making to drive gender equality. It’s time that we, as consumers, demand a change before female virtual assistants become even more widespread both at home and in business. The future of women in tech is not as assistants, and we will continue working to defy this insulting stereotype.

Originally published at https://www.bjss.com on July 25, 2019.


Katie Gibbs

Founding Partner at Emergence, delivering profound transformation with emerging tech. Passionate about diversity and inclusivity, especially women in tech.