A recent study by UNESCO, the United Nations cultural body, has called for changes to voice assistants like Siri and Alexa, which use female voices. The use of women’s voices for these assistants, combined with their subservient attitudes, the researchers say, dangerously entrenches existing views about the role of women in the workplace and in society more generally.

The research, entitled “I’d blush if I could” – after the response Siri gives to a particular sexually provocative phrase – notes that the companies that built these systems, such as Apple and Amazon, are overwhelmingly staffed by men.

The problem with this, they say, is that those men have designed systems which “cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation”.

As well as reacting to all manner of rude language and abuse in an infinitely patient manner, the female voice at the other end of our devices, such as phones and smart speakers, is completely “docile” and will do the speaker’s bidding, the study says.

“Because the speech of most voice assistants is female, it sends a signal that women are docile helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’. The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility,” it says.

The group that conducted the research says it did so because voice assistants are “rarely examined through a gender lens”, either by academics or by lawmakers who are attempting to build more equal societies.

The effect

Some observers might say that this is understandable. These are, after all, not real people, just smart systems designed to mimic human interaction in an efficient and useful way.

But it is not, of course, that simple. We live in a highly complex world where everything happens against a backdrop of social interaction, history and political struggle. And right now, the gender debate is one of the most keenly felt of those struggles, particularly in the workplace.

Not only that, the study also notes that these voice assistants are widely used all over the world. Indeed, IT research house Gartner has said that in only a couple of years, some people may be speaking to their voice assistants more than they speak to their spouse, as reported by the BBC.

This means that our interactions with these assistants will have an increasingly powerful effect on our “real-life” interactions. For some, this might mirror the debate a few years ago about the importance of online interactions – for instance, whether online bullying should be treated the same as bullying in real life. The consensus now, of course, is that our online lives are our real lives. The same principle could arguably be applied to voice assistants.

The effect, the researchers say, is that an ingrained bias – perhaps one that a person doesn’t even know they hold – could be reinforced or developed through voice assistants. If a person makes demands of a ‘female’ voice assistant, often opening the conversation with a “blunt” command like ‘OK’, in a way they would never have thought acceptable when speaking to a real-life woman, they could eventually come to see all women as subservient.

As the study puts it, “in many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment”.

The broader issues

The problem, of course, isn’t necessarily with Siri or Alexa – although it should be noted that the study’s authors have called upon tech companies to change the voices so they are no longer female. The real issues here are gender inequality and, perhaps more specifically, the mostly male workforces at big tech companies.

Diversity in tech is now a big issue, and a complicated one at that. Some are starting to realise that the problem is more about changing structural employment and recruitment policies than paying lip service to equality – and this study could be just another example of why that is necessary.

The BBC notes that researchers have attempted to build a voice assistant that uses a gender-neutral voice. It could certainly be a solution, but at the very least it illustrates the lengths the tech industry needs to go to and the things it needs to consider when building products.
