
Why are all AI assistants feminised?

Ever wondered why Siri, Alexa, and Google Assistant are all at our beck and call, answering in sweet, dulcet tones? The female gendering of AI technology is pervasive and largely unquestioned – but does it enforce damaging stereotypes?

Rather disconcertingly, AI assistants are now an omnipresent part of day-to-day life.

Whether you use them for weather updates, fun facts, or quick maths equations (guilty), most of us interact with AI on a regular basis. So much so that we often don’t realise how much it frames our lives.

For all their quips and endless pools of knowledge, AI assistants have one more basic – though less recognised – thing in common: they’re all women. At least by default. 

Siri, Alexa, and Google Assistant are at our every beck and call, responding in soft, dulcet tones like digital girlfriends ready to serve us. Call their names and they come running, never grated by our endless demands for information. When you put it like that, it’s pretty disturbing.

Back in 2019, the UN argued that the gendering of AI technology entrenched harmful gender biases. Nothing much has changed since then, but debates around the issue have started to resurface. 


Last month, Chris Baraniuk approached the topic of misogyny and AI once again.

Following the release of ‘No Time to Die’, Baraniuk reflected on the feminisation of James Bond’s many tech assistants, their soft – even sensual – voices serving as an accessory to his uber-masculine, macho character.

With a little digging, however, it turned out that the feminised voice assistants in Bond’s cars weren’t historically accurate. BMW actually recalled female-voiced GPS systems from its cars in the late 1990s, after male German drivers complained they didn’t want to take instructions ‘from a woman’.

Oh, how far we’ve come.

The irony that our AI assistants are now uniformly feminised doesn’t make the BMW incident any more digestible. If anything, it proves we’ve been unable to break from the inherent need to gender things at all. And, more specifically, to relinquish our associations between women and (lack of) power.

When you look at the BMW situation in the context of Siri, it becomes clear that women have specific positions drawn out for them in modern society, positions that are now shaping gender norms in the digital world.

It’s perfectly acceptable to bark orders at a feminised AI system, but to have ‘her’ tell us how to drive a car – a stereotypically ‘male’ domain – is just not on. 

Baraniuk outlines the long and tortuous history of our relationship with digital voices. From aircraft computer systems dubbed ‘Sexy Sally’, to the London Underground announcement system nicknamed ‘Sonya’ by TfL staff members – named as such because it ‘gets onya nerves’ – misogyny has underpinned our relationship to the digital for decades.


The UN’s 2019 report outlined how the mistreatment of feminised AI both encourages and reflects dangerous real-life attitudes toward women. 

‘The subservience of digital voice assistants becomes especially concerning when these machines – anthropomorphised as female by technology companies’ (which, by the way, are often staffed by overwhelmingly male engineering teams) ‘give deflecting, lacklustre, and apologetic responses to verbal sexual harassment’.

This harassment isn’t uncommon, either. A writer for Microsoft’s ‘Cortana’ assistant noted in 2019 that ‘a good chunk of the volume of early-on enquiries’ probed the assistant’s sex life.

Ione Gamble, founder of Polyester Zine, unpicked the feminisation of AI in her latest podcast, ‘The Sleepover Club’.

Referencing the film ‘Her’, in which Joaquin Phoenix’s character Theodore falls in love with his AI assistant Samantha, Gamble stated: ‘it’s interesting that the internet is set up under the male gaze, with women as desirable objects. If a woman is just in a box and will do whatever you say, that’s a bit creepy, no?’

‘As feminists we have the capacity to think of the internet as some kind of utopia’ Ione continued. ‘And perhaps for some of us in smaller communities it was utopia. But really, it has always been stacked against us’.


While our AI interactions remain overtly gendered, Salomé Gómez-Upegui has suggested that the queering of our devices could help to unpick sexist stereotypes. 

Gómez-Upegui cites ‘Q’, introduced as the world’s first ‘genderless AI voice’ at the Smithsonian FUTURES festival in 2021, as a turning point for misogyny in the digital world.

While tech giants like Google and Apple have responded to pushback by adding male voices to their AI rosters, Siri and Google Assistant remain female by default. Besides, it’ll take more than a setting change to uproot the misogynistic gender ideals embedded in our tech systems.

Yolande Strengers, associate professor of human-centred computing at Monash University, argues that removing gender from AI is not the answer, as ‘this oversimplifies the ways in which these devices treat gender, which are not only gendered by voice, but by the types of things they say, their personalities, their form, and their purpose’.

Instead, she argues, we should consider queering the ‘smart wife’ – as she dubs female AI assistants. In this way, our digital systems may finally start to exist in defiance, rather than in absence, of gender stereotypes.
