'Hardwired subservience': Siri and Alexa reinforce sexist stereotypes, says UN

A UNESCO report has asked why AI assistants are always female - and what effect this is having on how we speak to women and how we expect them to act.

A UN report has found AI assistants, usually female, are promoting a subservient image of women to tech users around the world.

A UN report has found female AI assistants are reinforcing gender stereotypes and promoting gender-based verbal abuse. 

Be it Apple's Siri, Amazon's Alexa, Microsoft's Cortana, or Google's Assistant, the vast majority of automated assistants have a female voice.

While voice-command technology may be the way of the future, UNESCO said it promotes an image of women from the dark ages to the hundreds of millions of people using the technology.
Amazon's electronic home assistant Alexa is one of the popular voice-operated assistants accused of promoting unhealthy gender stereotypes. Source: SBS
"It sends a signal that women are obliging, docile, and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’," the report states.

"As female digital assistants spread, the frequency and volume of associations between ‘woman’ and ‘assistant’ increase dramatically."

The report found voice command software promotes acceptance of sexual harassment and verbal abuse in how automated assistants respond to their users.
UNESCO's Saniye Gülser Corat said voice assistants like Siri and Alexa are teaching people how to speak to women. Source: YouTube / TEDx Talks
For example, when a user calls Siri a "b---h", she responds "I'd blush if I could". 

When given the same insult, Alexa replies "Well, thanks for the feedback".

It also concluded AI technology "makes women the face of glitches and errors" and forces the female personality to defer questions to a higher "and often male" authority.

UNESCO's director of gender equality Saniye Gülser Corat said this "hardwired subservience" was showing people how to speak to women and teaching women how to respond.
"Obedient and obliging machines that pretend to be women are entering our homes, cars and offices," Ms Corat said.

"To change course, we need to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”

The UN is now calling on governments and tech giants to stop making digital assistants default to female voices and explore the possibility of developing a "neutral machine gender" that sounds neither male nor female.

Published 23 May 2019 8:05am
Updated 23 May 2019 8:09am
By Claudia Farhart
Source: SBS News