Apple’s Siri is no longer a woman by default, but is this really a win for feminism?
PTI, Jul 15, 2021, 10:09 AM IST
As of March 31, 2021, when Apple released the iOS 14.5 beta update to its operating system, Siri no longer defaults to a female voice when using American English. Users must now choose between two male and two female voices when enabling the voice assistant.
This move could be interpreted as a response to the backlash against the gender bias embodied by Siri. But how meaningful is this change, really? Siri has been criticized for embodying several facets of gender bias in artificial intelligence. Digital sociologists Yolande Strengers and Jenny Kennedy argue that Siri, along with other voice assistants such as Amazon Alexa and Google Home, has been developed to carry out "wife work" — domestic duties that have traditionally fallen on (human) wives.
Siri was originally only voiced as female and programmed to not only perform “wifely” duties such as checking the weather or setting a morning alarm, but also to respond flirtatiously. The use of sexualized phrases by Siri has been extensively documented by hundreds of YouTube videos with titles such as “Things You Should NEVER Ask SIRI” (which has more than 18 million views).
Dated gender references
Apple has been criticized for promoting a sexualized and stereotypical image of women that harms gender norms. A 2019 investigation by The Guardian revealed that Apple wrote internal guidelines in 2018 asking developers to have Siri deflect mentions of feminism and other "sensitive topics." It's not clear what the guidelines were for hard-coding flirty comebacks.
The language used by Siri was (and still is) drawn from an already stereotypical language model, combined with jokes hardcoded by developers. A 2016 analysis of popular language models used by software companies noted that their word associations were highly stereotypical. In the study, terms such as philosopher and captain were gendered male, while the opposite was true for terms such as homemaker.
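The kind of word-association audit described above is typically done by measuring cosine similarity between word vectors. The sketch below illustrates the idea with invented two-dimensional toy vectors; real models such as word2vec use hundreds of dimensions learned from large text corpora, and the words and values here are illustrative assumptions, not real model output:

```python
import math

# Toy 2-D "embeddings": the first axis loosely tracks a he-she
# direction, the second a profession-related sense. These vectors
# are invented for illustration only.
vectors = {
    "he":        (1.0, 0.0),
    "she":       (-1.0, 0.0),
    "captain":   (0.7, 0.7),
    "homemaker": (-0.7, 0.7),
}

def cosine(u, v):
    """Cosine similarity between two vectors (1 = same direction)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def gender_lean(word):
    """Positive -> closer to 'he', negative -> closer to 'she'."""
    return cosine(vectors[word], vectors["he"]) - cosine(vectors[word], vectors["she"])

print(gender_lean("captain"))    # positive: leans male in this toy space
print(gender_lean("homemaker"))  # negative: leans female
```

In a real audit the same similarity test is run over a trained model's vocabulary, which is how the 2016 study surfaced associations like captain-male and homemaker-female.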
Legal scholar Céline Castets-Renard and I have been studying the language models used by Google Translate and Microsoft Bing, and our work has revealed similar issues. We input gender-neutral phrases in romanized Mandarin into the translation platforms, forcing the translation algorithms to select a gender in English and French. Without exception, the Google algorithm selected male and female pronouns along stereotypical gender lines. The Microsoft algorithm, by contrast, exclusively selected male pronouns.
The use of models such as these in Siri's algorithm might explain why, when you type in any corporate title (chief executive officer, chief financial officer, etc.), a male emoji is proposed. While this has since been addressed — likely due to criticism — in the latest iOS, if Siri is asked to retrieve a photo of a captain or a programmer, the images served up are still a series of men.
Friendly and flirty
The idea of the perfectly flirtatious virtual assistant inspired Spike Jonze's 2013 movie Her, in which the male protagonist falls in love with his virtual assistant. But it's hard to imagine how biased language models alone could cause a virtual assistant to flirt with users. This behaviour seems likely to have been intentional.
In response to these criticisms, Apple progressively removed some of the more flagrant traits and apparently hardcoded away some of the more offensive responses to user questions. This was done without making too many waves. However, the record of YouTube videos shows Siri becoming progressively less gendered.
One of the last remaining criticisms was that Siri had a female voice, which remained the default even though a male voice had been offered as an option since Siri's 2011 launch. Now, users must decide for themselves whether they want a female or a male voice.
Users don’t know, however, the language model that the virtual assistant is trained on, or whether there are still legacies of flirty Siri left in the code.
Bias is more than voice-deep
Companies like Apple have a huge responsibility in shaping societal norms. A 2020 National Public Media report revealed that during the pandemic, the share of Americans using virtual assistants increased from 46 to 52 percent, and this trend will only continue.
What’s more, many people interact with virtual assistants openly in their homes, which means that biased AIs frequently interact with children and can skew their own perception of human gender relations.
Removing the default female voice in Siri is important for feminism in that it reduces the immediate association of Siri with women. On the other hand, there is also the possibility of using a gender-neutral voice, such as the one released in 2019 by a group led by Copenhagen Pride.
Changing Siri's voice doesn't address the issues rooted in biased language models, which operate regardless of which voice is used. Nor does it address hiring bias within the company, where women make up only 26 percent of leadership roles in research and development.
If Apple is going to continue quietly removing gender bias from Siri, there is still quite a bit of work to do. Rather than making small and gradual changes, Apple should take the issue of gender discrimination head-on and distinguish itself as a leader.
Allowing large portions of the population to interact with biased AI threatens to reverse recent advances in gender norms. Making Siri and other virtual assistants completely bias-free should therefore be an immediate priority for Apple and the other software giants.