Is Siri sexist?

That’s the contention of a new report by UNESCO, which takes as its title one of Siri’s early programmed responses (since deleted) to being called a b***h: “I’d blush if I could.” The report claims that Siri’s “submissiveness in the face of gender abuse remains unchanged since the technology’s wide release in 2011” and that its “female obsequiousness—and the servility expressed by so many other digital assistants projected as young women—provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education.”

Set aside the fact that anyone who uses Siri can opt to have it use a male-sounding voice (other AI assistants, such as Amazon’s Alexa and Microsoft’s Cortana, also offer male voice options) and that some researchers have created a gender-neutral AI voice. In the universe of possible challenges posed by our feelings about new technologies, Siri’s supposed female submissiveness hardly seems worthy of a UN-sponsored report.

But Siri’s supposed sexism is really a cover for the UN to complain about the lack of women in the tech industry. As a UN summary of the report stated, “Today, women are extremely under-represented in teams developing AI tools: women make up only 12 percent of AI researchers, six percent of software developers.” And as Saniye Gulser Corat, UNESCO’s Director of Gender Equality, claimed, as if describing an I, Robot-style invasion, “Obedient and obliging machines that pretend to be women are entering our homes, cars, and offices. Their hardwired subservience influences how people speak to female voices and models how women respond to requests and express themselves.”

Contra Ms. Corat, it’s not a cabal of cavemen AI programmers intent on demeaning women that we should blame for female-sounding AI: it’s ourselves, the consumers. As engineers who helped design Amazon’s Alexa voice told PC Magazine, “We tested many voices with our internal beta program and customers before launching and this voice tested best.”

That’s not a surprise, given research showing that people generally find women’s voices more pleasant than men’s voices. As the late, great Stanford University communications professor Clifford Nass explained in 2011, “It’s much easier to find a female voice that everyone likes than a male voice that everyone likes . . . It’s a well-established phenomenon that the human brain is developed to like female voices.” Nass noted studies of the responses of babies in the womb who “react to the sound of their mother’s voice but not to other female voices” and who “showed no distinct reaction to their father’s voice.”

That preference is why Siri users in the UK complained loudly when Apple initially offered only a male-voiced Siri on the iPhone 4S sold there (it now offers both male and female options).

Market research, not a sexist conspiracy on the part of tech companies, led to more female-sounding voices in AI-enabled devices. Market demand has also led tech companies to offer more options to users, like the array of languages and accents available on GPS programs. People choose what they are comfortable hearing. Some of us would rather be firmly told to “proceed to the route” by a GPS device with an appealing British accent, while others want information about nearby coffee shops from a female-sounding Siri. (No one seems to want an AI helper with the malevolent and decidedly male tone of HAL from 2001: A Space Odyssey.)

If you want to worry about Siri (or other AI-enabled assistants), don’t worry about whether their voices are male or female. Worry about the challenges they pose to privacy. AI-enabled devices are easy to hack, which means they can be used to spy on their owners, much like the Wi-Fi-enabled home security cameras and baby monitors that have also been turned into highly effective spyware. Or be concerned about the unprompted, witch-like laughter that understandably freaked out many owners of Alexa-enabled Amazon Echo devices last year. Or fret about the fact that your smart speaker might “accidentally” send a recording of a private conversation to someone on your contact list.

There are plenty of risks consumers accept when they purchase and use AI-enabled “smart” speakers or software like Alexa or Siri. Contributing to the problem of sexism because of the “gendering of AI voice assistants” isn’t one of them.
