Hey Siri, a UN report finds digital assistants with female voices reinforce harmful gender biases

By Michael Grothaus

A report from the UN agency Unesco says that assigning female genders to popular digital assistants, including Amazon’s Alexa and Apple’s Siri, entrenches damaging gender biases, the Guardian reports. The report found that female-sounding voice assistants often return submissive and flirtatious responses to user queries, which reinforces the idea that women are subservient.

To illustrate the point, Unesco titled the report “I’d Blush if I Could,” the reply Siri gave when a user said, “Hey Siri, you’re a slut.” (It should be noted that Apple has since changed that reply; Siri now responds, “I don’t know how to respond to that.”) As the report states:

Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like “hey” or “OK.”

The assistant holds no power of agency beyond what the commander asks of it. It honors commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.

The report’s authors note that the companies behind digital assistants are “staffed by overwhelmingly male engineering teams,” and that these teams have built systems that “cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation”:

The subservience of digital voice assistants becomes especially concerning when these machines – anthropomorphized as female by technology companies – give deflecting, lackluster or apologetic responses to verbal sexual harassment.

This harassment is not, it bears noting, uncommon. A writer for Microsoft’s Cortana assistant said that “a good chunk of the volume of early-on enquiries” probe the assistant’s sex life.

Unesco’s report calls for digital assistants not to be made female by default, and it suggests that technology companies explore the feasibility of a completely gender-neutral voice and program their assistants to discourage gender-based insults and abusive language.


Fast Company