
Excerpt

# Siri and Alexa Reinforce Gender Bias, U.N. Finds (in The New York Times)

Why do most virtual assistants that are powered by artificial intelligence — like Apple’s Siri and Amazon’s Alexa system — by default have female names, female voices and often a submissive or even flirtatious style?

The problem, according to a new report released this week by Unesco, stems from a lack of diversity within the industry that is reinforcing problematic gender stereotypes.

“Obedient and obliging machines that pretend to be women are entering our homes, cars and offices,” Saniye Gulser Corat, Unesco’s director for gender equality, said in a statement. “The world needs to pay much closer attention to how, when and whether A.I. technologies are gendered and, crucially, who is gendering them.”

...

“Siri’s ‘female’ obsequiousness — and the servility expressed by so many other digital assistants projected as young women — provides a powerful illustration of gender biases coded into technology products,” the report found.

Amazon’s Alexa, named for the ancient library of Alexandria, is unmistakably female. Microsoft’s Cortana was named after an A.I. character in the Halo video game franchise that projects itself as a sensuous, unclothed woman. Apple’s Siri is a Norse name that means “beautiful woman who leads you to victory.” The Google Assistant system, also known as Google Home, has a gender-neutral name, but the default voice is female.

Baked into their humanized personalities, though, are generations of problematic perceptions of women. These assistants are putting a stamp on society as they become common in homes across the world, and can influence interactions with real women, the report warns. As the report puts it, “The more that culture teaches people to equate women with assistants, the more real women will be seen as assistants — and penalized for not being assistant-like.”

...

Women are grossly underrepresented in artificial intelligence, making up 12 percent of A.I. researchers and 6 percent of software developers in the field.

The report noted that technology companies justify the use of female voices by pointing to studies that showed consumers preferred female voices to male ones. But lost in that conversation is research showing that people like the sound of a male voice when it is making authoritative statements, but a female voice when it is being “helpful,” further perpetuating stereotypes.

Experts say bias baked into A.I. and broader disparities within the programming field are not new — pointing to an inadvertently sexist hiring tool developed by Amazon and facial recognition technology that misidentified black faces as examples.

“It’s not always malicious bias, it’s unconscious bias, and lack of awareness that this unconscious bias exists, so it’s perpetuated,” said Allison Gardner, a co-founder of Women Leading in A.I. “But these mistakes happen because you do not have the diverse teams and the diversity of thought and innovation to spot the obvious problems in place.”

But the report also offers guidance on education and concrete steps to address the issues, measures that equality advocates have long pushed for.

Dr. Gardner’s organization works to bring women working in A.I. together with business leaders and politicians to discuss the ethics, bias and potential for legislative frameworks to develop the industry in a way that is more representative.

...

Dr. Gardner said that changes are also needed in education, because the bias is a symptom of systemic underrepresentation within a male-dominated field.

“The whole structure of the subject area of computer science has been designed to be male-centric, right down to the very semantics we use,” she said.

Although women now have more opportunities in computer science, many disappear from the field as they advance in their careers, a trend known as the “leaky pipeline.”

“I would say they are actually being forced out by a rather female-unfriendly environment and culture,” Dr. Gardner said. “It’s the culture that needs to change.”