UN finds voice assistants aren’t helping combat sexist gender stereotypes

Your voice assistant of choice probably has a female-sounding voice. While you may not think much of it, the United Nations Educational, Scientific, and Cultural Organization (UNESCO) believes the A.I.-powered assistants may be reinforcing negative gender stereotypes while also failing to properly rebuff violent and abusive language, leading to some potentially harmful outcomes.

In a paper titled “I’d blush if I could,” researchers from UNESCO explored some of the implicit biases embedded in artificial intelligence research. The paper’s authors suggest that by giving voice assistants traditionally female names and assigning them female-sounding voices by default, companies have unintentionally reinforced negative and regressive ideas about women. This positions women as subservient and in a role where they are expected to do what is asked of them.

Additionally, the paper took a look at how voice assistants respond when presented with abusive language. What the researchers found is that the A.I. deflects such comments rather than taking any sort of action to discourage them. If a user threatens a voice assistant, it often produces a silly joke or a dismissive message. Researchers at UNESCO believe tech companies should build safeguards into these systems that would help diminish abusive language directed toward the female-sounding voices. By failing to do so, the researchers argue, companies run the risk of normalizing behavior such as making violent threats against women.

Part of the problem, according to UNESCO, is that most tech companies have engineering teams staffed overwhelmingly by men. Because they likely have no direct experience with this type of language being aimed at them, nor have they been the targets of the negative stereotyping and antiquated gender roles that women deal with, they aren’t necessarily thinking about these things when designing A.I. systems.

“Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built A.I. systems that cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” the researchers write in their report. “Because the speech of most voice assistants is female, it sends a signal that women are … docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’. The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility.”

UNESCO believes the best solution to the problem is to create a gender-neutral voice that A.I. assistants can use. The organization also suggests building in responses and systems that would shut down and discourage aggressive and violent language and insults. Researchers believe tech companies should also stop positioning voice assistants as subservient, human-like helpers in order to avoid perpetuating harmful stereotypes.
