
UN study finds female voice assistants reinforce harmful stereotypes

The report calls on tech companies to stop making AI assistants female by default.

For the record, the appropriate response to being called a slut isn't, "I'd blush if I could." But that's what Siri is programmed to say. According to a report by the United Nations, the fact that most voice assistants are gendered as young women reinforces harmful stereotypes that women are docile and eager to please, even when they're called lewd names.

The report, Closing Gender Divides in Digital Skills Through Education, calls on companies like Google, Amazon, Apple and Microsoft to stop making digital assistants female by default. It'd be even better to make them genderless, the report says -- as Google has attempted to do by labeling its voices red and orange, rather than male or female. The UN also calls on tech companies to address the gender skills gap, noting that women are 25 percent less likely to have basic digital skills than men.

This isn't the first time AI's gender bias has been questioned. Siri even changed her original response to being called a b-word. But, the report states, "the assistant's submissiveness in the face of gender abuse remains unchanged." The document includes a chart showing how the four leading AI voice assistants respond to being called hot, pretty, slutty or a "naughty girl." More often than you'd hope, they express gratitude. In other cases, they respond with a joke or claim they don't understand.

The issue will only become more pressing as AI assumes a greater role in our lives. According to the report, voice-based queries already account for close to one-fifth of mobile internet searches, a figure projected to reach 50 percent by 2020. As the UN puts it, people will soon have more conversations with digital assistants than with their spouses -- hopefully they'll treat both with respect.

Engadget has reached out to Apple, Amazon, Google and Microsoft for comment.