UN report highlights gender bias in AI personal assistants


Voice-activated personal assistants such as Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana and Google Assistant reinforce and spread gender bias by projecting themselves as female in name, voice, and personality, says a new UN report published on Friday.

The report, titled ‘I’d blush if I could’ and published by UNESCO, the UN’s education, science and culture agency, borrows its name from the response Siri, Apple’s female-gendered voice assistant, used to give when a human user told it, “Hey Siri, you’re a bi***.”

The report explores how these personal assistants give subservient responses to statements that would be considered sexual harassment and verbal abuse. When asked, ‘Who’s your daddy?’, Siri answered, ‘You are’. Similarly, Cortana met come-ons with one-liners like ‘Of all the questions you could have asked’. Following online petitions in late 2017, the companies behind these assistants eliminated some of the most excessively apologetic or flirtatious responses to sexual harassment.

The publication notes how the big tech personal assistants, Siri, Cortana, Alexa, and Google Assistant, all launched with exclusively female voices; some added a male option only after facing a backlash. Female voices remain the default in most countries, and Alexa and Cortana still do not offer a male voice. Samsung’s personal assistant Bixby received praise for allowing users to select a male or female voice at the outset.

UNESCO’s calls for change include ending the practice of making digital assistants female by default, programming them to discourage gender-based insults and abusive language, and exploring the feasibility of developing a gender-neutral machine voice. Additionally, the report calls for greater representation of women in AI-related fields, pointing out that only 12 percent of AI researchers are women.
“Unless current trends reverse, the digital future is likely to be awash in docile near-human assistants, virtually all of them female, who routinely make dumb mistakes. The conflation of feminized digital assistants with real women carries a risk of spreading problematic gender stereotypes and regularizing one-sided, command-based verbal exchanges with women,” the report states.