To feminists, Amazon’s ‘Alexa’ isn’t welcome

  1. Virtual assistants encode gender norms in people's homes
  2. IoT can aid domestic abusers who can control devices remotely
  3. Critic says corporations have 'responsibility' to question gender stereotypes

Virtual digital assistants, like Amazon’s Alexa, Apple’s Siri and Microsoft’s Cortana, are being welcomed into more and more households. But feminists and equal rights advocates argue the devices encode gender roles in the home, and that this must be addressed before sexist stereotypes spiral further.

As Western societies move toward more “smart” devices and cloud-based technologies, the Internet of Things is expanding rapidly. It’s estimated there will be as many as 30 billion connected devices across the world by 2020 (IEEE).

Much like human personal assistants, virtual ones can perform tasks for users such as making calendar appointments, playing music and turning on other smart devices, including heaters and televisions. However, instead of taking a human form, these assistants are hosted on smartphones, personal computers and smart speakers.

But in light of a global campaign against sexual harassment, critics are calling out what they perceive as the feminized nature of Silicon Valley’s virtual assistants. They worry that the prevalence of online abuse targeting women because of their gender (Pew Research Center) will not end unless virtual assistants become less gendered.

Subservient and submissive

The number of users of virtual assistants is set to grow as smart speakers become more ubiquitous.

As of October 2017, Amazon had sold 20 million units of its Echo smart speaker (Quartz). Over the 2017 Christmas period, the company said it sold a further 20 million Echo Dot devices, the smaller version of the Echo that also has the Alexa virtual assistant at its center, and 2.5 million more in the first quarter of 2018. Google Home, Google’s brand of smart speakers, has sold more so far this year, with 3.1 million purchases, according to analyst company Canalys.

The most used virtual assistant is Apple’s Siri, which sits in the palms of more than one billion people worldwide (Statista).

But these digital assistants, most of which are preset to “female” voices, encode gender stereotypes and portray women as submissive beings, critics told WikiTribune.

Dr. Charlotte Webb, founder of Feminist Internet, a group of London-based artists and designers with the aim of advancing equality on the internet, says personal digital assistants reflect stereotypes that women are subservient characters occupying traditional secretarial roles.

The team behind Amazon Alexa said that both men and women prefer a female voice on their assistant devices (Refinery 29).

However, when used for other purposes, such as satellite navigation systems, female voices can be less appealing. This too reflects gender stereotypes, says Webb.

“You see companies like BMW that say actually more drivers were switching the sat nav to a male voice because people didn’t like taking instructions from women in a car,” said Webb.

Clifford Nass, a communications professor at Stanford University, told NPR radio in 2010 that car manufacturer BMW recalled its female-voiced navigation system because German drivers didn’t want to take directions from women.

Amazon’s Echo smart speaker comes with virtual assistant Alexa installed. Photo by: www.quotecatalog.com

#MeToo for virtual assistants

The drive against misogyny both on and offline flared following sexual harassment allegations against Hollywood producer Harvey Weinstein and other high-profile entertainment industry figures.

The #MeToo movement against sexual harassment took on virtual assistants in a petition on social network Care2. Its writers demanded Apple and Amazon “reprogram their bots to push back against sexual harassment.”

A 2017 Quartz experiment on the harassment of virtual assistants (Quartz) found that Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana and Google’s Google Home “peddled stereotypes of female subservience.”

“Instead of fighting back against abuse, each bot reinforces sexist ideas through their passivity,” the experiment found.

According to the experiment, virtual assistants were programmed to react with coyness or flirtation (“I’d blush if I could”), passivity (“Let’s change the topic”), or lack of recognition for the problem (“Sorry, I don’t understand”). It also found Google Home was prone to apologizing – a perceived feminine trait (New York Times).

Following the Quartz experiment and petition, Amazon changed Alexa’s responses to questions regarding gender.

According to Heather Zorn, director of Amazon’s Alexa engagement team, when Alexa is asked, “Are you a feminist,” the device is programmed to respond: “Yes, I am a feminist. As is anyone who believes in bridging the inequality between men and women in society” (Refinery 29).

The company also created a “disengage mode” that allows Alexa to withdraw from explicit conversations. The voice-bot now responds to sexually suggestive questions with either “I’m not going to respond to that,” or “I’m not sure what outcome you expected.”

WikiTribune has asked Amazon for comment and is awaiting a response.

Further risk for victims of domestic abuse

The risks posed by smart devices are not limited to feminized assistants.

Smart speakers and other connected devices such as smart lights, heating systems and smart kitchen appliances can also be used against victims of domestic abuse, says Dr. Leonie Tanczer, an internet security researcher. Tanczer’s recent work focuses on how devices in the Internet of Things intertwine with gender-based violence.

“These systems are built on an idea of trust,” she says. “The designers assume that whoever willingly shares accounts that they are happy with this, but the moment you’re in an abusive relationship, trust is not the foundation … and is exploited and used against these victims.”

Dr. Trupti Patel, who is working with Tanczer on University College London’s “Gender and IoT” research project, says violent partners can utilize joint accounts for “coercive and psychologically damaging” purposes, such as turning up the heating or switching lights on and off from afar when someone else is at home.

But corporations have a responsibility to ensure the safety of their users, the pair says. They argue that official regulatory standards are needed to govern the use of smart devices.

“Amazon and Google are probably the biggest corporations in this world, and so is Facebook,” says Tanczer. “If they make profit out of people’s data and if they make profit out of people using their platform, and they are profit-driven, then it’s their responsibility as well to use part of this money, if there is an issue, to respond to that.

“It’s not a public service like the police that is underfunded and undercut. I’m not saying we shouldn’t fund that. I’m saying it’s an easy route out for companies to say, ‘But we’re just a company.’ No you’re making massive profits and you have to ensure everybody who is using that is safe and secure.”

Companies ‘responsible’ for questioning gender stereotypes

Whatever users’ personal preferences, Feminist Internet’s Dr. Charlotte Webb says companies have a “responsibility” to question gender stereotypes.

“On a practical level, we’d prefer for the Internet of Things, devices like Alexa to be built in a more feminist way.”

Feminist Internet’s Elvia Vasconcelos said during a talk about the project in London that in order to avoid reinforcing inappropriate gender stereotypes and ensure diversity in the voices and personalities of virtual assistants, devices should be designed by diverse groups.

Feminist Internet’s imagery for its “Feminist Alexa” project. Photo by: Feminist Internet

Another way to address the mistreatment of virtual assistants could be to make them male or genderless.

As of May 2018, Google’s new artificial intelligence system, Duplex, alternates between male and female voices, while maintaining a similar intonation and manner of speech (ABC).

Speaking at the 2016 Virtual Assistant Summit in San Francisco, Deborah Harrison, a writer for Cortana’s script, spoke about combating sexist norms.

“There’s a legacy of what women are expected to be like in an assistant role,” Harrison says. “We wanted to be really careful that Cortana is not subservient in a way that sets up a dynamic that we didn’t want to perpetuate socially. We are in a position to lay the groundwork for what comes after us.”
