Is Amazon's Alexa inadvertently reinforcing gender stereotypes? A recent study examines the popular virtual assistant's design, suggesting that it is coded to present a female persona and, in doing so, embeds gendered expectations within its various skills.

Is Alexa Sexist?

A new study led by Dr. Lai-Tze Fan, a professor at the University of Waterloo and Canada Research Chair in Technology and Social Change, suggests that Amazon's virtual assistant, Alexa, may exhibit signs of gender bias and reinforce traditional gender norms. 

(Photo: INDRANIL MUKHERJEE/AFP via Getty Images)
In this photo taken on February 6, 2019, an Indian student watches classmates put a wig onto a teaching virtual assistant mannequin fitted with Amazon's "Alexa" -- a cloud-based intelligent voice service -- at the Ramakrishna Paramhansa Marg BMC school in Mumbai.

The research aims to explore how Alexa's design may inadvertently reflect and perpetuate gendered stereotypes and expectations. Fan analyzed numerous voice-driven skills integrated into Alexa to uncover patterns that may indicate gendered design. 

The primary objective was to reveal how the technology's inherent design influences and, in turn, is influenced by traditional notions of feminized labor and sociocultural expectations.

In her investigation, Fan set out to show how Alexa's design presents a female persona, which in turn builds gendered expectations into the code and user experiences of various Alexa skills.

"While users have the option to change the voices of Alexa, Siri, and other AI assistants, research shows that male-presenting voices have not been as popular. In addition, developments in gender-neutral voices have not been integrated into the most popular interfaces," Fan said in a statement.

The study employed techniques similar to reverse engineering to understand aspects of Alexa's closed-source code within the boundaries of fair dealing laws.

Typically, virtual assistants like Alexa operate by interpreting user commands through text or voice, triggering predefined scripts to perform specific tasks. As of mid-2022, Alexa boasted over 100,000 skills covering a range of activities, from household chores to entertainment.
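To illustrate the pattern, below is a minimal sketch of a custom skill handler written with Amazon's ASK SDK for Python (the ask-sdk-core package). The intent name and the spoken reply are hypothetical, invented for illustration; they are not drawn from Amazon's code or from the study.

```python
# Minimal custom-skill sketch using Amazon's ASK SDK for Python
# (ask-sdk-core). The intent name "ChoreReminderIntent" and the reply
# text are hypothetical. Real skills also register an interaction
# model that maps sample utterances to intents.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name
from ask_sdk_core.handler_input import HandlerInput
from ask_sdk_model import Response


class ChoreReminderHandler(AbstractRequestHandler):
    """Answers a hypothetical chore-reminder request with a canned script."""

    def can_handle(self, handler_input: HandlerInput) -> bool:
        # True when Alexa's speech recognition matched this intent.
        return is_intent_name("ChoreReminderIntent")(handler_input)

    def handle(self, handler_input: HandlerInput) -> Response:
        # The spoken reply is a predefined script, fixed at development
        # time rather than generated on the fly.
        speech = "Okay, I'll remind you to do the laundry at six."
        return handler_input.response_builder.speak(speech).response


sb = SkillBuilder()
sb.add_request_handler(ChoreReminderHandler())
# Entry point that AWS Lambda invokes when Alexa routes a request here.
handler = sb.lambda_handler()
```

Because each skill's replies are hard-coded at development time, the scripts themselves become artifacts that a researcher can read closely, which is what makes the kind of analysis Fan performs possible.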

Despite the closed-source nature of Alexa's code, Fan examined snippets of it drawn from Amazon's official software developer console, the Alexa Skills Kit, and GitHub repositories containing open samples of Amazon-developed code.

The analysis extended to third-party user-developed skills that provided additional insights into the technology's responses to user behavior.


How Alexa Responds to Users

The study shed light on how Alexa's design shapes its responses to users who flirt with it, verbally abuse it, or attempt to trick the virtual assistant into accepting misogynistic behavior.
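In code terms, such responses are typically just more predefined scripts. The sketch below is purely hypothetical; the intent names and replies are invented for illustration and do not come from Amazon's code or from the paper. It shows how an assistant's reaction to abusive input is itself a design choice written into a lookup of canned replies.

```python
# Hypothetical illustration only: intent names and replies are invented,
# not taken from Amazon's code or from Fan's paper. The point is that
# how an assistant deflects (or tolerates) abuse is a scripted decision.
SCRIPTED_REPLIES = {
    "ComplimentIntent": "That's nice of you to say.",
    "InsultIntent": "I'd rather not respond to that.",
}


def respond(intent_name: str) -> str:
    # Unrecognized intents fall through to a generic deflection.
    return SCRIPTED_REPLIES.get(intent_name, "Sorry, I'm not sure about that.")


print(respond("InsultIntent"))  # -> "I'd rather not respond to that."
```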

Fan emphasized the importance of critically analyzing the culture of major tech companies, revealing potential exclusions, discrimination, and systemic inequalities within their foundations.

Ultimately, the study argues that understanding how AI designed for assistance and support may unintentionally perpetuate gender norms is crucial for assessing its impact on user behavior in both virtual and real-world social contexts.

The research paper, titled "Reverse Engineering the Gendered Design of Amazon's Alexa: Methods in Testing Closed-Source Code in Grey and Black Box Systems," was published in Digital Humanities Quarterly. 

