
Researchers find conventional voice AI overlooks trans and nonbinary users

Voice-activated AI (VAI) is increasingly ubiquitous, whether in the form of conversational assistants or more generalized personal assistants like Alexa, Google Assistant, and Siri. Researchers to date have studied the social consequences of their design and deployment, with one focus being the social implications of gendering VAIs. (Many VAIs, including Google Assistant, have feminine-sounding synthetic voices set by default.) A forthcoming study to be published by the Association for Computing Machinery aims to advance previous work by undertaking a series of interviews with trans and/or nonbinary users of VAIs, a historically understudied group, to explore their experiences, wants, and needs. The coauthors say the results show that these needs go far beyond improvements to representation, and that users raise concerns around the framing of gender — even by well-intentioned developers.

As the researchers point out, voices are important elements of interaction to many trans people because of how deeply gendered different styles of speaking can be. An incongruence in voice can serve as a source of pain for a trans and/or nonbinary person’s sense of self. Moreover, voice can serve as a way in which trans people are identified as trans, potentially leading to discrimination and violence. A 2015 survey found that 46% of transgender people had experienced verbal harassment, 47% had been sexually assaulted, and 54% had experienced violence from a partner.

The researchers conducted interviews with a group of 15 participants recruited through a call to LGBTQ+ community centers along with LGBTQ+ Facebook groups based in two cities. They asked questions centered on three core themes: (1) the participants’ experiences of being represented by VAIs, (2) their suggestions for VAI development, and (3) tensions with the expanding role of VAIs.

Thirteen out of the 15 participants were negative about the representativeness of VAIs’ voices, with 11 stating that VAIs were not designed for them. They pushed back against the idea that gender should be treated as equivalent to “where your voice falls within a stereotypical range of pitch,” but at the same time, they proposed alternative forms of representation, such as providing a wide spectrum of “ungendered” voice options.

The participants in the study also worried about how technologies like VAIs might amplify the hardships they experience on a daily basis. Several had mixed feelings about the idea of a system featuring trans representation in voice and gender-affirming design, expressing privacy concerns and a skepticism about developers’ ability to deliver on their promises. A single fixed voice aimed to represent trans and/or nonbinary people, the participants said, would be a reduction of the diversity within the community and perpetuate the notion that specific vocal ranges are linked to specific genders.

“Implicit in our participants’ understanding of representation is the importance of recognition: a way of affirming the humanity, moral agency and equality of each other – as individuals, and as communities,” the researchers wrote. “This approach chafes with our participants’ understanding of how VAI designers approach representation. They worry that designers’ understanding contains only a surface-level approach to trans needs, visible in their focus on developing gender-neutral voices.”

The researchers cite Q, a project spearheaded by media company Vice and its for-profit spinoff Virtue, as an example of poorly representative, exclusive design. While Q was billed as the “world’s first nonbinary voice for technology” when it was announced in early 2019, the coauthors argue its design process — which partly entailed recording six people identifying as male, female, transgender, or nonbinary — “raises as many questions as it answers.” The developers behind Q appear to treat trans voices as representing a “monolithic population” rather than those of men and women. Furthermore, by treating nonbinary voices as a “third gender option,” Q risks reinforcing fixed, categorical ideas of gender, the researchers say.

To address these concerns and others, the coauthors recommend that VAIs, at a minimum, be designed with and for trans-specific privacy considerations, have features serving trans-specific purposes, and be representative and gender-affirming. Most importantly, they say the development of VAI features must follow a grounded and participatory process.

“The development of VAIs that explicitly ‘move beyond merely ‘allowing’ trans people to exist’ would help bridge disparities experienced by this population under structural cisnormativity by providing material improvements that enhance their … comfort in identity, selfhood, embodiment, sexuality,” the researchers said. “Considering constraints within the commercial contexts where these devices are currently developed, we suggest that researchers and technologists go further and work outside these structures towards developing grassroots VAIs driven by and accountable to trans communities, while employing strategies for disrupting the aggravation of digital privacy breaches through the use of voice analytics.”


Mobile – VentureBeat
