
Researchers Who Think Voice Assistants Like Siri Perpetuate Gender ‘Stereotypes’ Have A Genderless Solution

Shutterstock image via user Antonio Guillem

Evie Fordham, Politics and Health Care Reporter

A group of researchers who believe tech’s current offering of mainly male and female voice assistants “perpetuates stereotypes” have put money and time into Q, a “genderless” voice assistant.

The artificial intelligence assistant uses a voice with a frequency of around 145 Hertz, which is believed to fall between the typical frequency ranges of male and female voices, according to Geek.com.

Q’s creators ask visitors to their website to share the voice assistant with tech companies like Twitter and Apple. The site’s “About” section states:

Technology companies often choose to gender technology believing it will make people more comfortable adopting it. Unfortunately this reinforces a binary perception of gender, and perpetuates stereotypes that many have fought hard to progress. As society continues to break down the gender binary, recognising [sic] those who neither identify as male nor female, the technology we create should follow.

Who are Q’s creators? The voice assistant is backed by a team including Copenhagen Pride and Vice’s creative agency Virtue. They unveiled Q at South by Southwest in Austin, Texas, on March 11, according to AdWeek.

A man uses ‘Siri’ on the new iPhone 4S after being one of the first customers in the Apple store in Covent Garden on October 14, 2011 in London, England. (Photo by Oli Scarff/Getty Images)

“It’s going to become an increasingly commonplace way for us to communicate with tech,” Project Q collaborator Julie Carpenter, a researcher with the Ethics and Emerging Sciences Group, said, according to WIRED. “Naming a home assistant Alexa, which sounds female, can be problematic for some people, because it reinforces this stereotype that females assist and support people in tasks.”

Follow Evie on Twitter @eviefordham.

Send tips to evie@dailycallernewsfoundation.org.
