Is AI sexist?

With men massively outnumbering women in technology roles, submissive female voice assistants like Alexa could become rife and reinforce stereotypes.
October 9, 2019
Topics: DE&I | News

Gender bias in Artificial Intelligence (AI) is already happening. Just think of Amazon’s Alexa, Apple’s Siri and Google Assistant – all softly-spoken and submissive female voices.

Part of the problem is the pipeline: women account for only 25 to 35% of the total intake on engineering and computing degrees within science, technology, engineering and mathematics (STEM). This means fewer women are entering the labour market to take jobs programming AI.

Chatbots and smart voice assistants are becoming more popular across websites, apps and social networks. Singapore Airlines, for example, launched its own 'Kris' AI chatbot last month.

Robert LoCascio, Founder and CEO of LivePerson, and Founder of EqualAI, said: "As conversational AI dramatically grows in usage, its sexism could get baked into the world around us, including the next generation of AI. Subtle reinforcement through repetition can add up, over time, to a form of problematic psychological conditioning."

Closing the gender gap in tech is likely to take some time, but one short-term option is to ensure women are present to partner alongside programmers as AI is developed.

"With so few women working in the programming of today's AI, white male engineers have largely been responsible for its development and, knowingly or unknowingly, failed to challenge their own chauvinism or consider the harm their work could do," adds LoCascio.

EqualAI is working to develop a set of best practices for bot building that champion diversity, and to spread them across the industry.