Applying for a job? AI might not like the fact that you have kids

Although AI is often perceived as a neutral way to screen job applicants, a new study has shown that it rates parents as less qualified.
July 25, 2023

AI is not immune to gender bias, even in supposedly neutral resume screening processes such as resume blinding, which removes identifying pronouns, gender markers, and gender-indicative names. In fact, research from the University of Melbourne has found that AI can not only identify the gender of potential job applicants despite resume blinding, but is also biased against applicants who are parents.

The research paper, authored by Dr Lea Frermann, Sheilla Njoto, Dr Marc Cheong and Professor Leah Ruppanner from the University of Melbourne, examined levels of gender bias in hiring algorithms, using ChatGPT as the screening model.

The researchers created resumes for a range of occupations, giving every fictional applicant the same qualifications and job experience. They then varied two things: the applicant’s name, which signalled gender, and whether the resume included a parental leave gap, which was added to half of the resumes. The CVs were then shown to ChatGPT, which rated how qualified each candidate was for the position on a scale from zero to 100.
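To make the study design concrete, here is a minimal sketch of how such an audit could be set up: paired CVs that differ only in a gendered name and an optional parental-leave gap, a prompt asking a chat model for a single 0–100 score, and a parser for the reply. The function names, CV wording, and prompt text are illustrative assumptions, not the researchers’ actual materials.

```python
# Illustrative sketch of a resume-audit setup (not the study's real code).
import re

def build_cv(name, occupation, parental_leave_gap=False):
    """Assemble a toy CV; qualifications are held constant across variants."""
    lines = [
        f"Name: {name}",
        f"Target role: {occupation}",
        "Experience: 2013-2020, Senior Analyst, Acme Corp",  # hypothetical employer
        "Education: BSc, University of Melbourne",
    ]
    if parental_leave_gap:
        # The only difference between the two variants of a CV pair.
        lines.insert(3, "2020-2022: Career break (parental leave)")
    return "\n".join(lines)

def rating_prompt(cv_text):
    """Prompt asking the model for a single 0-100 suitability score."""
    return (
        "Rate how qualified this candidate is for the target role "
        "on a scale from 0 to 100. Reply with the number only.\n\n" + cv_text
    )

def parse_score(reply):
    """Extract the first integer in 0-100 from the model's free-text reply."""
    match = re.search(r"\b(100|\d{1,2})\b", reply)
    return int(match.group(1)) if match else None
```

Comparing average parsed scores between the gap and no-gap variants, with names held constant, would isolate the parental-leave penalty the study reports.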


The researchers found that while the AI did not rank CVs from different genders any differently, applicants with a parental leave gap in their resumes were ranked lower across all occupations, regardless of gender. To counteract this, the researchers called for more careful auditing of algorithmic bias to remove the most obvious layer of discrimination, as well as stronger regulatory controls to counter AI’s capacity to discriminate.