AI recruitment tools can amplify gender bias: How to make it fairer


Many recruiters will be familiar with, or are currently relying on, recruitment algorithms as part of their candidate screening process. These algorithms are a set of instructions telling a computer program what to look for in a candidate’s job application. The tools are intended to separate qualified from unqualified applications and make recruitment faster and more effective.

Theoretically, this should remove bias from the early stages of the recruitment process, saving hiring managers time, effort and resources. But a growing body of research reveals this isn’t the case. Depending on their input data, recruitment algorithms have the potential to amplify recruiters’ existing unconscious bias.

Data journalist Catherine Hanrahan recently shared new research from the University of Melbourne revealing how recruitment algorithms can amplify unconscious bias favouring men over women in the job market.


Key findings from new research

In reviewing CVs, a panel of recruiters told the researchers they were looking for relevant experience, education and keywords, for example “bachelor’s” and “human resources.” These criteria were used to generate an algorithm to identify top candidates.
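To make the idea concrete, here is a minimal sketch of the kind of criteria-based scoring the panel described. The keywords come from the article; the weighting, field names and example CVs are illustrative assumptions, not the study’s actual algorithm.

```python
# Hypothetical CV-scoring sketch: keyword matches plus relevant experience.
# The 0.5 experience weight is an arbitrary illustrative choice.

def score_cv(cv_text: str, years_experience: float) -> float:
    """Score a CV on panel-named keywords and years of relevant experience."""
    keywords = ["bachelor's", "human resources"]  # criteria the panel reported
    text = cv_text.lower()
    keyword_score = sum(1 for kw in keywords if kw in text)
    return keyword_score + years_experience * 0.5

# Rank two illustrative applications by score, highest first.
ranked = sorted(
    [("cv_a", "Bachelor's in Human Resources, 3 years in recruitment", 3),
     ("cv_b", "Diploma in business administration, 8 years in payroll", 8)],
    key=lambda c: score_cv(c[1], c[2]),
    reverse=True,
)
```

Note how sensitive the ranking is to the chosen weights: here the candidate with more years of experience outranks the one with both keyword matches, which previews the experience-versus-keywords tension discussed below.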

We assume AI algorithms will do the same job as human recruiters, only faster. However, comparing the algorithm’s CV rankings with the panellists’ revealed the outcomes can be quite different. For one role in particular, a finance role, the panel preferred men regardless of education, relevant experience or keyword match – even when half of the panellists viewed CVs with the genders reversed.

The algorithm detected that men had more relevant experience while women had a better keyword match, yet the panel still preferred men – which demonstrates that recruiters’ stated criteria don’t capture what actually gives men the advantage in these roles. This is a challenge because recruitment algorithms won’t always be programmed by impartial scientists, but by the very recruiters with this demonstrated partiality towards male CVs. That creates the potential to amplify unconscious bias.


Is the solution to use more targeted coding?

Small differences could be having a big impact on recruitment outcomes.

Study author Leah Ruppanner states: “We know that women have less experience because they take time [off work] for caregiving, and the algorithm is going to bump men up and women down based on experience.”

Ruppanner explains how certain things must be “coded in” to an algorithm to ensure it won’t discriminate against women for things such as parental leave.
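One way to picture what “coding in” such a safeguard could mean is an experience score that credits, rather than deducts, protected career breaks. This is a minimal sketch under assumed field names and an assumed adjustment rule, not a description of any real vendor’s tool.

```python
# Hypothetical adjustment: optionally count protected leave (e.g. parental
# leave) toward tenure instead of against it.

def experience_score(career_span_years: float, leave_years: float,
                     credit_leave: bool = True) -> float:
    """Years of experience, with protected leave optionally not deducted."""
    worked = career_span_years - leave_years
    return career_span_years if credit_leave else worked

# Two candidates with identical 10-year career spans; one took 2 years of
# parental leave. Without the adjustment, she scores lower on experience.
raw = experience_score(10, 2, credit_leave=False)
adjusted = experience_score(10, 2, credit_leave=True)
```

The point of the sketch is Ruppanner’s: unless a rule like `credit_leave` is deliberately built in, an experience-weighted algorithm will “bump men up and women down” by default.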

Algorithms simply find associations. AI tools therefore have the potential to either detect or embed bias, creating both risks and benefits.

The bottom line? Without further development, study and refinement, these tools are not yet reliable enough to be used in the later stages of candidate screening.


The way forward (for now)

So, does that mean we should abandon AI and recruitment algorithms? No.

While we have a long way to go with AI, it definitely still has a place in recruitment.

Alcami Interactive founder Jane Bianchini states: “AI sits at the top of the funnel in recruiting. This means finding and sourcing candidates, as it’s useful for processing large volumes of data and making good decisions at this stage in the process.”

Based on scientific research and analysis from data scientists, Bianchini shares that for AI-based recruitment tools to produce meaningful results, huge volumes of data are required.

“Based on scientific experiments involving thousands of test subjects, we could not find any meaningful correlation to replace human judgement. While some vendors express 1,500 written words is enough to provide an indication [of a candidate’s suitability for a role], data scientists state you’re more likely to require 10,000 words.”

To put this in perspective, 10,000 words equates to roughly an hour of speech. Not to mention, data must be taken from the same platform to compare “apples with apples” – for example, work emails only, as opposed to work emails mixed with social media posts.

The recruitment funnel still needs human judgement. As a recruiter, give your hiring team and line managers the best conditions to minimise bias (whether it be gender, appearance or ethnicity). Reduce group-think by limiting feedback between evaluators or try other tactics such as removing names from applications.
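One of the tactics mentioned above, removing names from applications, can be sketched as a simple redaction step before evaluators see a CV. The candidate name, application text and placeholder token are all illustrative assumptions; real blinding tools also need to handle emails, photos and other identifying details.

```python
import re

# Illustrative sketch of blinding an application by stripping the
# candidate's name (full name and each name part) before review.

def redact_names(application: str, full_name: str) -> str:
    """Replace the full name and its individual parts with a placeholder."""
    tokens = [full_name] + full_name.split()
    pattern = re.compile("|".join(re.escape(t) for t in tokens),
                         flags=re.IGNORECASE)
    return pattern.sub("[REDACTED]", application)

blinded = redact_names(
    "Jane Smith\nExperience: 5 years in HR. Jane led a team of four.",
    "Jane Smith",
)
```

Redacting the individual name parts as well as the full name matters: otherwise a later mention like “Jane led a team” would leak the candidate’s identity back to evaluators.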

Psychometric testing and automated video interviews remain the most robust tools for identifying top-ranking candidates. These tools give you a far better understanding of candidates and their capability for the role than relying on a machine at the bottom end of the funnel to decide.

“We’re passionate about seeing new technology evolve in the mid-to-bottom level of the funnel,” said Bianchini.

It’s a welcome area for further experimentation and innovation.