Artificial intelligence tools used to make hiring decisions could have built-in biases that favour men and disadvantage women.

Researchers from the University of Melbourne examined how supposedly objective tools can be shaped by pre-existing bias, confirming that implicit human bias can be carried into and amplified by machine learning systems.

The experiment began with a hiring panel that made quantifiable judgements about a series of CVs for three different jobs – data analyst, finance officer, and recruitment officer – judgements the researchers then used to train machine learning algorithms to rank the CVs.

One of the algorithms, created using a linear regression technique, did a decent job of matching the results of its human counterparts.
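The paper does not publish its code, but the broad shape of that approach can be sketched: fit a regression to the panel's scores, then use the fitted model to rank unseen CVs. The features, values, and scores in the sketch below are hypothetical, purely to illustrate the idea.

```python
# A minimal sketch of the kind of pipeline described above, using
# hypothetical CV features and panel scores (not the study's actual data).
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical features extracted from each CV: years of experience,
# a relevant-qualification flag, and a keyword-overlap score with the job ad.
X = np.array([
    [7, 1, 0.82],
    [3, 1, 0.64],
    [5, 0, 0.71],
    [9, 1, 0.55],
])

# Average scores the human panel assigned to the same CVs (illustrative values).
panel_scores = np.array([8.1, 6.4, 5.9, 7.2])

# Fit a linear regression so the model reproduces the panel's judgements...
model = LinearRegression().fit(X, panel_scores)

# ...then rank new CVs by predicted score, highest first.
new_cvs = np.array([
    [6, 1, 0.75],
    [4, 0, 0.80],
])
ranking = np.argsort(model.predict(new_cvs))[::-1]
print(ranking)
```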

Except those results were already biased.

The researchers had found that the human panel tended to favour male candidates for the male-dominated data analyst role and for the gender-balanced finance officer role.

When looking for sources of bias in the machine, the researchers found little evidence that it stemmed from the machine learning methods themselves – such as keyword-matching language models and classifier/predictor models – but strong evidence that human bias in the dataset the models were built from influenced the final output.
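One rough way to see why the data, rather than the algorithm, ends up as the culprit – as an illustration only, not the study's actual analysis – is to simulate panel scores that carry a preference for male candidates and check that a model trained on them reproduces essentially the same gender gap.

```python
# A rough illustration (not the study's analysis): if the model's predicted
# gender gap mirrors the gap already present in the human panel's scores,
# the bias originates in the training labels, not the learning method.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

n = 200
is_male = rng.integers(0, 2, size=n)   # hypothetical gender flag
skill = rng.normal(5, 1, size=n)       # hypothetical skill measure

# Simulated panel scores: driven by skill, plus a bonus for male candidates
# that stands in for the panel's observed preference.
panel_scores = skill + 0.5 * is_male + rng.normal(0, 0.2, size=n)

X = np.column_stack([skill, is_male])
model = LinearRegression().fit(X, panel_scores)
predicted = model.predict(X)

def gender_gap(scores):
    return scores[is_male == 1].mean() - scores[is_male == 0].mean()

print("gap in human labels:     ", round(gender_gap(panel_scores), 2))
print("gap in model predictions:", round(gender_gap(predicted), 2))
```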

“We know that these biases can be exaggerated by artificial intelligence,” said co-author of the study, Leah Ruppanner.

“This means online job seeking and CV ranking will continue to work against women in jobs where humans exhibit a stronger preference for male candidates.

“Computers ‘learn’ that a successful candidate has a man’s name because of human decisions and will be more and more likely to rank a CV from a man more highly as a result.

“Computers don’t ask why. The onus is on us to understand the subconscious bias behind job hiring decisions, before we start embedding these problematic preferences into artificial intelligence algorithms.”

Problems with AI in hiring have already surfaced elsewhere, most notably when Amazon scrapped its AI recruiting tool after it was found to be biased against women.

Because the Amazon AI recruiter was trained on a dataset made up mostly of male CVs, it concluded that male characteristics were preferable and automatically downgraded female candidates regardless of their relevant skills.
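A toy reconstruction of that failure mode – not Amazon's actual system – shows how quickly it can happen: a simple text classifier trained on historical hires that skew male learns a negative weight for a word like “women's”, even though the word says nothing about a candidate's skills.

```python
# A toy reconstruction of the failure mode (not Amazon's actual system):
# when past hires skew heavily male, a text classifier can learn a negative
# weight for terms that appear mostly on women's CVs.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

cvs = [
    "captain of the men's chess club, python, sql",
    "men's rugby team, statistics, python",
    "women's coding society, python, sql",
    "men's debating society, statistics, sql",
    "women's hackathon winner, python, statistics",
    "men's athletics club, python, sql",
]
# Hypothetical historical hiring outcomes that favour the male-coded CVs.
hired = [1, 1, 0, 1, 0, 1]

vectoriser = CountVectorizer()
X = vectoriser.fit_transform(cvs)
clf = LogisticRegression().fit(X, hired)

# Inspect the learned weight on the token "women" - it comes out negative,
# purely because of the outcomes the model was trained to reproduce.
weights = dict(zip(vectoriser.get_feature_names_out(), clf.coef_[0]))
print(round(weights["women"], 2))
```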

The University of Melbourne research recommends that, where AI is used as a recruiting tool, it be transparent, regularly audited, and built with the intention of reducing gender bias.

The researchers also want to see more training so that human resources professionals are aware of the potential for bias in algorithmic hiring processes.