
AI tools to aid more diverse recruitment are ‘pseudoscience’, researchers say

Artificial intelligence used by recruitment platforms to help boost diversity in the workplace can discriminate against candidates based on irrelevant changes in clothing or lighting, a new study has warned.

A growing number of businesses have turned to AI-powered software to sift through applications and analyse candidate interviews, with the aim of boosting diversity and finding people who are a better cultural fit for employers.

Vendors of such software claim it helps combat unconscious bias and identifies the most compatible personality for a role. However, a new study from the University of Cambridge warns that these systems can be swayed by irrelevant details, such as the clothes people wear, the lighting in the room they appear in and even what is visible in the background of video interviews.

The researchers warn businesses that some uses of the technology are little better than “automated pseudoscience”.

The study, by the University of Cambridge’s Centre for Gender Studies and published in the journal Philosophy & Technology, said the use of such AI tools was a dangerous example of “technosolutionism” – turning to technology for a quick fix to a deep-rooted issue that requires sustained time and investment.


The researchers argue that the AI tools could ultimately increase uniformity in a workplace rather than diversity because many systems are built to search for the fantasy of an employer’s ideal candidate.

They say that some firms promising their software can remove bias are guilty of spreading misinformation.

“We are concerned that some vendors are wrapping ‘snake oil’ products in a shiny package and selling them to unsuspecting customers,” study co-author Dr Eleanor Drage said.

“By claiming that racism, sexism and other forms of discrimination can be stripped away from the hiring process using artificial intelligence, these companies reduce race and gender down to insignificant data points, rather than systems of power that shape how we move through the world.

“While companies may not be acting in bad faith, there is little accountability for how these products are built or tested.

“As such, this technology, and the way it is marketed, could end up as dangerous sources of misinformation about how recruitment can be ‘de-biased’ and made fairer.”
