Information Commissioner John Edwards has announced an investigation into the use of artificial intelligence in recruitment, amid concerns that candidates could be unfairly discriminated against. It is commonplace for AI technology to be used in the recruitment process, for tasks such as short-listing candidates, testing applicants, and undertaking initial interviews. However, there have been increasing concerns about the possibility of inherent bias within the AI, depending on the data inputted and the programming underpinning it.
It is likely that any automated discrimination is entirely accidental. For example, using past hiring successes to predict the likelihood of future success will replicate any unconscious human biases in the process, and any system using writing or speech patterns to identify weaker candidates risks penalising people for whom English is not their first language.
However, even accidental discrimination is never acceptable, and companies deploying AI technology in their recruitment process must do their due diligence.
Launching the investigation, Information Commissioner John Edwards said: “Companies may be deploying some of these applications without doing due diligence, and asking the questions about how they are filtering candidates, for example, based on voice. Companies should be asking questions: How does it treat neurodiverse applicants? How does it treat people with hearing impairments, or speech impediments? How does it treat different ethnic groups?
“And if they are not satisfied with the answers they’re getting, then they’re risking data protection consequences, but they’re also risking other liabilities. Most fundamentally, they’re risking not getting the best person for the job.” Importantly, an AI decision to reject a candidate could result in a discrimination claim against the employer, despite that decision not having been made by a human.
To date there has been minimal official oversight of the technology and algorithms deployed, yet the risks of bias are undisputed, so regulators are now playing catch-up. The ICO's decision to investigate is very positive, and we hope it will protect people who might otherwise be discriminated against.