
Innovate with caution: New report warns against AI bias in the recruitment process

Your chances of landing your next job might come down to your ability to impress an artificial intelligence (AI) system, according to a new report. However, it warns that recruiters must innovate cautiously as these systems can be biased against some groups.

The 2023 Hiring Benchmark Report revealed that AI is making inroads into the world of human resources. However, adoption is still in its early stages, with just one in ten hiring professionals relying on the technology to make their decisions, according to the report, compiled by Los Angeles-based talent acquisition and success company Criteria Corp.

The use of AI in hiring is highest in the finance industry at 19% and lowest in manufacturing at 5%. It's also highest in companies with more than 2,500 employees, at 21%, and lowest in companies with fewer than 100 employees, at 9%.

“HR and Recruitment are very people-centric disciplines. Therefore, it’s not too surprising that this industry hasn’t rushed all in on AI,” says the report.

AI has a role to play in recruitment, industry professionals concur. It’s especially handy for organizations that receive thousands of applications. Google (NASDAQ: GOOGL), for instance, hires about 20,000 people each year but receives over three million applications.

The greatest challenge for AI hiring is bias, says Josh Millet, CEO of Criteria Corp. The recruitment industry has always struggled with bias, and poorly designed AI systems risk compounding the problem.

“We could almost not do any worse. Yes, we should be careful, deliberate, and measured about implementing AI systems. But that shouldn’t obscure the fact that there is a massive problem,” Millet told CNBC.

Race and age are the two factors accounting for the most bias. A 2021 study found that applicants with distinctively Black names had a 2.1% lower chance of getting hired in the U.S. than those with distinctively white names.
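
For readers curious how such disparities are quantified in practice, here is a minimal sketch, using made-up callback data rather than figures from the cited study, of the kind of audit many employers run: compute each group's selection rate and the adverse-impact ratio used in the common "four-fifths rule" screen. Neither the data nor the method is drawn from the Criteria report.

```python
# Illustrative sketch only: hypothetical callback records, not real study data.
from collections import Counter

# Each record: (name_group, received_callback)
applications = [
    ("distinctively_white", True), ("distinctively_white", False),
    ("distinctively_white", True), ("distinctively_white", True),
    ("distinctively_black", True), ("distinctively_black", False),
    ("distinctively_black", False), ("distinctively_black", True),
]

totals, callbacks = Counter(), Counter()
for group, called_back in applications:
    totals[group] += 1
    if called_back:
        callbacks[group] += 1

# Selection (callback) rate per group
rates = {group: callbacks[group] / totals[group] for group in totals}
print("Callback rates:", rates)

# Adverse-impact ratio: lowest group rate divided by highest group rate.
# A value below 0.8 is the conventional four-fifths-rule warning threshold.
ratio = min(rates.values()) / max(rates.values())
print(f"Adverse-impact ratio: {ratio:.2f} (flag if below 0.80)")
```

The same calculation applies whether the "selector" is a human recruiter or an AI screening model, which is why auditors compare rates before and after an AI system is introduced.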

“Those aren’t imaginary concerns. Those are absolutely appropriate concerns. The promise of equitable hiring is there. I genuinely would worry about the perpetuation of bias depending on the age and the nature of the data set,” adds Sandra Sucher, a professor at Harvard Business School.

AI bias and other challenges have led most people to oppose the use of the technology in hiring. A study by the Washington, D.C.-based think tank Pew Research Center found that 66% of U.S. adults say they would not want to apply for a job with an employer that uses AI to help make hiring decisions.

In order for artificial intelligence (AI) to work right within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek's coverage on this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: AI is for ‘augmenting’ not replacing the workforce


New to blockchain? Check out CoinGeek’s Blockchain for Beginners section, the ultimate resource guide to learn more about blockchain technology.