
Amazon AI System Not So Smart At Recruiting Women

Artificial intelligence (AI) has generated plenty of buzz when it comes to transforming recruitment, but Amazon recently learned that its AI system was not so smart when it came to gender.

The e-commerce and technology giant had to ditch an artificial intelligence system it used for recruiting after discovering the algorithm-driven program favored men for software development and technical positions, Reuters reports.

Amazon's goal with its AI recruiting system was to streamline hiring and make it more efficient at finding the best job applicants, five sources recently told Reuters. But the system failed badly when it came to evaluating women: it was trained on resumes submitted by applicants over a 10-year span, and most of those resumes came from men.
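
To see how that kind of training-data skew turns into a biased ranking, here is a minimal, hypothetical sketch in Python. It is not Amazon's actual system; the resume tokens, hiring outcomes, and the scoring rule are invented for illustration. A model that scores candidates by how often their resume terms appeared among past hires simply inherits whatever imbalance the historical data contains.

# Illustrative sketch only: a toy screening score learned from
# historical hiring outcomes. Because most past hires in this made-up
# data are men, terms that co-occur with their resumes earn higher
# scores, and other resumes are ranked lower.
from collections import defaultdict

# Hypothetical history: (tokens found in a resume, was the candidate hired?)
history = [
    ({"java", "chess_club"}, True),
    ({"java", "chess_club"}, True),
    ({"python", "chess_club"}, True),
    ({"python", "womens_chess_club"}, False),  # fewer women in the data,
    ({"java", "womens_chess_club"}, False),    # and fewer were hired
]

hires, totals = defaultdict(int), defaultdict(int)
for tokens, hired in history:
    for t in tokens:
        totals[t] += 1
        hires[t] += int(hired)

def score(resume_tokens):
    """Average historical hire rate of the tokens in a resume."""
    rates = [hires[t] / totals[t] for t in resume_tokens if t in totals]
    return sum(rates) / len(rates) if rates else 0.0

# Two equally qualified candidates; the one whose resume mentions a
# women's organization inherits the historical penalty.
print(score({"python", "chess_club"}))         # 0.75 -- ranked higher
print(score({"python", "womens_chess_club"}))  # 0.25 -- ranked lower

A real system would use far richer features and models, but the mechanism is the one the article describes: when past hiring outcomes are the training signal and most past hires were men, the model reproduces that pattern at scale.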

While Amazon had been tapping AI since 2014, it learned a year later that the system was biased toward men for software and other technical jobs. Amazon tried to fix the system to make it more gender neutral, but when it could not guarantee that the machine-learning system would no longer discriminate against women candidates, it ended the effort in early 2017, sources said.

Amazon would not comment on problems with its AI system, but contends that it "was never used by Amazon recruiters to evaluate candidates." It did not provide further details.

LinkedIn, owned by Microsoft Corp., provides employers with a way to use algorithmic rankings of job candidates to determine if they are the best fit for the job. But John Jersin, vice president of LinkedIn Talent Solutions, warns that AI is still not good enough to replace real humans.

"I certainly would not trust any AI system today to make a hiring decision on its own," Jersin says. "The technology is just not ready yet."

Amazon's failure with its AI recruitment tool shows the danger of feeding such systems information that is inaccurate or not reflective of the real-world workforce, HR Technologist reports.

"It all comes down to what kind of data AI is using to make hiring recommendations," says Caitlin McGregor, co-founder and CEO at Ontario, Canada-based Plum, a cloud-based hiring solution provider that focuses on eliminating human bias.

Humans have traditionally reviewed skills and knowledge when gauging job candidates, McGregor says. The problem is that biases can come into play, such as when hiring managers lean favorably toward a candidate whose resume shows they are a Harvard graduate.

"These qualifications tend to point to privilege, not necessarily job fit," McGregor says. "So, when AI-based hiring solutions rely on skills and knowledge, such as resume and social media scraping tools, it's just perpetuating the same biases, but at a larger scale."

AI recruiting tools can be very useful, but that requires acknowledging the fallacy of relying on resumes. "AI can help--but if we're truly going to commit to moving beyond resumes to make the hiring process less biased and more predictive, that means we also have to move beyond AI that simply automates resume keyword matching," McGregor says.

Others warn that however confident an employer may be in its AI system's ability to remove bias from the hiring process, the end result is often far different, The Washington Post reports.

Brian Kropp, group vice president of the HR practice at Gartner, says he knows of numerous companies that were convinced their algorithms "eliminated bias in the hiring process and all they've done is institutionalized biases that existed before or created new ones."

"The idea that you can eliminate bias in your hiring process via algorithm is highly suspect," he added.

And as good as an AI system is, "when it comes down to those final decisions about making a judgment call, that requires intuition," says Michael Gretczko, who heads a human capital practice at Deloitte. And for "now and for some time in the future," a human will outperform a machine.

