The Artificial Intelligence Video Interview Act requires that employers who conduct video interviews and then use artificial intelligence to analyze the video tell the “applicant in writing before the interview that artificial intelligence may be used to analyze the applicant’s facial expressions and consider the applicant’s fitness for the position,” according to the Illinois General Assembly.
The Act, which passed both houses on May 29 and was sent to the Governor June 27 for signing, also mandates that prospective employers give each applicant written information prior to the interview that lays out how the AI works and how it will evaluate applicants. Employers also are prohibited from sharing applicant videos, except with persons whose expertise is needed to evaluate the applicant.
The bill is designed to address the risks posed by hidden biases, says Mark Girouard, a labor and employment attorney for Nilan Johnson Lewis in Minneapolis. His comments appeared in an article from Workforce. "As with any use of AI in recruiting, this law comes from concerns about how observations in the interview correlate to business value,” Girouard notes.
AI analysis of video interviews can help pinpoint, for example, whether candidates who use specific phrases or who speak at a certain speed may be the best fit for a particular role. AI is able to do this, in part, by comparing interviews against data from earlier top candidates.
The potential problem with AI arises if the algorithm learns from data that is inherently biased, Girouard says. The algorithm, for example, can incorrectly associate certain word usage, facial expressions, and even skin color with top achievers, even if those attributes have no real correlation to performance. “If algorithms are trained correctly they shouldn’t replicate bias,” Girouard says. “But if they aren’t, they can amplify disadvantage.”
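The mechanism Girouard describes can be illustrated with a minimal sketch. Assume (hypothetically) a crude linear "hire score" model whose feature weights are learned from past hiring decisions; the data, feature names, and scoring scheme below are invented for illustration only, not drawn from any real hiring system. If past hires tracked youth rather than skill, the learned weights encode that bias:

```python
# Illustrative sketch with hypothetical data: a naive model that learns
# per-feature weights from historical hiring outcomes. Biased history
# (hiring that tracked youth, not skill) produces biased weights.

def train_weights(candidates, hired):
    """Weight each feature by the gap between the average feature value
    of hired vs. non-hired candidates (a crude linear model)."""
    weights = {}
    for f in candidates[0]:
        hired_vals = [c[f] for c, h in zip(candidates, hired) if h]
        rest_vals = [c[f] for c, h in zip(candidates, hired) if not h]
        weights[f] = sum(hired_vals) / len(hired_vals) - sum(rest_vals) / len(rest_vals)
    return weights

def score(candidate, weights):
    return sum(weights[f] * candidate[f] for f in weights)

# Biased history: only young candidates were hired, regardless of skill.
history = [
    {"skill": 0.9, "is_young": 1}, {"skill": 0.4, "is_young": 1},
    {"skill": 0.9, "is_young": 0}, {"skill": 0.8, "is_young": 0},
]
hired = [1, 1, 0, 0]

w = train_weights(history, hired)
older = {"skill": 0.9, "is_young": 0}
younger = {"skill": 0.5, "is_young": 1}
# The model now prefers the less-skilled younger candidate,
# replicating the bias in its training data.
print(score(younger, w) > score(older, w))  # → True
```

The point of the sketch is that the model never sees an explicit instruction to favor youth; the preference emerges entirely from the skewed outcomes it was trained on, which is exactly the failure mode the Illinois law asks employers to disclose and guard against.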
Employers may unintentionally dismiss applicants who fall under a protected characteristic, such as age, if the employer “teaches” its AI system that younger candidates are more favorable than older ones by consistently picking out the younger recruits for jobs, according to John Litchfield, an associate and litigation attorney for Foley & Lardner. Litchfield shared his perspective as the author of an article for The National Law Review.
Whether in Illinois or some other state, employers should follow some best practices, including vetting AI vendors and their software and meeting with their own attorneys to make sure such systems account for employment protections, Litchfield writes. Employers also need to make sure the AI screening analysis does not end up excluding certain groups, does not inadvertently screen out applicants with disabilities, and that proper accommodations are provided. “In the end, while AI can be a useful tool in hiring, it should not be used as a substitute for good judgment and human-to-human interaction somewhere in the hiring process,” Litchfield notes.
Garry Mathiason, an attorney with Littler Mendelson in San Francisco, tells the Society for Human Resource Management that the Illinois legislation would likely be helpful for employers who want their HR departments to use AI. “It can encourage the use of this technology because it starts answering the question of what's permissible,” Mathiason says, adding that the regulatory uncertainty surrounding the use of AI is likely hindering some employers from deploying these systems.
Girouard says he would not be surprised to see more states adopt similar laws. “I think Illinois is leading the way… but other states, like California and New York, are now taking up hiring algorithm legislation, too,” he says.