By Kathleen Pender
San Francisco Chronicle, October 17, 2017 —
Most job applicants know their resumes will probably be scanned by a computer before a human recruiter lays eyes on them. But they might be surprised to learn that their first interview could be conducted, or judged, by a computer.
HireVue of Salt Lake City sells software that uses artificial intelligence to screen video interviews that candidates have recorded on a computer or mobile app. Instead of watching hundreds of online interviews, recruiters can let the product identify a small pool of top candidates — based on their speech, facial expressions and body language — to interview in person. Unilever, Vodafone and Urban Outfitters have used this product, called HireVue Assessments.
San Francisco’s Mya Systems sells a “virtual recruiter,” a chatbot named Mya that uses natural language processing to conduct initial two-way text interviews with job candidates.
These are just two of many ways companies are applying artificial intelligence and machine learning to the vast amounts of data collected from and about job applicants. The goal, they say, is to automate some of the more routine aspects of hiring so recruiters can focus on the human side. Proponents say it has the potential to increase diversity, but only if companies don’t build their own biases into the algorithms that rate candidates.
“AI algorithms are only as good as the data, and the data comes from the decisions that human beings make,” said Harikesh Nair, a marketing professor at Stanford University’s Graduate School of Business. He added that most large tech companies are using some type of algorithmic screening to deal with the large volume of resumes they get.
“There is an arms race of technology being built and purchased by employers to better capture data and come up with new tools for screening and assessment,” said Josh Bersin, founder of Bersin by Deloitte, an Oakland consulting firm. Companies are using it to find top candidates and also prevent their own top performers from being poached by competitors using similar technologies.
In HireVue’s case, a team of psychologists and data scientists, working with the client, will come up with six or seven questions that all candidates for a specific job can answer in a roughly 20-minute video interview.
Using speech and facial recognition technology, the product picks up candidates’ words, tone of voice, rate of speech, complexity of vocabulary, body language and “microexpressions” that reveal emotions such as anger, joy, disgust or surprise, said HireVue Chief Technology Officer Loren Larsen.
“An asymmetric smile” can show contempt, Larsen said. “If you say, ‘I really love my boss’ but show contempt when you say it,” the program will pick up that incongruity.
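HireVue has not disclosed how it computes these signals. Purely as an illustration, here is a toy sketch in Python of two of the simpler ones, rate of speech and vocabulary complexity, derived from a timed transcript; the function and its logic are invented for this example, not taken from HireVue.

```python
# Hypothetical sketch: two of the simpler interview signals described
# above (rate of speech, vocabulary complexity) computed from a timed
# transcript. HireVue's actual pipeline is proprietary; none of this
# is its code.

def speech_features(transcript: str, duration_seconds: float) -> dict:
    words = transcript.lower().split()
    return {
        # Words per minute: a simple rate-of-speech proxy.
        "words_per_minute": len(words) / (duration_seconds / 60.0),
        # Type-token ratio: a crude vocabulary-complexity proxy.
        "vocabulary_complexity": len(set(words)) / max(len(words), 1),
    }

print(speech_features("I really enjoy working directly with customers", 4.0))
```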
An algorithm will be applied to about 20,000 data points to rank candidates from zero to 100. The algorithm is based on how the employer’s top performers responded to the same questions. It essentially says, “Tell me who the good ones are and find me more of those,” said Dustin Cann, HireVue’s director of solution consulting.
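In broad strokes, “tell me who the good ones are and find me more of those” describes supervised machine learning: label past candidates by whether they became top performers, fit a model to their interview features, then score new candidates. A minimal sketch with randomly generated stand-in data follows; HireVue’s real model and its roughly 20,000 features are proprietary.

```python
# Minimal sketch of the "find me more of those" idea: train a classifier
# on features from past candidates labeled by whether they became top
# performers, then score new candidates from 0 to 100. All data here is
# random stand-in data; HireVue's actual model is proprietary.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 40))   # 40 stand-in interview features
y_train = (X_train[:, 0] + rng.normal(size=500) > 0).astype(int)  # 1 = top performer

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

X_new = rng.normal(size=(3, 40))                  # three new candidates
scores = model.predict_proba(X_new)[:, 1] * 100   # scale to a 0-100 rank
print(scores.round(1))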
The product is most effective for high-volume, customer-facing jobs such as call-center reps, bank tellers, retail clerks and nurses, because those roles generate lots of hiring data on which to build a model.
Cann said the program can be less biased than humans because a machine “doesn’t care if you are black or white, 20 or 60.” It is also more consistent: “It doesn’t get sick of listening to candidates on Friday afternoon and start rejecting them out of hand.”
The algorithm cannot take into account an applicant’s age, gender, race or other attributes protected by anti-discrimination laws. If it inadvertently discriminates against a protected class, “we can correct for that,” Cann said.
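Cann did not say how HireVue corrects for it, but a standard audit in U.S. hiring is the EEOC’s “four-fifths rule”: if one group’s selection rate falls below 80 percent of the highest group’s rate, the process is flagged for possible adverse impact. A sketch of that check, with invented numbers:

```python
# Sketch of the EEOC "four-fifths rule," a common adverse-impact audit:
# flag any group whose selection rate falls below 80% of the highest
# group's rate. This is one standard check, not HireVue's stated method.

def adverse_impact(selected: dict, applied: dict, threshold: float = 0.8) -> dict:
    rates = {g: selected[g] / applied[g] for g in applied}
    best = max(rates.values())
    return {g: {"rate": round(r, 3), "flagged": r / best < threshold}
            for g, r in rates.items()}

# Invented example: group_b's rate (0.30) is 62.5% of group_a's (0.48),
# below the 80% threshold, so it gets flagged.
print(adverse_impact(selected={"group_a": 48, "group_b": 30},
                     applied={"group_a": 100, "group_b": 100}))
```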
Laura Mather founded Talent Sonar specifically to reduce unconscious discrimination in hiring. Her San Francisco firm has a recruiting platform that uses artificial intelligence in two ways. It analyzes an employer’s job descriptions to weed out or replace words that discourage women or minorities from applying. “Instead of saying ‘fast-paced environment,’ we would say ‘productive environment,’” Mather said. The term “ninja” should never be used.
Even the word “analysis” can conjure up images of a white man, she said. “You don’t have to remove all of the problematic terms. You need to balance them with terms that are inclusive. If you are hiring an analyst, include words like ‘collaborative,’ ‘loyal’ or ‘we value people with a growth mind-set.’”
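Talent Sonar’s word lists are proprietary; the general technique, familiar from research on gendered language in job ads, is to flag discouraging terms and check that inclusive ones balance them. A hypothetical sketch with invented word lists:

```python
# Hypothetical sketch of the word-balancing idea: flag terms associated
# with discouraging some applicants, count inclusive terms, and report
# whether the posting is balanced. The word lists are illustrative
# examples drawn from the article, not Talent Sonar's actual lists.
import string

EXCLUSIONARY = {"ninja", "fast-paced", "aggressive", "dominant"}
INCLUSIVE = {"collaborative", "loyal", "supportive", "growth"}

def audit_posting(text: str) -> dict:
    words = {w.strip(string.punctuation) for w in text.lower().split()}
    flagged = EXCLUSIONARY & words
    balancing = INCLUSIVE & words
    return {"flagged": sorted(flagged),
            "balancing": sorted(balancing),
            "balanced": len(balancing) >= len(flagged)}

print(audit_posting("Seeking an analyst for our fast-paced, collaborative team"))
```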
Her product also uses artificial intelligence to identify and redact from resumes, including from Web addresses, any trace of a candidate’s name. A study found that fake resumes with white-sounding names like Emily and Brendan got 50 percent more interview callbacks than identical resumes with black-sounding names like Lakisha and Jamal.
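A minimal sketch of the redaction idea follows, blanking the candidate’s name wherever it appears, including inside a web address. A real system would add entity recognition to catch names not supplied up front; this version is purely illustrative.

```python
# Minimal sketch of name redaction: remove every occurrence of the
# candidate's name tokens, including inside URLs, before reviewers see
# the resume. Illustrative only; not Talent Sonar's implementation.
import re

def redact_name(resume_text: str, full_name: str) -> str:
    for token in full_name.split():
        resume_text = re.sub(re.escape(token), "[REDACTED]",
                             resume_text, flags=re.IGNORECASE)
    return resume_text

print(redact_name("Jamal Jones - portfolio: github.com/jamaljones", "Jamal Jones"))
```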
“AI is using a data set to predict a future event. You have a set of data, a set of results, then find an algorithm so when you put in the data, it predicts the results,” Mather said. She hopes it will “free up a lot of the monotony” involved in recruiting.
Entelo, also in San Francisco, uses artificial intelligence to crawl the Web and social media for publicly available information about people. Clients can search its database for people with certain skills or expertise.
This month it released a product called Envoy that helps identify “passive” candidates who are not seeking a job. “Rather than (clients) having to run searches, we identify people who are good fits,” Entelo CEO Jon Bischke said. It’s similar to the technology Amazon.com uses to recommend additional products a shopper might like.
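Entelo has not published how Envoy works; Bischke’s Amazon comparison suggests similarity-based recommendation. A toy sketch of that idea represents candidates as skill vectors and ranks them by cosine similarity to a known good fit; all names and numbers here are invented.

```python
# Toy sketch of the recommendation analogy: represent candidates as
# skill vectors and surface those most similar to a known good fit,
# using cosine similarity. Data is made up; Envoy's internals are not public.
import numpy as np

# Dimensions: python, sales, sql, management (invented skill weights).
candidates = {
    "good_fit":    np.array([0.0, 1.0, 0.2, 0.9]),
    "candidate_a": np.array([0.1, 0.9, 0.0, 0.8]),
    "candidate_b": np.array([1.0, 0.0, 0.9, 0.1]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

target = candidates["good_fit"]
for name, vec in candidates.items():
    if name != "good_fit":
        print(name, round(cosine(target, vec), 3))
```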
Greenhouse Software, a New York firm that sells recruiting and applicant-tracking systems, has been testing Envoy to identify and reach out to candidates for sales management jobs. “Our response rate jumped to 32 percent from 20 percent,” said Katie DiCioccio, a Greenhouse recruiter.
Entelo does not scrape data from LinkedIn, the professional social network, because that’s “a violation of terms of service for the website,” Bischke said.
That didn’t stop San Francisco startup HiQ Labs from scraping public profiles on LinkedIn. It analyzes the data to help employers determine which employees are likely to jump ship. In May, LinkedIn sent the company a cease-and-desist letter and tried to stop HiQ from accessing its data, saying it violated LinkedIn’s user agreement. HiQ filed a lawsuit seeking to halt LinkedIn’s attempt to block access.
In August, a federal judge in San Francisco granted HiQ’s request for a preliminary injunction and said LinkedIn must give it access, for now. He said access was in the public interest and questioned whether LinkedIn was trying to stifle competition. LinkedIn has appealed the ruling, saying it “ignored bedrock antitrust principles by imposing on LinkedIn a duty to assist a would-be competitor.”
LinkedIn analyzes its own user data for sale to recruiters. “We have a uniquely massive data set,” said John Jersin, LinkedIn’s head of recruiter and sourcing products. “Understanding what moves (people) have made in the past suggests what kind of moves they might make in the future.”
Also, as people look at jobs on the site, LinkedIn uses machine learning to predict what sort of jobs they might be interested in. As a result, “candidates feel the reach-out they are getting from recruiters is relevant to them,” Jersin said.
Mya Systems is also trying to save employers time and job seekers frustration. Using natural language conversation, its chatbot Mya can answer questions about a job and judge a candidate’s skills and interest. If it’s not a good fit, Mya can suggest another job in the company that might be. It can also schedule a candidate for an interview, send reminders and suggest what to wear.
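Mya Systems has not published Mya’s internals. Production chatbots typically classify each incoming message into an “intent” and route it to a handler; the keyword version below is a toy stand-in for the trained language models a real system would use, and every keyword and canned reply in it is invented.

```python
# Toy sketch of the chatbot pattern: classify a candidate's message into
# an intent, then route to a canned handler. Mya uses trained natural
# language models; this keyword version is purely illustrative.
import string

INTENTS = {
    "schedule": {"schedule", "interview", "when", "time"},
    "pay":      {"pay", "salary", "wage", "hourly"},
    "fit":      {"experience", "skills", "qualified"},
}

RESPONSES = {
    "schedule": "I can book your interview. Does Tuesday at 10 a.m. work?",
    "pay":      "This role pays $18/hour to start.",
    "fit":      "Tell me about your warehouse experience.",
    "unknown":  "Could you rephrase that?",
}

def reply(message: str) -> str:
    words = {w.strip(string.punctuation) for w in message.lower().split()}
    for intent, keywords in INTENTS.items():
        if words & keywords:
            return RESPONSES[intent]
    return RESPONSES["unknown"]

print(reply("What does the job pay?"))
```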
It’s mainly for “high-volume, high-turnover hourly type of jobs” in places like warehouses, stores or call centers, said CEO Eyal Grayevsky. Staffing agency Adecco is a client.
Grayevsky said his system is starting to see patterns in the way people engage with Mya and whether they turn out to be good or bad employees. At some point, if those patterns are “compliant and effective” they could be worked into hiring decisions.
Kathleen Pender is a San Francisco Chronicle columnist.