Recruiting Algorithms: Appraising their Limits and Benefits

Hiring algorithms have been in the headlines lately -- and for good reason. Study after study, including a recent whitepaper published by the National Bureau of Economic Research, suggests that number-crunching can produce higher-quality hires than recruiters and hiring managers do.

What’s not to like about the promise of greater objectivity in candidate assessment, reduced bias, and the ability to identify the applicants most likely to become productive, long-tenured employees?

Yet replacing human judgment with fancy analytics, trendy big data, or a spreadsheet tally of qualifications raises thorny questions in organisational psychology.

So let’s take a look at what goes into hiring algorithms and the complex issues that arise when human foibles are set aside and data analytics are relied upon to select the best candidate.

The starting point: It’s not easy to hire the right people and keep them

Many business veterans want to believe they've developed an eye for candidates who will succeed. Yet countless organisations have found out the hard way that a hiring manager’s instinct and a recruiter’s experience often fall short.

“For many years we struggled with recruitment and retention,” says Carlyle Walton, CEO of Metroplex Health System in Killeen, Texas. “We were always looking for tools to improve this.”

So Metroplex decided to harness hiring algorithms to do much of the heavy lifting. After choosing a product, the next hurdle was to persuade managers to use it by involving them in the implementation process.

“This journey started with us having managers identify the top performers,” says Walton. “This helped to get buy-in. We have about half of departments doing it.”

Managers pitching the implementation of hiring algorithms will indeed “encounter a lot of resistance,” says Nathan Kuncel, a professor of psychology at the University of Minnesota. “We can get a lot of good from the algorithm, but still keep it acceptable to people by emphasising the roles that they continue to play in the process.”

For the time being, Metroplex is allowing use of the software to spread organically through the organisation. Each department decides whether to use Pegged Software’s solution. A department sets its own threshold with the software, says Walton, in terms of the level of probability that a given candidate will make a good employee.
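As a rough illustration of the department-set threshold Walton describes -- the scoring scale, names, and cutoff values below are invented for the sketch, not details of Pegged Software’s product -- a probability-based screen might look like:

```python
def screen_candidates(candidates, threshold):
    """Return candidates whose predicted probability of becoming a good
    employee meets a department-chosen threshold.

    `candidates` maps candidate name -> probability (0.0 to 1.0), as
    produced by some upstream scoring model."""
    return [name for name, p in candidates.items() if p >= threshold]


# Invented scores for illustration.
scores = {"A. Lee": 0.82, "B. Ortiz": 0.64, "C. Park": 0.71}

# A stricter department might require 0.75; a more lenient one, 0.60.
print(screen_candidates(scores, 0.75))  # -> ['A. Lee']
print(screen_candidates(scores, 0.60))  # -> ['A. Lee', 'B. Ortiz', 'C. Park']
```

The point of the sketch is that the model’s output stays the same; each department tunes only how high the bar is set.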

Algorithms have their limits, but they do boost quality of hiring

Another key issue is the range of roles an algorithm can cover. “We use the software for all positions in a hospital below the executive level: nurse, admitting clerk, pharmacy, and so on,” says Mike Rosenbaum, CEO of Pegged.

Algorithms don’t work for executive recruitment -- at least not yet -- because the small number of positions at that level yields too little data to model, according to Rosenbaum.

Walton believes the use of hiring algorithms has improved Metroplex’s workforce. “We have seen reductions in turnover and increases in quality scores since we started using software to evaluate candidates; this is one piece of what has contributed to these improvements.”

Taking people out of hiring decisions doesn’t eliminate bias liability

For many employers, one of the top reasons for adopting hiring algorithms is the very human inclination, whether conscious or not, to hire people who are like oneself. And that can be illegal if the similarity is about race, gender, religion, disability or other characteristics that define protected classes.

“If a company is screening out applicants on the basis of a computer algorithm, the managers need to make sure the algorithm has been validated,” says Heather Morgan, global chair of the workforce data and technology practice at law firm Paul Hastings.

“You’ve got to do your due diligence of asking the vendor the right questions,” says Morgan. “It’s hard to validate something that’s continuously learning and changing,” and some hiring algorithms do that.
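One concrete check behind the validation Morgan describes is the “four-fifths rule” from the US EEOC’s Uniform Guidelines, which compares selection rates across demographic groups. The sketch below is a simplified illustration with invented numbers, not legal advice or a description of any vendor’s audit process:

```python
def adverse_impact_ratio(selected_a, applicants_a, selected_b, applicants_b):
    """Ratio of the lower group's selection rate to the higher group's.

    Under the four-fifths rule, a ratio below 0.8 is a common red flag
    for adverse impact that warrants further investigation."""
    rate_a = selected_a / applicants_a
    rate_b = selected_b / applicants_b
    lower, higher = sorted([rate_a, rate_b])
    return lower / higher


# Invented example: group A passes the screen at 30%, group B at 50%.
ratio = adverse_impact_ratio(30, 100, 50, 100)
print(round(ratio, 2))  # -> 0.6, below the 0.8 threshold
```

A continuously learning algorithm complicates this, as Morgan notes: a screen that passed such a check last quarter may drift, so the ratio has to be recomputed on an ongoing basis rather than validated once.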

Morgan further advises that human resources professionals need training from legal on how algorithms should be incorporated into hiring procedures, what the risks are, and how the process should be monitored in order to maintain a legal hiring process.

What about that temptation to overrule the algorithm?

Who wants to hire a candidate -- top-rated by a bloodless software application -- who struck the hiring manager as somehow just not right for the job? Almost no one. This tension presents a conundrum for HR and company executives.

“There’s a temptation to think ‘My gut can make a better evaluation than Pegged,’ ” says Walton. “But in our experience, in the three cases where our managers overruled the software, the new employee separated within 90 days. The software is a powerful tool if the leader embraces it.”

Recruiting firms that use algorithms may be especially reluctant to suggest that number crunching should trump human judgment. “The algorithm just provides another data point,” says Mike Distefano, a senior vice president at recruiter Korn Ferry. “At the end of the day, you go with your gut.”

As with any questionable departure from established HR processes, exceptions should be documented. “Decision makers should be required to write a justification for overriding a hiring algorithm,” says Kuncel.


Source: John Rossheim (n.d.). Recruiting Algorithms: Appraising their Limits and Benefits [blog post]. Retrieved from: