In the world of HR, the Man-vs.-Machine conflict may be less frightening than it sounds. Think less impenetrable borg vs. helpless citizens of Earth and more slow-moving, tin-can robot vs. a couple of guys in polyester jumpsuits toting plastic laser guns. “The idea behind any machinery is to improve our productivity and to get more things done,” said Brooke Tristan, an HR consultant in Bristol, Conn. “It’s easy to see how robotic arms help an assembly-line worker but more difficult to appreciate the benefits of machine-based decision making and process planning.”
That’s why so many HR professionals have trouble placing their trust in algorithms and other standard forms of machine-based decisions. Berkeley J. Dietvorst, from the University of Chicago, and Joseph Simmons and Cade Massey, from the University of Pennsylvania, authored a study on people’s distrust of decisions made by machines, and found that people mistakenly stop trusting algorithms after seeing errors or questionable results. “Research shows that evidence-based algorithms more accurately predict the future than do human forecasters.
Yet, when forecasters are deciding whether to use a human forecaster or a statistical algorithm, they often choose the human forecaster. This phenomenon, which we call algorithm aversion, is costly,” Dietvorst wrote.
Personal personnel bias
It can be especially costly when it comes to HR decisions. “Humans are not as rational as economists would like them to be,” Jed Kolko, chief economist for the job search site Indeed, told attendees of this year’s Indeed Interactive, a conference for talent-acquisition professionals, according to a story by Roy Maurer for the Society for Human Resource Management.
Kolko suggested one way to improve hiring would be to eliminate bias by using machines to help make more decisions. “We know that there is discrimination and bias when it comes to hiring, pay, performance reviews and promotions,” Kolko said. “There are debates over the magnitude of discrimination and bias and the reasons for it, but there is no debate over whether any of it exists in the first place.”
Kolko listed several forms of bias that often go unchecked or unnoticed, including the willingness to promote taller men, the reluctance to hire older women and an over-reliance on shared extraneous information when hiring new employees. “This comes up during hiring when the conversation turns to hobbies or activities the candidate is engaged in,” Kolko said. “Affinity bias is a trickier kind of bias to assess … but it can be hard to tell whether cultural fit is essential to performance or a flavor of affinity bias.”
Removing those biases is key, Kolko believes, which is why he argues that trusting machines and algorithms can have a dramatic impact on creating an effective workforce. “[It removes] the possibility of decisions based on open-ended interviews where you are looking for commonality,” he said.
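Kolko’s point about replacing open-ended interviews can be made concrete with a small sketch. The rubric, weights and candidate fields below are entirely hypothetical, not drawn from any real screening tool; the idea is simply that a fixed scoring function has no avenue to reward shared hobbies or penalize age, because those attributes never enter the computation.

```python
# Hypothetical rubric: every candidate is rated on the same fixed,
# job-relevant criteria, with weights agreed on before any resumes are seen.
RUBRIC = {
    "coding_test": 0.5,          # illustrative weights for an imagined role
    "work_sample": 0.3,
    "structured_interview": 0.2,
}

def score(candidate: dict) -> float:
    """Weighted sum over the rubric; any field not in the rubric is ignored."""
    return sum(weight * candidate[criterion]
               for criterion, weight in RUBRIC.items())

candidate_a = {"coding_test": 90, "work_sample": 70,
               "structured_interview": 80,
               "hobbies": "rock climbing"}   # extraneous field, never read
candidate_b = {"coding_test": 75, "work_sample": 95,
               "structured_interview": 85}

# Rank by score alone; affinity cues like "hobbies" cannot tip the result.
ranked = sorted([("A", score(candidate_a)), ("B", score(candidate_b))],
                key=lambda pair: pair[1], reverse=True)
print(ranked)
```

The point of the sketch is not that the weights are right (choosing them is its own debate) but that the decision is auditable: anyone can check exactly which inputs produced the ranking, which is impossible for a gut feeling formed over a chat about weekend hobbies.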