should be hired at a rate that’s at least 80 percent of “the rate for the group with the highest [hiring] rate,” according to the human resources firm Prevue HR.
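That threshold is the widely cited “four-fifths rule.” As a rough sketch of the arithmetic (the group names and hiring rates below are invented for illustration, not drawn from Prevue HR), the check reduces to comparing each group’s hiring rate against the top group’s:

```python
# Illustrative "four-fifths" (80 percent) adverse-impact check.
# The hiring rates here are made up for this example.
hiring_rates = {"group_a": 0.60, "group_b": 0.45}  # hires / applicants, per group

highest = max(hiring_rates.values())
for group, rate in hiring_rates.items():
    ratio = rate / highest
    print(f"{group}: rate={rate:.2f}, ratio to top group={ratio:.2f}, "
          f"below 80% threshold={ratio < 0.80}")
```

In this made-up example, group_b’s ratio works out to 0.75, so a company applying the rule would flag that gap for review.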
More corporate HR departments are using algorithms to automate how they filter candidates into categories after job interviews, and even how the companies make hiring decisions, says Aws Albarghouthi, an assistant professor in UW-Madison’s computer sciences department. He says those shifts have sparked researchers’ interest in what has become a hot topic in tech: the question of whether algorithms are biased, and if they are, how best to address that bias.
Albarghouthi, who is not involved with Moonshot Insights, is one of four UW-Madison researchers who last year received a three-year, $1 million grant from the National Science Foundation to continue studying ways to remove bias from computer algorithms.
The software Albarghouthi and his colleagues are developing, which they call FairSquare, could eventually be used to certify automated decision-making programs as fair. While he says there’s not really a straightforward definition of fairness when it comes to algorithms, Albarghouthi uses a hiring-related example to explain how FairSquare might one day be used.
“If you think of a hiring case where you have black applicants and white applicants, the idea is to make the algorithm oblivious to race by effectively removing it from the data,” he says. Specifically, that could mean “fuzzing the data a little bit, such that the algorithm cannot tell whether someone is black or white—even through proxies like ZIP code,” he says.
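FairSquare itself is a verification tool, and its internals aren’t reproduced here. But the kind of preprocessing Albarghouthi sketches, dropping the protected attribute and blurring known proxies, might look roughly like the following. The field names are hypothetical, and identifying proxies in real data is itself a hard statistical problem:

```python
import random

def strip_and_fuzz(applicant: dict) -> dict:
    """Hedged sketch of the idea Albarghouthi describes: remove the
    protected attribute outright, then fuzz known proxies for it.
    Field names ("race", "zip_code", "salary") are hypothetical."""
    cleaned = dict(applicant)
    cleaned.pop("race", None)  # make the algorithm oblivious to race
    if "zip_code" in cleaned:
        # Coarsen the ZIP code to its 3-digit prefix so neighborhood-level
        # demographics are harder to reconstruct.
        cleaned["zip_code"] = str(cleaned["zip_code"])[:3] + "xx"
    if "salary" in cleaned:
        # "Fuzzing the data a little bit": add small random noise to a
        # numeric field that can act as a proxy.
        cleaned["salary"] = round(cleaned["salary"] * random.uniform(0.95, 1.05))
    return cleaned
```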
Treige, of Moonshot Insights, recently spoke at a conference organized by the Corporation for a Skilled Workforce in Washington, D.C. While he was on stage, several people in the audience asked him whether the algorithms Moonshot and other software companies are developing will perpetuate biases, he says, and worsen what many view as a problematic status quo.
Treige says he can’t speak for every CEO of a company developing digital hiring tools like his. But his hope is that Moonshot’s products will have the opposite effect and help reduce the level of bias in organizations’ hiring decisions. For example, he says, instead of seeing a standard resume or a recruiter’s feedback from an interview, a hiring manager could be shown a de-identified profile listing the candidate’s qualifications for the job, and little else.
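Moonshot hasn’t published how such a view would be assembled, but one hedged sketch of the idea, keeping only job-relevant fields and stripping identity cues (all field names here are hypothetical), is:

```python
# Hypothetical sketch: build a de-identified candidate summary for a
# hiring manager, keeping qualifications and dropping identity cues.
IDENTIFYING_FIELDS = {"name", "email", "photo_url", "address", "school"}

def deidentified_view(candidate: dict, relevant_fields: list) -> dict:
    return {
        field: value
        for field, value in candidate.items()
        if field in relevant_fields and field not in IDENTIFYING_FIELDS
    }

candidate = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "years_experience": 6,
    "skills": ["SQL", "Python", "Tableau"],
    "certifications": ["PMP"],
}
print(deidentified_view(candidate, ["years_experience", "skills", "certifications"]))
# -> {'years_experience': 6, 'skills': [...], 'certifications': ['PMP']}
```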
“This is a transitional point” in time, Treige says. In his view, the first step toward improving things is to “understand what the current conditions are around bias in the workplace, resumes and interviews being a huge problem at the crux of that.”