“The benefit of computer automation isn’t just efficiency or cutting costs. Humans evaluating job candidates can get tired by the time applicant No. 25 comes through the door. Those doing the hiring can discriminate. But algorithms have stamina, and they do not factor in things like age, race, gender or sexual orientation. ‘That’s the beauty of math,’ Salazar says. ‘It’s blind.’”
Jobaline helps companies decide whom to hire based on voice analysis. But contrary to the company's claims, that isn't inherently fair. Machine learning algorithms that pattern-match against human preferences are likely to replicate the prejudicial decisions of the humans they learn from. In this case, that's likely to lead to automated linguistic profiling: discrimination based on accent and dialect as correlated with race and ethnicity. The algorithms may be literally blind to the race of the applicants, but that won't keep them from making discriminatory hiring decisions.
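To make the proxy-discrimination argument concrete, here is a minimal sketch with entirely made-up data and feature names (this is not Jobaline's actual system or data): a model is trained to mimic biased human hiring decisions, never sees group membership as an input, yet still hires the two groups at very different rates, because an "accent" feature correlated with group membership leaks that information.

```python
import math
import random

random.seed(0)

def make_applicant(group):
    # Both groups have the SAME skill distribution...
    skill = random.gauss(0.0, 1.0)
    # ...but the hypothetical "accent score" correlates with group (the proxy).
    accent = random.gauss(1.0 if group == "B" else 0.0, 0.5)
    return {"group": group, "skill": skill, "accent": accent}

applicants = [make_applicant(random.choice("AB")) for _ in range(2000)]

# Biased human labels: evaluators penalize accented speech,
# even though accent is unrelated to skill here.
for a in applicants:
    a["hired"] = a["skill"] - 1.5 * a["accent"] > 0

# The "blind" model: logistic regression trained only on skill and accent,
# never on group membership. Plain gradient descent, stdlib only.
w_skill, w_accent, b = 0.0, 0.0, 0.0
lr, n = 0.1, len(applicants)
for _ in range(200):
    gs = ga = gb = 0.0
    for a in applicants:
        z = w_skill * a["skill"] + w_accent * a["accent"] + b
        p = 1.0 / (1.0 + math.exp(-z))   # predicted hire probability
        err = p - a["hired"]
        gs += err * a["skill"]
        ga += err * a["accent"]
        gb += err
    w_skill -= lr * gs / n
    w_accent -= lr * ga / n
    b -= lr * gb / n

def hire_rate(group):
    members = [a for a in applicants if a["group"] == group]
    hired = sum(
        1 for a in members
        if w_skill * a["skill"] + w_accent * a["accent"] + b > 0
    )
    return hired / len(members)

print(f"hire rate, group A: {hire_rate('A'):.2f}")
print(f"hire rate, group B: {hire_rate('B'):.2f}")
# Despite identical skill distributions, group B is hired far less often:
# the model learned the evaluators' accent penalty, not the applicants' skill alone.
```

The model is "blind" in exactly the sense Salazar means (group membership is not a feature), and it discriminates anyway, because it faithfully reproduces the biased labels it was trained on.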
While we can’t blame Jobaline for trying to tout the wonders of their “algorithm” (and yes, those are scare quotes), we’d expect NPR to be at least a little more critical of claims that “[math is] blind”. At the very least, the reporters should read all the things that Cathy O’Neil has been saying for a very long time.