Diversity & Inclusion

Your hiring algorithm might be racist

Racism maintains its stranglehold on the world, and big data is encouraging it. In a talk about how big data harms the poor, researcher Solon Barocas spoke about the dangers of using technology in the hiring process.

Xerox is one example of a company that hired data scientists to help it staff up. (Xerox regional headquarters office by Ken Wolter via Shutterstock)

If you’re wondering why a company’s staff lacks diversity, you might want to take a look at the computers behind their hiring process.
Corporations are using technology in the hiring process to remedy historical and routine discrimination against applicants, but that same technology can end up reinforcing the discrimination, said postdoctoral research associate Solon Barocas during “The Intersection of Data and Poverty,” a Philly Tech Week 2016 presented by Comcast symposium organized by Community Legal Services and Philadelphia Legal Assistance and held at Montgomery McCracken Walker & Rhoads in Center City. Barocas spoke on a panel, “How Big and Open Data Harms the Poor,” focused on the unintended consequences of data technology for vulnerable populations.
Companies that use machine learning and big data in their hiring process rely on “training data,” typically drawn from prior and current employees. A statistical process then automatically discovers the traits that correlate with high performance in the training data and looks for those traits in applicant pools. “For more and more companies, the hiring boss is an algorithm,” a 2012 Wall Street Journal article reads.
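To make the mechanics concrete, here is a minimal sketch, in Python with pandas and scikit-learn, of the kind of pipeline Barocas describes: fit a model to traits of past employees labeled by performance, then score new applicants against it. The column names, data and model choice are hypothetical illustrations, not any vendor’s actual system.

```python
# Minimal sketch of hiring-by-algorithm: train on past employees, score applicants.
# All column names and values are hypothetical, for illustration only.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# "Training data": traits of prior and current employees, labeled by whether
# they were rated high performers in past reviews.
employees = pd.DataFrame({
    "years_experience": [1, 4, 2, 7, 3, 5],
    "referral":         [0, 1, 0, 1, 1, 0],
    "commute_miles":    [18, 3, 25, 4, 6, 20],
    "high_performer":   [0, 1, 0, 1, 1, 0],  # label inherited from earlier decisions
})

features = ["years_experience", "referral", "commute_miles"]
model = LogisticRegression().fit(employees[features], employees["high_performer"])

# New applicants are scored against whatever patterns the past data encodes,
# including any bias baked into earlier hiring and promotion decisions.
applicants = pd.DataFrame({
    "years_experience": [2, 6],
    "referral":         [0, 1],
    "commute_miles":    [22, 5],
})
print(model.predict_proba(applicants[features])[:, 1])  # predicted odds of "high performance"
```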
The unspoken assumption behind incorporating this process into hiring is that hiring managers are overtly or covertly racist, misogynistic, or otherwise prejudiced (after all, if they weren’t, the process wouldn’t be necessary). But because prior hiring decisions could have been informed by prejudice, the training data reflects that prejudice. These data sets aren’t so much neutral analyses of employee performance as they are reflections of centuries of structural exploitation and disadvantage, Barocas argued.


He also cited an example in which Xerox tapped data scientists from Evolv Solutions to assist with hiring for its call centers. Evolv found that the single best predictor of employee tenure was distance from work. However, Evolv requested that Xerox drop this variable from its applicant considerations because it was so highly correlated with race.
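A simple check can surface that kind of proxy variable: measure how strongly a seemingly neutral feature, like commute distance, tracks a protected attribute. The sketch below, again in Python with hypothetical data and an arbitrary cutoff, only illustrates the idea; it is not how Evolv or Xerox actually did it.

```python
# Rough sketch of a proxy-variable check: how strongly does a "neutral" feature
# track a protected attribute? Data and threshold are hypothetical.
import pandas as pd

candidates = pd.DataFrame({
    "commute_miles":   [18, 3, 25, 4, 6, 20, 22, 5],
    "protected_group": [1, 0, 1, 0, 0, 1, 1, 0],  # illustrative only
})

corr = candidates["commute_miles"].corr(candidates["protected_group"])
print(f"correlation with protected attribute: {corr:.2f}")

# A strong correlation means the feature can stand in for the protected
# attribute, which is why Xerox dropped distance from work.
if abs(corr) > 0.5:
    print("commute_miles looks like a proxy; consider excluding it")
```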
If you have zero employees who are women, people of color, or people with disabilities, it’s impossible to evaluate their potential performance through machine learning, Barocas noted. Additionally, if such an employee were in the training pool, it’s possible they faced a hostile work environment and were passed over for promotions in spite of high performance, further muddying the training data. Barocas argued that the idea of big data is comforting because one assumes it collects all data, but in reality, only convenient data is being collected.


Seemingly rational business decisions can bear a striking resemblance to overt racism. Barocas noted how, just weeks earlier, Amazon was blasted by Bloomberg for not offering its premium Amazon Prime service in certain parts of various cities. Amazon maintained that it did this because there simply wasn’t enough customer density in those places for Prime to be financially viable. However, maps of Prime availability could easily be mistaken for historical redlining maps.


In the Bloomberg article, Craig Berman, Amazon’s VP of Global Communications, is quoted as saying, “Demographics play no role in it. Zero.” Sounds to me like your average post-Archie Bunker type deflecting criticism by insisting they don’t see race.
Barocas wondered whether corporations should go against their financial interests and try to correct for the lasting effects of historical injustice. If Amazon is any example, they may not until the government forces their hand.
