The New York Times recently published an interesting article about when algorithms discriminate, followed by an interview with a researcher on the subject. In the interview, Cynthia Dwork mentions two striking examples of algorithmic discrimination: in one study, women were shown an ad for career coaching for jobs paying over $200,000 far less often than men; and, hypothetically, a university that was originally segregated might discriminate against minority applicants if it used an algorithm designed from historical data.

Farolito, and our soon-to-launch US version, StellarEmploy, use an algorithm that learns over time what types of applicants will be successful for different clients. In our US launch in particular, we have been very mindful of the downsides of algorithms, given some high-profile cases.

However, algorithms are more useful for reducing discrimination than for exacerbating it. We’ve seen this in Farolito’s work in Latin America. Farolito’s operations provide a unique comparison because anti-discrimination laws there are enforced more weakly. It’s not unusual for job descriptions to specify the desired age range, marital status, and gender for a position. While Farolito does not reject applicants based on discriminatory characteristics, we have worked with companies that have a history of doing so.

In one case, we had a client who specified a preference for salespeople who were 18 to 23, and told us they did not hire single mothers. Turnover was very high: 52% of all their hires failed to complete 3 months on the job, which meant that nearly every position was filled twice in any given calendar year.

When we began evaluating all applicants with our proprietary evaluation and filter, we found that if our client hired applicants based on their psychometric profile and skillset instead of personal characteristics, they could increase the percentage of workers who stayed at least 3 months from 48% to 80%. In fact, the client could realize those improvements in their workforce simply by hiring differently from the pool of applicants they already received.
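A quick back-of-the-envelope check makes these turnover numbers concrete. This is my own illustrative sketch, not the client's actual model: assume each hire either quits at the 3-month mark or stays the rest of the year, so a seat is re-filled at months 3, 6, and 9 only if every prior hire quit.

```python
def hires_per_seat_per_year(p_quit):
    """Expected hires needed to keep one seat filled for a year,
    under a crude quarterly model: a replacement is hired at months
    3, 6, and 9 only if every previous hire quit within 3 months."""
    return sum(p_quit ** k for k in range(4))

# Before: 52% of hires quit within 3 months.
before = hires_per_seat_per_year(0.52)
# After: retention rises to 80%, so only 20% quit within 3 months.
after = hires_per_seat_per_year(0.20)

print(round(before, 2), round(after, 2))  # 1.93 1.25
```

The "before" figure of roughly 1.9 hires per seat per year matches the observation that nearly every position was filled twice in a calendar year; the same model suggests the improved retention would cut hiring volume by about a third.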

Why would a company fall into an outwardly discriminatory hiring process? It is most likely due to two behavioral biases: salience and vividness. People incorrectly attribute an outcome to a characteristic simply because that characteristic stands out (salience) and is easy to remember (vividness). In the case of our client, it was easy to remember the one single mother who was unsuccessful in the position, because having a child was a memorable characteristic that our client already suspected was correlated with lack of success. Because the client had fewer employees older than 23, it was also easy to remember the one older worker who did not stay on the job for long. Those employees were actually leaving because they were not a good fit for the job, but the client misattributed the reason for their departure to those employees’ salient characteristics.

In reality, neither age nor having a child was correlated with success on the job. By setting aside discriminatory requirements and replacing them with a filter that evaluated the characteristics that actually matter, the employer created job opportunities for people who had been victims of discrimination in the past.

Algorithms, when implemented properly, can create opportunities for people who would not otherwise have them.


By Sara Nadel, CEO of StellarEmploy