Using Technology to Reduce Bias in Recruiting

Technology can help ensure that your recruitment processes don’t discriminate.
By Ji-A Min | August 29, 2017 • 4 min read

A recent Deloitte survey finds that employers’ workplace diversity priorities are centered on recruiting and on new tools that reduce bias in the process. Research suggests this focus is well placed: up to 40 percent of visible-minority job seekers “whiten” their resumes by using an Anglicized version of their name or removing their participation in multicultural organizations. These strategies are attempts to avoid triggering unconscious biases, the automatic mental shortcuts everyone uses to process information and make decisions quickly.

Unconscious bias is widely accepted as an inherently human trait—and many believe stopping it requires a non-human solution. That’s where recruitment tech such as artificial intelligence comes in.

Much of the attention paid to these new recruitment technologies focuses on whether they can actually reduce recruiting bias.

Technology that recruiters can use to reduce bias includes that old standby, the applicant tracking system (ATS), along with newer tools: software that “de-biases” job postings, AI that avoids unconscious bias in candidate screening, and technology that “blinds” applications.

Your ATS

It’s estimated that 90 percent of large companies and more than 50 percent of small and mid-sized companies use an ATS.

The Office of Federal Contract Compliance Programs oversees federal contractors and subcontractors to ensure they comply with affirmative action requirements when they recruit, which includes collecting and storing anonymized data on the race and sex of their applicants.
The easiest way to collect this information is through your ATS as part of the application process. Employers not required to practice affirmative action can still collect applicant demographic data, as long as candidates self-identify voluntarily.

Using an ATS creates a built-in compliance mechanism that can measure the diversity, or lack thereof, of the candidates your company is attracting.
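As a sketch of that measurement step, the snippet below tallies voluntary self-identification data from a hypothetical ATS export. The field names and categories are illustrative assumptions, not a real ATS schema.

```python
from collections import Counter

# Hypothetical ATS export: one dict per applicant, with an optional
# voluntary self-identification field. Field names are illustrative.
applicants = [
    {"id": 1, "self_id_gender": "woman"},
    {"id": 2, "self_id_gender": "man"},
    {"id": 3, "self_id_gender": "man"},
    {"id": 4},  # declined to self-identify
]

def gender_breakdown(records):
    """Count applicants by voluntary self-identified gender."""
    return Counter(r.get("self_id_gender", "undisclosed") for r in records)

print(gender_breakdown(applicants))
```

Running the same tally per job opening, or per recruiting source, shows at a glance where the applicant pool is falling short of the diversity you want to attract.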

Software that “de-biases” job postings

Job postings are gaining attention as a tool for, or hindrance to, attracting qualified job candidates. A carelessly worded posting can turn off otherwise interested candidates. For example, research finds that postings that use too many masculine-coded words, such as “aggressive” or “challenging,” dissuade female candidates from applying.

Using sentiment analysis, software can de-bias a job posting by identifying exclusionary language and suggesting alternatives that appeal to a more diverse pool of candidates.
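A minimal sketch of the flagging step is below. The four-word lexicon is a made-up illustration; real de-biasing tools draw on research-derived lists with hundreds of entries and weigh context, not just word matches.

```python
import re

# Hypothetical mini-lexicon of masculine-coded words and possible
# neutral alternatives; production tools use far larger lists.
SUGGESTIONS = {
    "aggressive": "motivated",
    "dominant": "leading",
    "challenging": "rewarding",
    "ninja": "expert",
}

def debias_posting(text: str) -> list:
    """Return (flagged word, suggested alternative) pairs found in text."""
    findings = []
    for word, alternative in SUGGESTIONS.items():
        if re.search(rf"\b{word}\b", text, flags=re.IGNORECASE):
            findings.append((word, alternative))
    return findings

posting = "We need an aggressive self-starter for a challenging role."
print(debias_posting(posting))
```

The recruiter still makes the final wording choice; the software simply surfaces language that research links to a narrower applicant pool.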

AI that avoids unconscious bias during screening

While screening hundreds of resumes is mind-numbing for human recruiters, it’s exactly the kind of pattern-matching task AI was designed for.

Although there’s been debate over whether AI reduces or reinforces human bias, there are two promising avenues for AI to reduce bias in recruiting. First, resume-screening software that uses AI can be programmed to ignore demographic signals, such as a candidate’s (implied) gender, race and age, during its learning and decision-making process.
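As an illustration of that first approach, a screening pipeline might drop demographic fields before the model ever sees a candidate record. This is only a sketch under assumed field names, not any vendor’s actual implementation.

```python
# Fields that imply protected attributes; names are hypothetical.
PROTECTED_FIELDS = {"name", "photo_url", "date_of_birth", "gender"}

def redact(candidate: dict) -> dict:
    """Return a copy of the record with demographic fields removed,
    so a downstream screening model cannot learn from them."""
    return {k: v for k, v in candidate.items() if k not in PROTECTED_FIELDS}

candidate = {
    "name": "Jane Doe",
    "gender": "F",
    "skills": ["python", "sql"],
    "years_experience": 6,
}
print(redact(candidate))  # only skills and years_experience remain
```

Note that removing explicit fields is necessary but not sufficient: proxies such as school names or zip codes can still leak demographic information, which is why the second avenue below also matters.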

Second, if the AI does start to demonstrate some type of bias based on candidate demographics, its screening model can be course-corrected to remove it. While this recruiting technology is fairly new, research suggests it has the potential to reduce bias in recruiting by 75 percent to 160 percent.

Technology that “blinds” applications

Blind hiring is any technique that anonymizes, or “blinds,” demographic information about a candidate that could trigger bias in the recruiter or hiring manager.

Blind hiring originated with orchestras, which began having musicians audition behind a screen to hide their gender. This policy increased the proportion of female musicians from less than 5 percent in the 1970s to 25 percent in the 1990s.

Today’s blind hiring practices include removing personal information from candidates’ profiles and resumes, such as their names and photos. Employers such as Deloitte UK and the Government of Canada, for example, are testing out blind resumes by removing candidate names before submitting them to hiring managers. Other blind hiring technologies include anonymized job tests and assessments and software that obscures candidates’ gender during interviews by distorting their voices.
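A toy version of resume blinding is sketched below, assuming the candidate’s name is already known from a structured application field. Production tools go much further, also stripping photos, addresses and graduation dates.

```python
import re

def blind_resume(text: str, candidate_name: str) -> str:
    """Replace the candidate's name and any email addresses with
    placeholders before the resume reaches the hiring manager."""
    blinded = re.sub(re.escape(candidate_name), "[CANDIDATE]",
                     text, flags=re.IGNORECASE)
    blinded = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[EMAIL]", blinded)
    return blinded

resume = "Jane Doe\njane.doe@example.com\nSenior analyst with 6 years of SQL."
print(blind_resume(resume, "Jane Doe"))
```

The hiring manager then evaluates “[CANDIDATE]” on skills and experience alone, mirroring the orchestra screen in software form.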

The most important component for a bias-free recruiting process is, of course, having an organizational culture that encourages leaders and managers to recognize their own unconscious biases and foster an inclusive environment. As the above examples illustrate, however, technology can and should play a critical role in ensuring you’ll hire the best people regardless of their race, gender or ethnicity.

Ji-A Min is head data scientist at Ideal, which builds recruitment automation software.