Blendoor is a merit-based matching and recruiting app that reduces unconscious bias in hiring by hiding applicants' names, pictures, and dates. Using data-driven metrics, Blendoor helps companies make better hiring decisions, and candidates get a window into each company's diversity and inclusion efforts. Stephanie has an engineering degree from Stanford and an MBA from MIT, so when she found herself struggling to land a software engineering job in Silicon Valley, she had to wonder: did being young make a difference? Being a woman? Being African-American? All three? She began researching diversity in tech hiring and found quantifiable variables in the perception of race, gender, and age, along with tech company hiring decisions that reflected those variables.
When she developed Blendoor, those unconscious barriers and variables were moved out of the way. Several companies with strong diversity and inclusion programs, such as Intel and Google, signed up to use her app before it launched, and high-profile companies such as Facebook and Apple have since come on board. Hiding photos and names on resumes gives applicants a safe zone where accomplishments can shine. For traditionally excluded populations, Blendoor can be the first step toward a seat at the table.
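Blendoor's actual implementation is not public, but the blind-screening idea it describes, stripping identity-revealing fields from a profile before reviewers see it, can be sketched in a few lines. Everything here (the `Candidate` fields, the opaque `candidate_id`) is an illustrative assumption, not Blendoor's real data model.

```python
# Illustrative sketch only -- not Blendoor's actual code or schema.
# Core idea of blind screening: omit name, photo, and dates from the
# view a reviewer sees, keeping only merit-related signals.

from dataclasses import dataclass
from typing import Optional, List


@dataclass
class Candidate:
    name: str
    photo_url: Optional[str]
    graduation_year: Optional[int]       # dates can signal age
    skills: List[str]
    experience_summary: str


def blind_profile(candidate: Candidate, candidate_id: str) -> dict:
    """Return an anonymized view keyed by an opaque ID instead of a name."""
    return {
        "id": candidate_id,                        # opaque ID replaces the name
        "skills": candidate.skills,                # merit signals are kept
        "experience": candidate.experience_summary,
        # name, photo_url, and graduation_year are deliberately omitted
    }


profile = blind_profile(
    Candidate("Jane Doe", "https://example.com/jane.jpg", 2014,
              ["Python", "distributed systems"], "5 years backend work"),
    candidate_id="cand-0042",
)
print(profile)
```

The design choice worth noting is that anonymization happens by omission at the point the view is constructed, so a reviewer-facing layer never receives the sensitive fields at all.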
Tech companies are using several other tools to deal with diversity, inclusion, and hiring bias. Psychologists believe that removing unconscious bias entirely is difficult, if not impossible; what they propose instead are practices that mitigate its effects in the workplace. GapJumpers is a startup that evaluates job skills through blind auditions rather than resume reviews. Applicants are given a skills test or a hands-on task to complete, and the audition is judged on the results of the work. For jobs and skills that can be self-taught, like coding, this gives applicants without degrees or formal experience a way to shine. Textio grew out of research on unconscious gender bias in the language of job descriptions. It uses analytics and a predictive model to assess the language in job postings, highlighting problem phrases and suggesting improvements. Companies using these tools report meeting their goals of hiring more qualified, diverse candidates.
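Textio's predictive models are proprietary, but the phrase-highlighting idea can be illustrated with a toy word-list scan. The word lists below are hypothetical samples chosen for illustration, not Textio's actual lexicon or methodology.

```python
# Illustrative sketch only -- a toy stand-in for Textio-style analysis.
# Flags words commonly described in job-ad research as gender-coded.
# Both word sets here are hypothetical examples, not a real lexicon.

MASCULINE_CODED = {"ninja", "rockstar", "dominant", "aggressive", "competitive"}
FEMININE_CODED = {"nurturing", "supportive", "collaborative"}


def flag_phrases(job_description: str) -> dict:
    """Return the coded words found in a job description, grouped by set."""
    # Normalize: lowercase and strip trailing punctuation from each word.
    words = {w.strip(".,!?").lower() for w in job_description.split()}
    return {
        "masculine_coded": sorted(words & MASCULINE_CODED),
        "feminine_coded": sorted(words & FEMININE_CODED),
    }


report = flag_phrases(
    "We need a coding ninja who thrives in a competitive, collaborative team."
)
print(report)
# {'masculine_coded': ['competitive', 'ninja'], 'feminine_coded': ['collaborative']}
```

A real system would go further, scoring whole phrases in context and proposing neutral rewrites, but even this simple scan shows why highlighting is tractable: the signal lives in specific, detectable word choices.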
The National Center for Women & Information Technology (NCWIT) has published some "Promising Practices" to help HR departments and hiring managers make decisions that reduce the impact of unconscious bias.