Aug 7, 2017
This article is part of a series called Editor's Pick.

I was watching a rerun the other day (OK, it was The Office!), and Dwight was quizzing Ryan on those old-school brain teasers (e.g., a man builds a house with all four sides facing south; a bear walks by; what color is the bear?). If you've seen the episode, you know Ryan had all the answers, and Dwight was furious.

And if you were ever a kid, you likely know the answers to most of these teasers, but one jarred my memory and ticked me off.

A father and son are in a car accident. The father dies, and the son is taken to the hospital. The doctor comes in and exclaims, “I can’t operate on this boy.”

“Why not?” the nurse asks.

“Because he’s my son,” the doctor responds. How is this possible?

How is this possible??!! Well, the doctor is, in fact, his mother, a woman.

Why is this a brain teaser? Today, it seems shocking that people wouldn't immediately guess the answer. But the truth is, people tend to associate doctors with men, despite the fact that roughly 35% of physicians in the U.S. are women. That this brain teaser even exists is one clear example of how deeply bias is ingrained in us, particularly when it comes to gender and diversity.

 

Can We Eliminate Unconscious Bias?

Every day, we make tons of decisions, assumptions, and judgments about people without realizing it. Based on our unique worldview, our unconscious bias fills in the blanks of what we don't know about others. This kind of bias isn't hateful or wrong; we're often not even aware of it. But it's hard to rewire.

I recently dove into this topic with Shon Burton, the co-founder & CEO of HiringSolved, a company that applies AI-based technology to automate the process of matching candidates to jobs. Burton’s company can help businesses solve a lot of problems, but one of its most common use cases is to create more diverse pipelines by removing human bias from sourcing.

FYI, removing bias isn't a new concept. Burton brought up the example of the Boston Symphony Orchestra. Long story short: In 1970, women made up only 6% of the musicians in the nation's five highest-ranked orchestras. To try to overcome gender-biased hiring, a majority of symphony orchestras revised their traditional hiring practices, opening auditions to a range of candidates instead of only hiring musicians handpicked by the conductor. Some also adopted "blind" auditions, where musicians played behind a curtain, concealing their gender. After these practices were implemented, the share of female musicians in the top five orchestras had risen to 21% by 1993.

I’m going to go out on a limb and say next year’s recruiting trend won’t be interviewing people from behind a curtain. But these orchestras were able to level the playing field, ensuring every musician had an equal chance based on their quality of play.

 

Are We Introducing Bias by Removing Unconscious Bias?

The one caveat to the story above is this: To address the gender imbalance, the orchestras eliminated the unconscious bias that was limiting their exposure to women.

That’s where the comparison deviates from how many organizations approach diversity hiring.

As Burton puts it: “When we actively seek out more women for a role, that is inherently a conscious bias. We’re handpicking and pinpointing them for a particular job, and most people don’t seem bothered by that. But if we said we wanted to hire white men for the same role, it would be distasteful.”

Hold off on throwing your tomatoes.

Burton isn't arguing that white men are discriminated against. Think about it: Are diversity hiring initiatives leveling the playing field, or are they targeting specific groups of people over others, an inherently biased thing to do? Maybe they're doing both. But the question we need to ask, as a society and as employers, is whether that's the best thing for companies and the people they hire.

“We should be optimizing hiring to understand and potentially remove biases, ones we’re aware of and aren’t,” Burton says. “It’s certainly within the capability of technology to remove bias all the way through someone’s first day of hire. But the question is: Does that make sense? Is it going to help you hire the best people who work well on your current team? And is it going to strip out the value of human intuition and cognition?”

 

The Tech Factor: Man With Machine

I recently read this in a Science Magazine article: “AI (artificial intelligence) is an extension of our existing culture.”

Technology is the product of humans. And AI is simply a technology that's been created and trained by humans so that it can analyze data in ways, and at a rate, that humans aren't capable of. Yes, it can do so without some of the bias humans show. But that technology is still influenced by humans.

“It’s tricky with technology because AI learns from our choices to some degree,” Burton says. “If you have a lot of bias in your recruiting and hiring processes, and you push that data through a neural network or AI-based system, then you’re using data that are inherently biased. As a result, you’ll simply be training AI to be biased.”
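
To make Burton's point concrete, here is a minimal synthetic sketch (not HiringSolved's system; the data, thresholds, and toy "model" are invented for illustration) of how learning from biased historical screening decisions simply reproduces that bias:

```python
# A synthetic sketch: a toy "model" trained on biased past screening decisions.
# Everything here is invented for illustration.
import random

random.seed(0)

def make_history(n=10_000):
    """Simulate past decisions where equally skilled women faced a higher bar."""
    rows = []
    for _ in range(n):
        gender = random.choice(["M", "F"])
        skill = random.random()                    # same skill distribution for everyone
        threshold = 0.5 if gender == "M" else 0.7  # the historical bias
        rows.append((gender, skill, skill > threshold))
    return rows

history = make_history()

# "Training" the simplest possible model: per-group advance rates,
# reused as predicted probabilities for future candidates.
learned = {
    g: sum(adv for gender, _, adv in history if gender == g)
       / sum(1 for gender, _, _ in history if gender == g)
    for g in ("M", "F")
}

print(learned)  # roughly {'M': 0.50, 'F': 0.30} -- the bias now lives in the model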

Burton uses a simple analogy to describe our relationship with AI: "People can do good with a hammer. But they can also wreak havoc. It's the same with technology, specifically AI. It's up to us how we use it."

 

Increasing Accuracy at the Top of the Funnel

So how can we leverage AI and other technologies for good, removing bias in the screening process to ensure companies have access to the most diverse talent?

It starts with applying technology in the right ways, at the right time.

“The beauty of this technology is that it can pick up on social signals and other data that are often missed or misconstrued on resumes,” Burton says. “This allows us to use machines to populate the top of the funnel with the right people. And because this process is inherently unbiased, you’re more likely to end up with a more diverse group of candidates who may not have made it through the screening process before.”

Case in point: Say a recruiting team is handed an initiative to hire 2,000 people in 'X' role by 'X' date, with a focus on hiring women. Technologies could work together to make intelligent guesses about which candidates might be women, stack-rank them into a pipeline, and enable your team to filter those candidates based on skills, experience, and so on. This won't eliminate the profiles or resumes of qualified people, but it can bring more women into the top of the funnel with higher accuracy and without bias or compliance issues.
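
As a rough sketch of that workflow, here is a hypothetical example; the field names, scoring rule, and inferred likely_woman flag are assumptions for this scenario, not any vendor's actual logic. The point is that everyone stays in the pool and is ranked on fit; the inferred signal only affects who gets surfaced at the top of the funnel:

```python
# Hypothetical top-of-funnel ranking sketch; all fields and weights are invented.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    skills: set
    years_experience: int
    likely_woman: bool  # a probabilistic inference in the scenario above

ROLE_SKILLS = {"python", "sql"}

def match_score(c: Candidate) -> float:
    # Rank purely on fit; nobody is removed from the pool.
    skill_overlap = len(c.skills & ROLE_SKILLS) / len(ROLE_SKILLS)
    return skill_overlap + min(c.years_experience, 10) / 10

candidates = [
    Candidate("A", {"python", "sql"}, 6, likely_woman=True),
    Candidate("B", {"python"}, 9, likely_woman=False),
    Candidate("C", {"sql", "java"}, 3, likely_woman=True),
]

# Stack-rank on fit, surfacing likely-women candidates first only when fit
# scores are tied -- broadening who gets seen, not excluding anyone.
pipeline = sorted(candidates, key=lambda c: (match_score(c), c.likely_woman), reverse=True)
for c in pipeline:
    print(c.name, round(match_score(c), 2), c.likely_woman)
```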

Now, it's important to point out that this process isn't consciously removing qualified men from the role. Instead, it's giving qualified women, and anyone else bias might have screened out before, a chance to be seen.

As Burton so eloquently puts it: “Technology enables transparency in a way that can’t easily be undone. Everything gets laid bare.”

 

Bigger Than Buzz: What Are You Trying to Solve?

AI alone is not going to end unconscious bias; it will help you equalize candidates and get smarter about your top-of-funnel sourcing methods. But we can't expect technology to remove all of our inherent biases: they're in the channels we use, the way we talk about our employer brand, the way we write job descriptions, the way we think about job and gender roles, and the way we seek to work with people like us. It's a systemic change, one that technology can advance.

This is where Burton firmly believes that talent acquisition teams need more than just AI to drive the quality and diversity of their talent pipelines. While AI can improve the diversity of candidate pools, intelligent automation (IA) is the key to nurturing relationships over time. IA helps you identify further patterns in candidate behavior: which content and messaging candidates interact with, which channels they act on, which career site pages they stay on the longest, and which job titles convert more of a particular type of candidate audience. This kind of behavioral data is integral to continuing to make objective, unbiased inferences about the talent you want to hire and the talent you might not yet know you want to hire.
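
As an illustrative sketch of what rolling those behavioral signals up into a nurture score might look like, the event names and weights below are invented for this example; a real system would learn them from data:

```python
# Illustrative aggregation of behavioral signals into a per-candidate score.
from collections import defaultdict

# Hypothetical weights for the kinds of signals mentioned above.
WEIGHTS = {
    "opened_message": 1.0,
    "clicked_channel_link": 2.0,
    "career_page_seconds": 0.01,      # dwell time, scaled down
    "applied_to_similar_title": 5.0,
}

# (candidate_id, signal, value) events pulled from outreach and site analytics.
events = [
    ("cand_1", "opened_message", 3),
    ("cand_1", "career_page_seconds", 240),
    ("cand_2", "clicked_channel_link", 1),
    ("cand_2", "applied_to_similar_title", 1),
]

scores = defaultdict(float)
for candidate_id, signal, value in events:
    scores[candidate_id] += WEIGHTS.get(signal, 0.0) * value

# Higher scores suggest whom to nurture next and which content is working.
print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))
```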

“We believe social and behavioral signals (the things we put on the web every day, where we are, who we know, what we like, what we don’t know, what we interact with) are, in aggregate, more predictive of an ability to do a certain job than a resume,” Burton says. “That’s the real power of AI and analytics in recruiting. When you can paint this complete picture, it’s a far better predictor of whether they’re going to be the right fit.”
