Implicit bias: What you can (and can’t) do about it
- Everyone has an identity that may face bias at some point.
- Implicit bias is just one of the hundreds of mental shortcuts our minds use to process information about people and the world.
- By developing awareness and self-management skills, we can better prevent our implicit biases from leaking out into our treatment of others.
As social psychologist Valerie Purdie Greenaway warns: We all have an identity that may face bias at some point.
Consider age. You are getting older. Depending on where you are in life, that may be a source of great relief or heart palpitations. Either way, it’s happening, and with each passing birthday your identity, and how people perceive it, shifts accordingly. If you work in a technical field, for example, you may find others regarding your abilities with growing skepticism. But if you’re a judge or philosopher, you may enjoy greater scholarly esteem in your golden years. Bias cuts both ways.
“Age is one of the most powerful parts of discrimination,” Greenaway told us in an interview. She added: “Many of us will gain weight. The person sitting next to you could be suffering from depression or major anxiety. Some of us live with chronic pain. These are aspects of people that we associate with not being able to work up to their potential.” These aspects also reveal an important fact: The bias you ignore today may be the one you face tomorrow.
The good news is that explicit bias against many groups is declining. It hasn’t disappeared: bias against some groups has held steady while bias against others is on the rise, and we aren’t living in a cosmopolitan utopia. But compared with historical levels, we’re witnessing progress.
More alarming is that most bias isn’t overt or intentional. It compounds through everyday encounters spurred by unconscious thoughts and subtle cues. For these reasons, we all have a stake in understanding how implicit bias works and what we can do about it.
What implicit bias is not
A common misunderstanding about implicit bias is that it’s an academic rebranding of racism. Another is that such biases only flow from an in-group to an out-group. Mike Pence made both mistakes in a 2016 debate when he questioned how a Black police officer could be unconsciously biased against a Black person.
Implicit bias is neither.
While race is a significant part of the conversation, Greenaway has already shared how implicit bias encompasses much more. Its impact radius includes not only racial identities but also class, religious, national, and cultural ones. The fallout extends further still, to identities that don’t find their way into political debates but are no less vulnerable.
Greenaway also notes that implicit biases aren’t unidirectional. We can form them against any identity, even ones we associate with or advocate for. That means they don’t just harm members of an out-group. They can disadvantage the very people we wish to support.
For example, studies of letters of recommendation for academic positions found that male candidates are praised more for their professionalism, while female candidates are praised more for their personal qualities. This distinction can harm women’s chances because university hiring committees are looking for intellectual accomplishment, not caring company.
That may sound like a case of men stereotyping women, but it’s not. Other studies have looked at the writers’ gender and found it didn’t make a difference. Both men and women demonstrated the same language bias, suggesting how ingrained these tendencies can be. Even as advocates, the writers unconsciously viewed the female candidates through the lens of traditional roles and values.
What is implicit bias, then?
Instead, implicit bias is one of the hundreds of mental shortcuts the human mind evolved to make sense of the flood of information we take in every moment of every day. Psychologist Daniel Kahneman’s seminal book Thinking, Fast and Slow christened these shortcuts “fast thinking.” (He technically classifies them as System 1, but we needn’t be so formal here.)
Fast thinking is involuntary and emotional. It leans on readily available information to reach a decision quickly, without much regard for accuracy or precision. It may not seem that way to us, though. Because the results of fast thinking feel so intuitive, we often overrate their veracity.
Now don’t get the wrong idea. Fast thinking isn’t necessarily bad. It exists because sometimes our brains need to decide without getting bogged down in the details.
Imagine someone leaps at you from behind a blind corner. You don’t stop for a thoughtful analysis of the current threat level. You instantly jump back. If the person meant you harm, then it’s fast thinking to the rescue. If the person was your childish friend pulling a prank, then you only suffer a slight embarrassment.
Problems arise, however, when we use fast thinking in situations that demand accuracy, an open mind, and rational decision-making. That combination leaves us open to critical errors in judgment, especially when encountering complex issues or new experiences. And our interactions with others often come with both.
In the mind of the beholder
To see fast thinking in action, philosopher Dan Yin uses the Müller-Lyer illusion. The illusion places two lines next to each other. The lines are of equal length but differ in one important way: One has arrowheads pointing inward at its ends, while the other has arrowheads pointing outward.
Despite being the same length, the line with the arrowheads pointing inward appears to be longer. Your mind fools you because of how your visual system evolved to assess lines and angles. The illusion’s odd pairing breaks the typical pattern, and your brain has difficulty processing the switch-up. In a way, we trick ourselves.
Yin likens this to the associations we unconsciously make through social cognition. As we encounter others, our minds search for links between traits and behaviors, then derive patterns from those links. When we meet someone who seems to fit a pattern, we assume the traits and behaviors carry over. Consider all the associations and assumptions you have about someone who shops at Wal-Mart. Now, Yin asks, do the same for someone who shops at Nordstrom.
We don’t even have to encounter these identities to form a bias. Our environments are awash with generalizations that we inherit through cultural osmosis. Consider all you know about people you’ve never met from places you’ve never been. And then think about how certain you were of that knowledge before you consciously questioned it. Culture readily provides a caricature in place of an experience.
“This implicit social cognition also illustrates perhaps one of the most unnerving and challenging aspects of cognitive vice. The reason these vices are so stubborn is that they hide in plain sight,” Yin wrote.
Thanks to neuroimaging, scientists can now peek behind that cranial curtain. And the research has revealed that our brains light up with activity as we unconsciously sort people into in-groups and out-groups.
Slow and steady now
The story so far may lead some to a disheartening conclusion: If implicit bias is human nature, there’s nothing to be done about it. Asking people to consciously govern the happenings of their unconscious minds is to demand the impossible. If anything, the research exonerates us from shouldering that responsibility.
Not quite.
True, you’ll never be free of implicit bias. It’s not a disease that can be cured or a sin that can be absolved by reciting the right prayer at corporate training. Your mind will always presume traits and behaviors about others. Uncomfortable thoughts will occasionally appear uninvited. And you’ll make mistakes even when trying to help.
But none of this means we’re without recourse. As Kahneman shows, fast thinking comes paired with a complement, slow thinking. It is everything fast thinking is not: voluntary, attention-demanding, data-driven, and slow to reach conclusions. The two form a dual nature, and when fast thinking would lead to an error in judgment, we can turn to slow thinking to boost our accuracy and precision.
Returning to the Müller-Lyer illusion, you can never intuit that the lines are the same length. Even if you know the trick, they always look different. But that doesn’t mean you must accept that conclusion. You can question your assumption, break out a ruler, and measure them. It takes more time and effort, but the results are more accurate.
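The ruler needn’t be physical. Below is a minimal sketch, assuming Python with matplotlib, that draws both configurations of the illusion from shafts of identical length and prints that length as confirmation. The helper name `draw_muller_lyer`, the fin length, and the figure size are illustrative choices, not part of any canonical rendering of the illusion.

```python
# A minimal sketch of the "measure it yourself" check described above,
# assuming Python with matplotlib is available.
import matplotlib.pyplot as plt

SHAFT_LENGTH = 4.0  # both horizontal shafts are exactly this long
FIN = 0.5           # how far each arrowhead fin extends

def draw_muller_lyer(ax, y, heads_inward):
    """Draw one shaft at height y.

    heads_inward=True  -> arrowheads point toward the center (>---<),
                          the version that tends to look longer.
    heads_inward=False -> arrowheads point away from the center (<--->),
                          the version that tends to look shorter.
    """
    x_left, x_right = 0.0, SHAFT_LENGTH
    ax.plot([x_left, x_right], [y, y], color="black", linewidth=2)
    for x_end, outward in ((x_left, -1), (x_right, +1)):
        # Fins extend beyond the shaft when the heads point inward,
        # and fold back over the shaft when the heads point outward.
        fin_dx = outward * FIN if heads_inward else -outward * FIN
        for fin_dy in (FIN, -FIN):
            ax.plot([x_end, x_end + fin_dx], [y, y + fin_dy],
                    color="black", linewidth=2)

fig, ax = plt.subplots(figsize=(5, 3))
draw_muller_lyer(ax, y=2.0, heads_inward=True)
draw_muller_lyer(ax, y=1.0, heads_inward=False)
ax.set_xlim(-1, SHAFT_LENGTH + 1)
ax.set_ylim(0, 3)
ax.set_aspect("equal")
ax.axis("off")

# The "ruler": both shafts were drawn with the same length.
print(f"Both shafts measure {SHAFT_LENGTH} units.")
plt.show()
```

Run it and the familiar effect appears: the shaft flanked by inward-pointing arrowheads still looks longer, even though the printed measurement is identical for both.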
Similarly, we can tap into slow thinking to recognize and address our bias triggers. This won’t prevent implicit biases from forming, but it does help us manage them so that our conscious behavior falls more in line with our values. It’s less a cure and more a treatment.
What you can do about implicit bias
Recognizing bias triggers is a form of emotional intelligence. To develop the skill, you must become mindful of your thoughts and emotions and learn to communicate them with awareness of yourself, others, and the context. The approach combines self-management and relationship management to reduce unintended consequences and cultivate goodwill.
Easier said than done? Definitely.
For decades now, businesses have invested billions in diversity training programs, many of which prove ineffective or even counterproductive. Part of the problem is that these programs aren’t standardized: they may be based on untested principles or measure outcomes using wildly different methods.
A meta-analysis of 260 studies found that most diversity training “fell short in demonstrating effectiveness.” But evidence from the programs that proved effective revealed a path forward. The best weren’t one-off sessions but long-term endeavors focused on developing skills and awareness.
As Greenaway said, “There is no right recipe for every company and every person. Different kinds of things are going to work. The first step is just awareness. Just being able to know that there’s a particular kind of situation in which you should think about bias.”
For example, writers of recommendation letters shouldn’t just consider what they value in the candidate (values that, while positive, may be biased). They should also consider the context they’re writing for and draft with it in mind. Such an approach encompasses skill, awareness, and, just to be safe, a second set of eyes to suss out unanticipated subtext.
Greenaway also recommends learning to recognize personal and situational triggers. Like emotional outbursts, unconscious biases are best mitigated by developing self-awareness of those moments when bias is likely to leak into conscious behavior. Are you tired, upset, stressed, or otherwise cognitively depleted? How might this influence your ability to make a decision? Once you can recognize those moments, you can better invest in the skills to reframe your mindset and behavior.
Of course, cognitive depletion will happen. That’s unavoidable. But simply being aware of how it exacerbates bias helps you slow your thinking and sharpen your attention in those critical moments. It also helps you mend relationships when mistakes occur. (Difficult conversations are also unavoidable; being able to work through them strengthens any relationship.)
Is it that simple? Of course not. Learning to address your implicit biases is a lifelong journey. You’ll never reach a state of social enlightenment. But if enough of us keep an open mind, a willingness to learn, and a willingness to forgive, we can do better.
Learn more on Big Think+
With a diverse library of lessons from the world’s biggest thinkers, Big Think+ helps businesses get smarter, faster. To access Valerie Purdie Greenaway’s full class for your organization, request a demo.