
Change your mind with these gateway drugs to intellectual humility

Intellectual humility demands that we examine our motivations for holding certain beliefs.
Key Takeaways
  • There are more than 180 cognitive biases that affect our judgments and beliefs, one of the most pernicious being the notion that we understand a topic better than we actually do. 
  • We cannot escape our cognitive biases through education. In fact, the more intelligent you are and the more education you gain, the better you become at rationalizing and justifying your existing beliefs — even if they’re wrong.
  • If you want to change your mind about all the things you are wrong about, you must first consider what might be motivating you to remain blissfully unaware of your wrongness. 

While working on my most recent book, How Minds Change, I learned a lot of things that required me to unlearn a lot of other things before I could add the new things I learned to the collection of things I thought I knew for sure.


For instance, one thing I learned was that the 1938 radio broadcast of The War of the Worlds never led to any sort of mass panic. Rumors of such a panic had spread via newspaper think pieces about how getting news from anywhere other than newspapers was a bad idea. I also learned you can’t boil a live frog by gradually raising the temperature of the water. Turns out they jump right out once they become uncomfortable. Oh, and lemmings don’t march off cliffs because they blindly follow each other in single file. That one has been a nugget of popular, but untrue, folklore since the 1800s, long before both the 1990s video game that perpetuated the myth with its whimsical gameplay and the 1950s Disney documentary that did the same by tossing an unsettling number of real lemmings off a cliff.

In each case, right up until the moment I received evidence to the contrary, all this misinformation, these supposed facts, felt true to me. I had believed them for decades, and I had accepted them, in part, because they seemed to confirm all sorts of other ideas and opinions floating around in my mind (plus they would have been great ways to illustrate complicated concepts if not for the pesky fact that they were, in fact, not facts). 

That’s one of the reasons why common misconceptions and false beliefs like these spread from conversation to conversation and survive from generation to generation to become anecdotal currency in our marketplace of ideas. They confirm our assumptions and validate our opinions and thus raise few skeptical alarms. They make sense, and they help us make sense of other things, and as Carl Jung once wrote, “The pendulum of the mind oscillates between sense and nonsense, not between right and wrong.” 

Well, I used to believe he once wrote that. I’ve certainly shared that bit of wisdom many times thinking he had. But, while writing this paragraph I discovered he, in fact, did not. It turns out I was wrong about that as well. Which brings me to the topic at hand. 

The gateway drugs to intellectual humility

A few years ago the great science writer Will Storr shared with me a powerful thinking exercise, one I’ve been passing forward ever since. I’d like to share it with you now. It’s very simple, just two questions. 

First, ask yourself: Do you think you are right about everything? 

If your answer is “yes,” then perhaps you should consider a career in politics, but if your answer is “no,” now ask yourself this second, more crucial question: If you aren’t right about everything, then what, exactly, are you wrong about?

This second question should produce a nice, long, “Um, well…” pause followed by a lengthy and uncomfortable shrug. Consider the vast network of neurons in your skull devoted to topics like the Revolutionary War, Raiders of the Lost Ark, Rembrandt, rhinoceroses, red velvet cake — if you plunged into a nice, long internet deep dive on any one of them, what do you suppose the odds are that you’d discover at least a few of your beliefs, a smattering of your certainties, a tincture of the truths you’ve harbored for years, were, in fact, not facts?

I am forever grateful to Will for those two questions — Are you right about everything? If not, then what are you wrong about? — because not only are they a great way to introduce the concept of intellectual humility without misquoting any famous scientists, but answering them encourages the very virtue they introduce. 

If you sit with the icky feeling of not knowing what you are wrong about, a series of serious questions should begin to bubble up. Things like: What’s keeping all that misinformation in your head alive? How much does being right matter to you? If it matters more than “not at all,” then what are you doing, or not doing, that’s preventing you from discovering your wrongness? And in the areas where being wrong matters the most — like your health, the health of your planet, your relationships, your income, and your vote — what should you be doing, or not doing, to open your mind to change?

Sideswiped from our blind spots

Questions like the ones Will’s thought experiment encourages are the gateway drugs to true intellectual humility. That’s the term psychologists use to describe the degree to which you recognize, accept, and willingly admit to the limitations of your cognitive abilities. To be intellectually humble is to embrace the likelihood that on any topic, big or small, you might be wrong about some, or all, of the things you believe, feel, and assume thanks to an assortment of biases, fallacies, and heuristics that sometimes serve to maintain your misconceptions.

And intellectual humility requires an understanding that the word “wrong” can mean many things. Acknowledging the possibility of your wrongness could mean admitting that the beliefs you hold with high certainty could be false, or that the attitudes you currently hold might be founded on poor or incomplete evidence, or that the opinions you routinely share could be biased and very well might change if someone were to present you with good arguments to the contrary.

Complicating matters is the fact that we often feel like we are well aware of all this, that we know the limitations of our knowledge and the fallibility of our comprehension, but the research into intellectual humility reveals we are usually wrong about that as well. Though we may think of ourselves as open to new ideas and perspectives and conscious of our individual levels of ignorance from subject to subject, we tend to approach most situations with an undeserved overconfidence in our understanding.

For instance, in a study by Leonid Rozenblit and Frank Keil, researchers asked subjects to rate how well they understood the mechanics of everyday things like zippers, toilets, and locks. People usually rated themselves as having a pretty good grasp of how such things worked, but when asked to provide detailed, step-by-step explanations, most couldn’t, and that came as a surprise to them.

Psychologists call this the illusion of explanatory depth, the belief you understand something better than you truly do. It’s a cognitive bias, one of more than 180, each reliably skewing your perceptions and affecting your judgments from moment to moment. This one in particular leaves you overconfident in your understanding of most things and thus unmotivated to truly understand them — until one day the toilet won’t flush or your zipper won’t zip. 

Later studies have revealed the illusion extends well beyond bicycles and helicopters and coffee makers. For instance, when researchers asked for people’s opinions on topics like healthcare reform or carbon taxes, people tended to produce strong, emotionally charged positions. But when asked to explain those issues in detail, most realized they had only a basic grasp, and as a result, their certainty dipped and their opinions became less extreme.

Studies like these reveal we have a rather complicated relationship with our own understanding. We tend to discover our incomprehension by surprise, sideswiped from our blind spots because we were unaware those blind spots existed. For the most part, that’s because we rarely go looking for evidence of our ignorance unless motivated to do so, especially when we feel like we have a pretty good grasp of what is and isn’t so. 

The backrooms of our minds

Ok, so you want to be less wrong. You want to change your own mind. How, exactly, does one go about doing that?

I’d love to tell you that you should just go read a bunch of books and watch a lot of documentaries and earn a few degrees, but there’s no escaping your biases, fallacies, and heuristics. The research is pretty clear on this: The more intelligent you are and the more education you gain, the better you become at rationalizing and justifying your existing beliefs and attitudes regardless of their accuracy or harmfulness. 

A great example of this comes from the work of psychologist Dan Kahan. He once brought together more than 1,000 subjects, asked them about their political dispositions, tested their math skills, and then presented them with a fake study into the effectiveness of a new rash-reducing skin cream. Their challenge? Determine if the cream worked after two weeks of use. 

Subjects looked at a table full of numbers showing who got better and who got worse. The top row showed patients who used the cream, the bottom those who didn’t. The catch was that more people had used the cream than did not, so the number of people who got better after two weeks was higher in that group simply because there were more of them to count. The researchers had made it easy to get the wrong answer if you just looked at the numbers and made a snap judgment. But if you knew how to calculate percentages, and took a moment to work out the math, you’d find that 75% of the cream group got better while 84% of the no-cream group did. So, not only did the cream not work, it may have made the rash worse.
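To see why the snap judgment misleads, here’s a minimal sketch in Python. The counts are hypothetical, chosen only so they reproduce the 75% and 84% figures above; they are not Kahan’s exact data. The cream group is larger, so it contains more improvers in absolute terms even though a smaller share of it improved.

```python
# Hypothetical counts chosen only to reproduce the 75% and 84% figures
# quoted above; these are not Kahan's exact data.
cream = {"improved": 223, "worsened": 75}     # patients who used the cream
no_cream = {"improved": 107, "worsened": 21}  # patients who did not

def improvement_rate(group):
    """Share of a group whose rash got better."""
    return group["improved"] / (group["improved"] + group["worsened"])

# The snap judgment: compare raw counts of improvers.
print(cream["improved"] > no_cream["improved"])      # True -- cream group has more improvers

# The extra step: compare rates, which account for group size.
print(f"cream:    {improvement_rate(cream):.0%}")     # 75%
print(f"no cream: {improvement_rate(no_cream):.0%}")  # 84%
```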

Unsurprisingly, the better people were at math, regardless of their political dispositions, the more likely they were to take the extra step and calculate the percentages instead of going with their guts. And if they took that step, they were less likely to walk away with an incorrect belief. Kahan’s team then repeated the study so that the numbers showed the cream worked, and once again, the better people were at math, the more likely they were to arrive at the correct answer.

But here’s the twist. When researchers relabeled the exact same numbers as the results of a study into the effectiveness of gun control, the better some people were at math, the more likely they were to make mathematical mistakes. If the results showed gun control was effective, a conservative with good math skills was more likely to get the wrong answer; if the results showed gun control was ineffective, a liberal with good math skills was more likely to get the wrong answer.

Why? Because people didn’t take the extra step when they intuited that doing so would lead them to evidence that challenged their beliefs.

However, when the numbers were reversed so the results showed conservative subjects that gun control was ineffective and liberal subjects that it was effective, math skills snapped back into place and predicted subjects’ performance just as they had when those numbers described the effectiveness of a skin cream.

Kahan had found that the better you are with numbers, the better you are at manipulating them to protect your beliefs, even if those numbers suggest those beliefs are false. 

And here’s the kicker: None of the subjects had any idea they were doing this. In psychology, this is called motivated reasoning, and Kahan’s study is a single pebble on an enormous mountain of evidence not just for how powerful a force it can be, but how it can operate in secret within the backrooms of our minds.

Thinking about thinking about thinking

The big takeaway here is that if you want to embrace intellectual humility, if you want to change your mind about all the things you are wrong about, you must first consider what might be motivating you to remain blissfully unaware of your wrongness. 

While writing How Minds Change, I traveled all over the world to meet experts and activists who had developed various persuasion techniques for changing other people’s minds. Some were being studied by scientists, some were being used by therapists, others were being used out on the streets to change laws by knocking on doors and having conversations. 

I discovered the people who had developed the best persuasion techniques — approaches like deep canvassing, street epistemology, and motivational interviewing — had all learned to avoid fact-based arguing and rhetorical attempts to defeat their opponents through debate. Instead, they each used something I like to call guided metacognition. They avoided focusing on a person’s conclusions and instead focused on the processes that person was using to arrive at those conclusions — their logic, their motivations, their justifications, and so on.

The good news is that if you want to change your own mind you can direct that kind of focus inward as well. 

How? Make your claims, state your opinions, express your attitudes — but then ask yourself just how certain, how confident, and how strongly you feel. Put a number on that certainty. One to ten, zero to 100. Now ask yourself: Why that number? Why not higher? Why not lower? And most importantly, ask yourself what reasons you are using to justify that level of confidence. Do they seem like good reasons? How would you know if they weren’t? And if you discovered they weren’t good reasons, would that change anything? 
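If it helps to make the exercise concrete, here’s a minimal sketch in Python of that self-audit. The prompts paraphrase the questions above; the script itself is just an illustration of mine, not anything from the book.

```python
# A self-audit prompt loop paraphrasing the questions above.
# Illustrative only; the wording and structure are assumptions, not the author's.
QUESTIONS = [
    "Why that number? Why not higher? Why not lower?",
    "What reasons justify that level of confidence?",
    "Do they seem like good reasons? How would you know if they weren't?",
    "If they weren't good reasons, would that change anything?",
]

def self_audit() -> None:
    claim = input("State a claim, opinion, or attitude: ")
    confidence = input("How confident are you, 0 to 100? ")
    print(f'\nYou rated "{claim}" at {confidence}/100.\n')
    for question in QUESTIONS:
        print(question)
        input("> ")  # pause for an honest answer before moving on

if __name__ == "__main__":
    self_audit()
```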

Once you start thinking about your own thinking and begin to recognize what contributes to your certainty or lack thereof, it’s difficult not to change your mind.

Remember, the research suggests all judgment, decision-making, information processing, and memory encoding are motivated by something, some drive or goal. Intellectual humility requires that we stay vigilant for when that motivation might be to reach some desired conclusion, one that avoids a threat to our beliefs, our well-being, our identity, or all three. In other words, recognize that you can always find a justification to eat the cake when you ought to eat an apple. And you can always find a rationalization for your mistakes when you ought to apologize instead.

None of this is to say you shouldn’t pursue as much education and edification as possible; it’s just that intellectual humility demands you pair those pursuits with an awareness of your propensity for motivated reasoning. Before you can truly embark on a journey of self-discovery, you’ll need to know what parts of yourself you currently consider off-limits to change. As John Steinbeck once wrote (and I checked this one; he really did), “Sometimes a man wants to be stupid if it lets him do a thing his cleverness forbids.”

