We Fear First and Think Second, and We Fear More and Think Less
Funny thing about fear: by the time you feel it, your body is already busy keeping you safe. Your brain is first and foremost a survival machine, and at the slightest hint of danger, before the thinking, conscious part of the brain even gets involved, it instinctively sets off a subconscious Fight or Flight response. That triggers changes to your heart rate, blood pressure, and body chemistry, and when your brain senses those changes, it finally dawns on your consciousness: "I'm scared!"
Our brains are hard-wired to fear first and think second. That's adaptive, a good trait for staying alive, since thinking takes more time than the instinctive first response, and even a few milliseconds can be the difference between life and death. But even in the ongoing response to risk, emotion and instinct keep the upper hand over cognition and reason. (For an excellent source on the neuroscience of fear, see Joseph LeDoux's interview on BigThink or his wonderful book "The Emotional Brain.") Our brains are hard-wired to fear more and think less. And that gets us into trouble, because relying more on our feelings than on the facts means we sometimes end up more afraid of lesser risks and less afraid of bigger ones – as best the evidence tells us at any given time – and that Perception Gap can be a risk all by itself, to the individual and to the greater community.
I've written here about one of the clearest examples: vaccines. Vaccines do pose risks, but of rare side effects and allergic reactions, not of autism, as the evidence pretty convincingly tells us. Yet the persistent fear of that risk creates greater risks: to unvaccinated kids suffering and dying from diseases they could be safe from, to infants too young to be vaccinated, and to vaccinated people whose immunity has worn off or whose immune systems are weakened by age or illness.
But vaccines are only one example of a dangerous Perception Gap.
— People not worried enough about drinking and driving, or texting and driving, don't just endanger themselves. They put you and me at risk.
— Some people are too worried about using radiation to sanitize food (low levels of radiation applied during food processing can kill living germs without changing the food), so food companies don't use the technology, and we're all at greater risk from E. coli, salmonella, botulism, and other food-borne illness.
— Some pregnant women are too worried about mercury in their fish (it's a risk to the cognitive development of the fetus, but at the levels we actually eat, a very low one), and by avoiding the fish they give up a good source of fatty acids, which are also necessary for the healthy development of the fetal brain. Studies have found that avoiding the fish, and those fats, does more harm than the mercury in normal servings of fish.
There are a lot of examples of the Perception Gap. To close it, we need to know why it happens, and it turns out we know quite a bit about why our fears sometimes don't match the facts. There is rich detail about that knowledge – including the neuroscience of fear – in my book.
— Daniel Kahneman et al. have discovered a set of heuristics and biases – mental shortcuts – we use to quickly make sense of partial information and turn a few facts into the full picture of our judgment. Representativeness, Framing, Availability…they have academic names, but in short, these are subconscious tools for making judgments and decisions when we don't know everything we'd need to know to make a fully informed choice – which is most of the time.
— The research of Paul Slovic et al. has revealed a suite of psychological characteristics that make risks feel more frightening, or less, the facts notwithstanding. These 'risk perception factors' include:
Imposed (nuclear accident radiation) vs. Voluntary (medical radiation)
More Pain and Suffering (cancer) vs. Less Pain and Suffering (heart disease)
Human-made (radiation from technology) vs. Natural (radiation from the sun)
Less Benefit (vaccines for diseases that have been mostly eliminated) vs. More Benefit (vaccines for new strains of influenza – H1N1/"Swine flu")
Uncertainty (nuclear radiation – because we can't detect it, science doesn't have all the answers, or we don't understand all the science) vs. Certainty/Familiarity (motor vehicle crashes)
Risks to Children (childhood vaccines) vs. Risks to Adults (adult vaccines)
— Recent research on the theory of Cultural Cognition by Dan Kahan et al. has found that we shape our views on risks to agree with the groups we most strongly identify with, based on each group's underlying feelings about how society should operate. We fall into four general groups according to the sort of social organization we prefer, defined along two continua, represented as a grid; where we fall along each continuum can depend on the issue.
Individualists —— Communitarians
Hierarchists —— Egalitarians
Individualists prefer a society that maximizes the individual's control over his or her life. Communitarians prefer a society in which the collective group is more actively engaged in making the rules and solving society's problems. (Individualists tend to deny environmental problems like climate change, because such problems demand a 'we're all in this together' communal response. Communitarians see climate change as a huge threat in part because it requires a social response.) Along the other continuum, Hierarchists prefer a society with rigid structure and class and a stable, predictable status quo, while Egalitarians prefer a society that is more flexible, allows fairer social and economic mobility, and is less constrained by 'the way it's always been.' (Hierarchists deny climate change because they fear that responding to it means shaking up the free-market, fossil-fuel status quo. Shaking up the status quo is music to the ears of Egalitarians, who are therefore more likely to believe in climate change.)
The explanations for why our fears sometimes don't match the facts are fascinating stuff. And important, because closing the Perception Gap has to start with understanding it. LeDoux and Kahneman and Slovic and Kahan and their colleagues have given us important knowledge about why we sometimes get risk wrong. But knowledge is not wisdom. We will be wise if we accept that we simply cannot be the uber-rational thinkers we'd like to think we can be, and rationally apply what we've learned about the flaws in our risk perception system as tools for making wiser, healthier choices for ourselves and for society.