Prediction: We’ll Continue to Make Mistakes About Predictions
This is the time of year for predictions. Here are mine.
The urge to predict is understandable. We forecast the future, and continue to do so even after repeated mistakes, because of a deep psychological need for a sense of control, to keep ourselves safe. But it’s less obvious where the mistakes come from. Not the mistakes in the predictions themselves (some of which turn out to be right), but the mistakes in how we use them. We make errors in what we do with predictions, and I predict we always will, because we give our powers of reason way too much credit. We think we’re smarter than we are.
Ever since the earliest human brain developed the ability to consciously realize there is a future, we’ve wanted to know what comes next. At first we looked up to the stars, or down at the tea leaves in the bottom of the cup, or in at the shape of the organs of sacrificed animals, and we venerated the Prophets and Oracles who supposedly had the magic power to know what was coming next, all to keep ourselves safe. After all, not knowing what lay ahead was risky, dangerous. In fact, the English word risk derives from a Greek word found first in Homer’s The Odyssey, which loosely means the unknown chance that, sailing into uncharted waters, your ship will hit some hidden rock, and sink.
Then along came Blaise Pascal and Pierre de Fermat and “The Problem of the Points” (explained in the footnote below) and the beginnings of mathematics as a tool for calculating the chances of things. The probabilities of risk became, at least in part, knowable. The insurance industry was born, and ever since, we’ve been getting smarter and smarter about how to use an ever-expanding range of knowledge to peer into the future. The collective mood of Twitterers, the rate at which DNA mutates, massive databases of information on the patterns of the past…these are today’s versions of Mayan calendars and crystal balls and figuring out the future from the features of animal feces. (“Hey, Joe, check out the lines in this cow patty! It’s gonna rain next week!”)
We have made incredible progress in developing much smarter tools for divining what’s to come. But what matters is not just how good those tools are. In the end it’s how smart we are, or aren’t, at putting the product of those tools to use. It’s a matter of how the most important tool of all, our brain, interprets what those lesser tools tell us. And that part of the process…not just figuring out the facts but interpreting how those facts feel…is why predictions remain such an iffy business.
As much as we’ve learned about how to make more accurate predictions, we’ve learned at least that much about the psychology of how our mind works, and that body of knowledge tells us quite clearly that, as smart as we like to think we are, the brain is only the organ with which we think we think. It turns out that our instincts and feelings and subconscious mental processes for interpreting information turn the cold hard facts into subjective judgments and behaviors that often fly in the face of the evidence, and even of what would do us the most good. Mountains of research from psychology and neuroscience and sociology and economics, constantly confirmed by evidence from the real world, make indisputably clear that, as Scottish philosopher David Hume observed, “Reason is, and ought only to be the slave of the passions, and can never pretend to any other office than to serve and obey them.”
So experts can warn us about economic ‘irrational exuberance’ and predict a ‘correction’, yet we deny the hard facts until we’re staring, through our own hubris, at huge losses. Experts can predict that human activity is altering Earth’s climate and seriously threatening our future as a species, yet ideology denies the data produced by one of the most thorough scientific enterprises of modern human existence. We choose to live on major earthquake fault lines, next to active volcanoes (hey, Seattle!), near forests that burn and coasts that flood…all predictable natural events that become natural catastrophes only because we ignore the predictions and put ourselves in harm’s way.
Oops! Oops! And oops again, and again. No matter how well we can see into the future, the vision is of less value if the subconscious cognitive filters of our subjective perceptions blur the picture. We know a great deal about the specifics of those filters, and about the underlying psychology of decision making, particularly about risk…how we actually interpret information, how we judge and make decisions, and why we behave the way we do. But we don’t use that knowledge to inform predictions nearly as much as forecasters use crowdsourcing and computers and databases and all sorts of other shrewd tools to read the modern tea leaves.
So I predict that no matter how powerful our forecasting tools become, we won’t be able to fully apply their wisdom until we get smart enough to accept that we’re not the all-powerful rational thinkers we like to think we can be. Starting from that humility, we can combine the facts with the insights we have about human cognition that help predict how we’re likely to interpret those facts, to produce predictions that might do a better job of telling us where we’re headed, and keeping us safe.
The Problem of the Points
Imagine that Bob and Betty have each put up $100 and are playing a tournament of the board game RISK. The first one to reach six wins takes the $200. But when the score is 5-3 in Betty’s favor, both of them, sick after 19 hours of playing RISK and agonizing over how many armies to move to Kamchatka, agree to quit. How should they split the pot?
Pascal and de Fermat, realizing that Betty needed only one more win while Bob needed three, calculated the odds of each possible future and produced a formula for splitting the pot. The general formula is not derived here, because it would mostly make your head hurt, but the arithmetic for this particular game fits in a few lines, as sketched below. If you would like to learn more…
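For the curious, here is a minimal sketch of that logic in Python (my own illustration, not Pascal’s or Fermat’s notation; the function name fair_split is just for this example). It assumes each remaining game is an independent 50/50 toss-up between equally matched players:

```python
from math import comb

def fair_split(pot, a_needs, b_needs, p=0.5):
    """Chance-based split of the pot, in the spirit of Pascal and Fermat.

    Assumes every remaining game is independent and that player A wins
    each one with probability p (0.5 = equally matched players).
    """
    # The match must be decided within the next (a_needs + b_needs - 1)
    # games, so imagine all of them get played and count the imaginary
    # futures in which A collects enough wins.
    n = a_needs + b_needs - 1
    p_a = sum(comb(n, k) * p**k * (1 - p)**(n - k)
              for k in range(a_needs, n + 1))
    return pot * p_a, pot * (1 - p_a)

# Betty (player A) needs 1 more win; Bob needs 3.
betty_share, bob_share = fair_split(200, 1, 3)
print(betty_share, bob_share)  # 175.0 25.0
```

The key move, the one Pascal and Fermat worked out in their famous 1654 correspondence, is to imagine the maximum number of games that could still be played (here three, since Betty needs one win and Bob needs three) and count how many of those equally likely futures each player wins. Bob must win all three remaining games, a 1-in-8 shot, so a fair split is $25 to Bob and $175 to Betty.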