
# Don’t Panic: Learn How to Read Risk Data First

If somebody tells you the risk of something is “1 in a million” or “1 in ten thousand” or even “1 in ten”, you still don’t know nearly enough to gauge how big or small that risk actually is. Get more information before you decide how worried to be.

A report from The Institute for Economics and Peace offers some sobering news about the global rise of terrorist deaths. They’re up 61% from last year. Pretty scary. But the report, and the news coverage it has received, also teaches some important lessons about risk generally, and risk numbers, and how without a little careful reflection we can jump to scarier conclusions than the numbers warrant.

The New York Times headline, “Deaths Linked to Terrorism Are Up 60 Percent, Study Finds,” is accurate. But rarely does one single number tell the whole story for any risk. A 61% increase is the relative risk: the new numbers compared with the old ones. That’s one way to look at the risk. But to put the risk in perspective you also have to know the absolute risk…the actual number of victims. The total number of terrorism victims in 2013 was 17,958. And then to put that in perspective you have to compare the number of actual victims to the total number of possible victims to get the risk rate, another important statistic. Here’s what those three calculations would look like for these new findings.

Relative risk of terrorism deaths globally, 2013  –  61% increase from 2012

Absolute risk of terrorism deaths globally, 2013 – 17,958

Risk rate for terrorism deaths globally, 2013 – about 0.00026 percent (17,958 victims divided by the total number of possible victims, a global population of roughly 7 billion people.)

All three numbers matter. All help put the risk in perspective.
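The three calculations above can be sketched in a few lines of code. The 2012 death toll below is back-computed from the report’s 61% figure, not taken from the report itself, so treat it as an assumption for illustration.

```python
# Illustrative sketch of the three risk calculations discussed above.
deaths_2012 = 11_133           # assumption: back-computed from the reported 61% increase
deaths_2013 = 17_958           # total terrorism deaths reported for 2013
world_population = 7_000_000_000

# Relative risk: the change compared to the previous year.
relative_risk = (deaths_2013 - deaths_2012) / deaths_2012

# Absolute risk: the raw count of victims.
absolute_risk = deaths_2013

# Risk rate: victims divided by the pool of possible victims, as a percentage.
risk_rate_pct = deaths_2013 / world_population * 100

print(f"Relative change: {relative_risk:.0%}")      # ~61% increase
print(f"Absolute deaths: {absolute_risk:,}")        # 17,958
print(f"Global risk rate: {risk_rate_pct:.5f}%")    # ~0.00026%
```

Notice how different the same risk looks depending on which of the three numbers you lead with: a 61% jump sounds alarming, while 0.00026 percent sounds negligible, yet both describe the same data.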

But you also have to know the size of the pool of actual potential victims. Not just all possible victims, which is essentially everybody, but the people who, because of circumstance, are actually in harm’s way. There is no such thing as the average risk for the average person. Prostate cancer only kills men. Plane crashes don’t kill people who don’t fly. Some low-dose environmental risks that might harm the developing fetus or infants, like mercury in seafood, are not a risk to adults. Every risk has specific at-risk populations. A single risk number like One in a Million is almost meaningless.

For example, this report found that 80 percent of all terrorism deaths occurred in just five countries: Iraq, Pakistan, Afghanistan, Syria, and Nigeria. So the risk in those countries is higher than it is elsewhere. The overall average global risk rate doesn’t tell an accurate story for people in Canada or Brazil or New Zealand. And it doesn’t reflect the true risk of terrorism death in those five countries either. Risks are higher for some people and lower for others.
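A quick sketch shows how much the at-risk pool changes the rate. The combined population of the five countries is an illustrative rough estimate (about 437 million in 2013), not a figure from the report; the point is the contrast, not the exact values.

```python
# Rough sketch: how concentrating deaths in a smaller at-risk pool changes the rate.
total_deaths = 17_958
deaths_in_five = 0.80 * total_deaths       # report: 80% of deaths in five countries
five_country_population = 437_000_000      # assumption: rough combined 2013 population
rest_population = 7_000_000_000 - five_country_population

rate_five = deaths_in_five / five_country_population * 100
rate_rest = (total_deaths - deaths_in_five) / rest_population * 100

print(f"Rate in the five countries: {rate_five:.5f}%")   # roughly 0.0033%
print(f"Rate everywhere else:       {rate_rest:.5f}%")   # roughly 0.00005%
```

Even with these rough inputs, the rate inside the five countries comes out dozens of times higher than the rate everywhere else, which is exactly why a single global average misleads.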

Okay, the math part is over. But the warning here isn’t just about getting risk numbers right. It’s about how readily we can assess risk wrong. Our brains are lazy. It takes calories to think, and since our cognitive skills evolved back when we weren’t sure when (or whether) the next meal would show up, we have developed all kinds of mental shortcuts to save calories by avoiding thinking any harder than the situation requires. That means we jump to conclusions about lots of things, including when we hear just one risk statistic. A single risk number like One in a Million prompts an instinctive response, even though it rarely tells us all we need to know. This new report is an example.

“60% increase” sounds pretty big…pretty dramatic. Anyone reading that headline might take away the impression that the global risk of terrorism is rising dramatically. Well, it is…regionally. But not for most NY Times readers. The story does state that, but not until several paragraphs in. You have to dig into the story to discover that critical caveat, and you have to think a little more carefully about risk numbers, and most of us don’t.

So here’s the warning. No, it’s not that you and I need remedial math. It’s about the danger of jumping to conclusions about risk with partial information, whether that information is numeric or any other kind. I don’t know the actual number, but the risk that we will make up our minds about risk without learning a little more and thinking things through is really high. And that raises the probability that our fears won’t match the facts, leading to potentially dangerous mistakes, whether we’re more afraid than the evidence warrants or not as afraid as the evidence warns.

So here’s a statistical prediction. The chances that you’ll make healthier choices about risk if you don’t just go with your first gut reaction but also stop and think and get a little more information, are pretty close to 100%.
