- In "Cautionary Tales," economist Tim Harford explores why humans are so susceptible to con artists.
- In the podcast's first episode, Harford uses a famed oil tanker spill to highlight how important it is to admit mistakes.
- Future episodes compare the Oracle at Delphi with computer algorithms and a famous awards show messing up the envelopes.
Humans are susceptible to cons. We’re even more likely to fall for larger-than-life personalities. This isn’t me writing about the vague “other humans” out there, the ones that you and I (wink, wink) know exist but would never fall victim to. As economist and journalist Tim Harford, author of the bestselling book The Undercover Economist, recently told me, the con is “baked into” human nature.
Yet, as he explores in his excellent new podcast, “Cautionary Tales,” we can learn from past mistakes. Take a few deep breaths, count to 10, make better decisions — decisions, he points out on his podcast, that can save lives. We can educate ourselves about things we think we understand but actually do not.
The podcast is part of Pushkin Industries, the company co-founded by Malcolm Gladwell and Jacob Weisberg, where Harford joins Gladwell and Michael Lewis in a lineup of shows that explore the meaning behind both historical and modern-day events. In the debut episode, Harford discusses the tragic wreck of the Torrey Canyon, which ran aground in 1967 and spilled 120,000 short tons of crude oil into the waters near Cornwall. The captain’s inability to change course is, in itself, a lesson about the value of admitting mistakes and putting our new-found knowledge into action.
In episode two, we hear the story of Wilhelm Voigt, a Berlin native who wasn’t a captain but played one in society, an incredible story that shows us the depths of our belief in powerful con men. Harford discusses the political consequences of such cons during our interview. I’m sure you can guess where that conversation ends up.
As with “Revisionist History” and “Against the Rules,” “Cautionary Tales” is a welcome addition to podcasting. Humans might fall for cons and be unwilling to own up to mistakes, but we’re also animals with a deep love for storytelling. Harford excels at that medium, both in writing and narration. The podcast is a pleasure to listen to and, bonus, you might just learn something along the way.
Derek: Your new podcast is on Pushkin, which has this vibe of radio from a century ago. It’s not just people talking; there’s music, sound effects, and acting involved. Why did you go that route with “Cautionary Tales”?
Tim: One of the things that quickly became clear as I was looking at the stories that I wanted to tell is that very often these are stories where we don’t have tape. We don’t have a lot of archival footage. Very often there was nobody there when it happened. The journalist showed up afterwards; in some cases—there’s one story that’s two and a half thousand years old—we don’t have tapes.
What do you do? Well, you can do the usual thing, which is to put an expert in an armchair and ask him or her to explain what happened. We wanted to do something different. These little historical reenactments are like fresh herbs and spices throughout the podcast. These little scenes offer a very different way into a story that you couldn’t otherwise tell.
Derek: Your show is billed as “the science behind what happens.” There has long been a replication problem in science. I know you mostly deal with the social sciences, but what was your training in science, and how did you choose your approach to each topic?
Tim: It’s a very good point because of the replication crisis—I think crisis is the right word. One of the issues is people looking for the perfectly counter-intuitive result, the thing that’s just weird enough to be surprising and yet not so weird that you completely dismiss it. There’s a lot of published psychology that has been filtered through that lens. I’m coming at it from a slightly different angle.
Rather than the coolest new study that might surprise you, I’m saying, “This thing happened, this oil tanker hit the rocks or this economist was the most famous economist in the world and he went bankrupt or they gave the Oscar to the wrong movie.” Start with that story and then say, “What is it that social scientists can tell us about that story? What are the explanations?” Very often you find there’s more than one explanation. There’s usually no single cause. Then the question is: What explains it? What do the people who have thought hard about this sort of thing make of these accidents?
I talk about Milgram’s experiments, but I try to remind people that a lot of the experiments that he did were not reported. These are the very famous electric shock and obedience experiments. I’m trying to pick that apart and think about what modern psychologists now make of those experiments and what they think those experiments really tell us, so as not to be uncritical in the way that I think about these studies.
Derek: I’ve read that study in many different contexts. The way you frame it about being an example not of obedience, but of a willingness to admit our mistakes, is really important. Why are we so unwilling to admit when we’re wrong?
Tim: That’s a big question. In some cases it’s a social thing. In politics, for example, you don’t want to admit that you’re wrong because you’re conceding ground to the other side and you don’t want to lose faith socially. You don’t want to lose political advantage. In other cases, you personally have committed so much to a particular viewpoint that it becomes extraordinarily painful to face up to the error.
This is the old idea of cognitive dissonance, which I explore in an episode about John Maynard Keynes and Irving Fisher, two great economists and their forecasting. Long story short, both are geniuses; both get really into stock market investing. One goes bankrupt; one dies a millionaire. What explains the difference?
One of them is willing to admit he made a mistake and one is not. Irving Fisher is more exposed. He’s more publicly committed. He’s going to lose face socially. But he’s also too deep in debt to admit, “I’m getting this wrong, I need to change direction.” It’s more painful to his sense of who he is: the guy who doesn’t make mistakes.
There’s a third problem, which is something I emphasized in Adapt: We didn’t know we made a mistake. No one ever tells you that you made a mistake; no one ever gives you the feedback. That’s a very common problem.
Derek: Your podcast is supposed to help us learn from our mistakes. How do you help people actually learn what is in their best interest? Is that even possible?
Tim: This is something I explore in the final episode, which is about what happens when we just hand over our decision-making to an authority figure or to a computer algorithm. What happens when we just let our GPS tell us where to go? One of the really interesting groups of studies that I talk about in that episode is what happens when you are forced to stop and think. These studies explore something called the illusion of explanatory depth.
In the initial study, they say, “How well do you reckon that you know how a flush lavatory works, on a scale of zero to seven?” People will say, “Oh yeah, maybe six.” Then the researchers say, “That’s really interesting. Here’s a pen and paper. Just explain to us in detail how it works.” People get really stuck because they realize they don’t know how it works. It was all a bit vague. They weren’t lying to the researchers; they were lying to themselves. They felt that they understood this everyday object and they didn’t.
The next study, by a different group of researchers, asked the same questions but about politics. They said, “Tell us how a cap and trade system works. Tell us how the US would apply unilateral sanctions on Iran. How does that actually work?” People often feel they know pretty well what these policies are. But when you ask them to explain rather than advocate (don’t tell me whether it’s a good idea, just tell me what it is), again people go, “Ah hmm. Uh hmm. I thought I knew but I don’t know.”
What’s fascinating is that people’s views about politics become more moderate. They think, quite reasonably, “Maybe my previous view that I was willing to die in a ditch to defend cap and trade or to prevent cap and trade, maybe that view that I thought was super important, maybe I shouldn’t hold that view so strongly anymore given that I didn’t really understand what it is that I’m talking about.”
It doesn’t come up in every Cautionary Tale, but it comes up again and again: whatever terrible thing happened wouldn’t have happened if somebody had been able to calm down, slow down, count to 10, and think about what was going on.
Derek: When I was listening to episode two, I was reminded of a story growing up. There was a sporting goods chain called Herman’s. Two men walked in, went to the back of the store, and grabbed a canoe. They put it over their heads and walked out of the store. It took 20 minutes for anyone to realize that they stole it.
Tim: Because they just walked right out as if they had bought it.
Derek: You say the judge, at the end of “The Captain of Köpenick,” goes down and shakes Voigt’s hand even though he admitted his crime and was a con man. What do we learn from that?
Tim: We’re tremendously susceptible to appearances. I wish I had a silver bullet for that one, some pill you could take that would cure us of that. I talk about the fact that just being tall is tremendously advantageous if you’re running for political office.
Derek: I’m six-three, so I appreciated that.
Tim: Yeah, me too. As far as presidents go, that’s not that tall. When you look at it, it’s like they’re picking a basketball team. It’s a myth that the taller candidate always wins, but height definitely seems to be an advantage. The example of the power of appearances that I just can’t get my head around is the advert where the guy says, “I’m not a doctor, I just play one on TV,” as though it’s the most natural thing in the world. And it clearly works! The fact that that advert ran for a long time is absolutely astonishing.
Even this particular con man, Wilhelm Voigt, would not have said, “I’m not actually a military captain. I’m just wearing the uniform.” Of course, I can’t help but think of a certain president who is most famous for playing a successful businessman on TV. It makes a huge difference to how we perceive the world.
Derek: Do we ever get over that? Is that something we can teach out of ourselves?
Tim: I have never seen a piece of research that says there is a cure for that. That is why, for example, blind recruitment processes and blind audition processes are so powerful. Claudia Goldin and Cecilia Rouse studied what happened when the great American orchestras switched to blind auditions. The orchestras thought they were doing it to prevent favoritism toward students with powerful teachers; they didn’t want only the “in crowd” to be recruited. They put up screens so you wouldn’t know who was playing. Surprise, surprise, suddenly a load of women who previously wouldn’t have been thought good enough were being recruited.
It’s not enough to just tell people, “You shouldn’t discriminate against women. Hey, don’t be too impressed by uniforms. Treat people who don’t look attractive the same way as you treat attractive people.” You can tell people that, but I’m not sure it makes a great deal of difference. We can, again, slow down, have a think, and ask ourselves, “Am I overweighting this person’s appearance? Am I favoring this person for president because they look presidential, rather than this other person who doesn’t look like what I imagine a president to look like?”
I don’t think there is an easy cure for that. That’s heavily baked in human nature. It’s simpler with con artists. If you can slow them down and slow yourself down enough, you can usually spot the con. With a more subtle influence, like who we want to run our companies and who we want to run our country, appearances are always going to matter.