Irony lurks in the surge of interest in cognitive psychologists’ research on human reasoning: we seem to be desperately interested in reading about how poorly we think. If Descartes could behold the popularity of books by Daniel Kahneman and Leonard Mlodinow and the hundreds of articles and blog posts they and similar books have spawned, he might alter his pronouncement: I think about thinking, therefore I am. Metacognition rules.
I have a meta-meta-cognitive question or two about all this cerebrum gazing. But before I get to those, here’s a little quiz to get you thinking. The questions are my adaptations of items from research studies from the 1960s, 1970s and 1980s, two of them initiated by Kahneman and his late research partner, Amos Tversky. See how well you do:
* * *
1. At a dinner party this weekend, a friend introduces you to a woman named Genevieve. He tells you that Genevieve recently graduated from Bryn Mawr College with a B.A. in Philosophy, where she was active in the Occupy movement and edited a literary magazine. You’re interested in talking to Genevieve about Hegel, the subject of her senior thesis, but your friend jumps in and asks you to rank the following statements about Genevieve in order of their probability:
(1) Genevieve is a feminist.
(2) Genevieve is looking for a job as a sanitation worker.
(3) Genevieve is a feminist who is looking for a job as a sanitation worker.
Given what you know about Genevieve, rank the statements from most likely to least likely.
2. Later that evening, your friend presents you with a deck of cards with a number on one side and a letter on the other. He deals you four cards from the deck. Here is what you see laid out before you on the four cards:
9 J U 2
Your friend then asks you which cards you will need to turn over in order to determine whether the following rule holds for the deck (assuming these four cards represent the rest of the deck):
If a vowel is printed on one side of the card, then an even number is printed on the other side
Which cards do you turn over in order to test this rule?
3. Genevieve offers you a bet. “Flip this quarter,” she says. “If it’s heads, I’ll give you $200. If it’s tails, you pay me $100.”
Should you take the bet?
1. This is known in the literature as the “Linda” problem, or the “conjunction fallacy.” It tests how well individuals reason using probability theory. In Kahneman and Tversky’s 1983 study, 85 percent of subjects got it wrong. Your answer was incorrect, too, if you ranked statement (3) in the first or second position. Logic dictates that (3) is the least likely scenario: the probability of two conditions both being true (Genevieve is a feminist + Genevieve is looking for a job as a sanitation worker) can never exceed the probability of either one being true on its own. If you got this one right — it doesn’t matter whether you put (1) or (2) first, just that you ranked (3) last — congratulations. If not, you’re in good company: only 15 percent of Stanford business school students who had received training in probability theory got it right.
(For more on Linda/Genevieve, including an examination of criticism of the question, see chapter 15 of Kahneman’s Thinking, Fast and Slow.)
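The logic behind the Linda/Genevieve problem can be sketched in a few lines of simulation. The probabilities below are invented purely for illustration; the point is that however you set them, the conjunction can never occur more often than either condition alone, because every “feminist and job-seeker” outcome is also a “feminist” outcome:

```python
import random

random.seed(0)
trials = 100_000
feminist_count = 0
conjunction_count = 0

for _ in range(trials):
    is_feminist = random.random() < 0.80            # assumed probability, for illustration
    seeks_sanitation_job = random.random() < 0.02   # assumed probability, for illustration
    if is_feminist:
        feminist_count += 1
        if seeks_sanitation_job:
            conjunction_count += 1

# The conjunction outcomes are a subset of the single-condition outcomes,
# so the conjunction's frequency can never exceed the single condition's.
assert conjunction_count <= feminist_count
print(feminist_count / trials, conjunction_count / trials)
```

Swap in any probabilities you like: the assertion holds regardless, which is exactly why ranking statement (3) above (1) is a fallacy.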
2. The card question, first asked by Peter Wason in 1966, challenges your deductive reasoning skills. In his 1977 book, Wason (with co-author Philip Johnson-Laird) reports that only 5 percent of subjects answered questions like this correctly. The most common mistake is to turn over the U and 2 cards — an error that flows from the rule’s specification of a relationship between vowels and even numbers. You do need to flip over the U card to check whether an even number is on the other side (as the rule specifies). But you do not need to see what’s on the other side of the 2 card: the rule does not say that even numbers are always paired with vowels, only that a vowel must have an even number opposite it. You do, however, need to flip over the 9 card: if there is a vowel on the other side, the rule is disproved. So the answer is that you must turn over exactly two cards — the U and the 9.
(To try your hand at more examples of this selection task, with some interesting variations, try this link.)
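The selection logic reduces to a single question per card: could its hidden side falsify the rule? A visible vowel might hide an odd number, and a visible odd number might hide a vowel; a consonant or an even number can never produce a counterexample. A minimal sketch (the `must_flip` helper is my own naming, not from the original studies):

```python
def must_flip(visible: str) -> bool:
    """Return True if the card's hidden side could falsify
    'if a vowel is on one side, an even number is on the other'."""
    if visible.isalpha():
        # A visible vowel could conceal an odd number -> flip it.
        # A consonant is irrelevant to the rule -> leave it.
        return visible.upper() in "AEIOU"
    # A visible odd number could conceal a vowel -> flip it.
    # A visible even number can never falsify the rule -> leave it.
    return int(visible) % 2 == 1

cards = ["9", "J", "U", "2"]
print([c for c in cards if must_flip(c)])  # → ['9', 'U']
```

Note that the 2 card passes the test no matter what is on its back, which is why the intuitive urge to flip it is wasted effort.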
3. The bet question does not have a right or wrong answer, per se, but it highlights what Kahneman calls an irrational “loss aversion” everyone seems to suffer from, at least to some extent. Technically speaking, any bet where the payoff is greater than the loss, given an equal chance at either outcome, is a good one. And the prospect of winning $200 easily outweighs the $100 you’d have to pay Genevieve if you lose. Assuming the loss of $100 is tolerable — you know where your next meal is coming from, and you don’t need the money to pay the rent — you should, as a rational agent, accept the bet. The real-world problem with loss aversion isn’t that you’ll pass up great bets like these — Genevieve would have to be crazy to offer it, after all. Loss aversion ends up costing you dearly when you spend too much time protecting your precious assets while neglecting to be just as assiduous about prospecting for new ones. I once spent about three hours, over several weeks, making calls to a merchant who had charged me shipping for an item I purchased online with a free-shipping coupon. I finally got my $8 back. But if someone had offered me a job calling up multiple customer service agents, waiting on hold, getting the runaround, etc., for a promise of $8 in compensation, there’s no way I’d accept it.
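The “technically a good bet” claim is just an expected-value calculation, which takes one line to check:

```python
# Expected value of Genevieve's coin flip: a fair coin pays
# +$200 on heads and costs $100 on tails.
p_heads = 0.5
win, lose = 200, -100
expected_value = p_heads * win + (1 - p_heads) * lose
print(expected_value)  # → 50.0
```

On average you come out $50 ahead per flip, yet — as Kahneman’s work on loss aversion suggests — many people refuse, because the pain of losing $100 looms larger than the pleasure of winning $200.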
* * *
So, how did you do? If you avoided the common errors of reasoning that led large majorities of subjects to do the irrational thing on repeated experiments, you may justly gloat a little. (But only a little: as Jonah Lehrer and Big Thinker Tauriq Moosa report, smarter people may have a particularly hard time talking themselves out of other biases.)
If you answered one or more of these questions incorrectly — and chances are very high that you did — the question is what this says about you individually and about humanity writ large. Do experiments like these belie the faith of philosophers and social scientists in baseline human rationality? Do these results show that only a select slice of humanity (somewhere between 5 and 15 percent, depending on the study) qualifies for the title “rational”? One way out of this mess is to deny that any of these experiments are really measuring rationality. But if we seek to disentangle rationality from deductive logic and probability theory, our account of reason gets messy. Rationality may be about more than logic alone, but without logic at its base, isn’t it one confused puppy?
In his 1993 book, The Nature of Rationality, Robert Nozick sketched a concept of “symbolic utility” in which rational irrationality becomes a potential reality rather than an oxymoron:
Producing evident bad consequences, these apparently irrational actions and symptoms have a symbolic significance that is not obvious; they symbolize something else [which] has some utility or value ... for the person. (p. 26)
So refusing Genevieve’s bet may symbolize your lack of greed, your conservative nature or your pride in protecting assets you have worked hard to earn. And you may benefit in various ways from having one or more of these self-conceptions. Nozick’s idea raises a host of questions and intellectual tangles, but at least it points a path around the faddish denial that human beings can think straight. As delicious as that idea seems to be.
Follow Steven Mazie on Twitter: @stevenmazie
For a more sustained critique of the experiments that inspired this quiz, and words of solace for those of you who didn’t ace it, have a look at my follow-up post.