How To Factor in Your Blind Spots for Ethical Decision Making


5 lessons • 30 mins

1. A Case Study in Strategic Empathy from Inside the CIA (07:41)
2. Three Common Cognitive Pitfalls in Decision-Making (06:52)
3. Analyze Present-Day Problems with Historical Methods (05:38)
4. Are You Using the Lessons of European History to Predict Asia’s Future? (06:35)
5. How To Factor in Your Blind Spots for Ethical Decision Making (04:01)

Analyze Your Choices: How To Factor in Your Blind Spots for Ethical Decision Making, with Max Bazerman, Professor, Harvard Business School

Lessons from The Challenger Story

In the Challenger story there is obviously an ethical issue: we are sending seven astronauts into space. Alarm bells went off. Engineers from Morton Thiokol said there was a problem. But the political and organizational desire to have a successful launch quickly overwhelmed the discussion, so we never construed the problem in ethical terms. Instead, we construed it as a technical decision: are we safe to launch? And because of our motivation to have a successful launch, we didn't even do the engineering well.

Lesson 1: Heighten your awareness

The evening before the launch, there was a meeting, and some of the engineers on the call from Morton Thiokol, which supplied the solid rocket boosters, believed there were problems with the O-rings at temperatures below 53 degrees, and they made that argument. NASA, under political pressure, badly wanted to launch. Most astounding, the engineers were looking only at the seven launches out of 24 where there had been problems, and they couldn't discern a pattern. What they failed to do was ask what any engineer should have known: if you want to know whether temperature is related to O-ring failure, you need to look at both the successes and the failures. What they had instead was a motivated discussion leading to the decision to launch.

Lesson 2: Establish firm grounding

Throughout this process there was virtually no discussion of the fact that seven astronauts' lives were at stake. The discussion turned entirely technical. It had very much a managerial feel to it, and the ethical part of the decision faded from awareness.

Lesson 3: Consider omissions

So the question is, what kind of intervention might have been effective? And the amazing part of the Challenger story is that the intervention was shockingly easy. All that was really needed was for somebody to say, "If we want to know whether temperature is related to O-ring failure, what data do we need?" The answer is quite obvious: we need the data for all 24 launches, not just the seven launches with problems, to see if there is a difference in temperature between successful and unsuccessful launches. Had somebody asked that question, it would have been very easy to get the other 17 data points, and when you look at all 24 data points together, the decision not to launch becomes extremely obvious.
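The "all 24 launches" point can be sketched in a few lines of analysis. The temperatures below approximate the commonly cited pre-Challenger O-ring dataset, but treat the exact values as illustrative rather than official NASA records; the comparison shows why the failures-only view was misleading.

```python
# Approximate launch temperatures (°F) for the 24 shuttle flights before
# Challenger; values are illustrative, based on commonly cited O-ring data.
# True = O-ring distress was observed on that flight.
launches = [
    (53, True), (57, True), (58, True), (63, True),
    (70, True), (70, True), (75, True),
    (66, False), (67, False), (67, False), (67, False), (68, False),
    (69, False), (70, False), (70, False), (72, False), (73, False),
    (75, False), (76, False), (76, False), (78, False), (79, False),
    (80, False), (81, False),
]

failures = [t for t, distress in launches if distress]
successes = [t for t, distress in launches if not distress]

def mean(xs):
    return sum(xs) / len(xs)

# The failures alone span 53-75 °F, so no clear pattern emerges.
print(f"failure temps: {sorted(failures)}")

# Adding the 17 successes reveals the pattern: failures skew cold.
print(f"mean failure temp: {mean(failures):.1f}")   # ~63.7 °F
print(f"mean success temp: {mean(successes):.1f}")  # ~72.6 °F
```

Looking only at the seven problem launches is a textbook selection effect: conditioning on the outcome you are trying to predict throws away exactly the contrast that carries the signal.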

When people want to make the decision before they've analyzed the data, what we might call "ready, fire, aim," that's a good hint that we might be missing critical issues and need to step back and take another look at the problem.