What caused the Challenger disaster? It's not what people think.

In our last class on policy failures, we discussed the 1986 Challenger space shuttle disaster. For those of you not old enough to remember, this was a catastrophic failure that took the lives of seven astronauts and put the relatively young space shuttle program in jeopardy. After the fact, it emerged that management had been warned by engineers at Morton Thiokol Inc. that it might be too cold on that January morning to launch safely -- and launched anyway. An O-ring seal in one of the shuttle's solid rocket boosters failed in the sub-freezing temperatures, and the shuttle was destroyed, killing everyone aboard.

Inevitably, the public launched into a frenzy of what my book dubs “blamestorming” -- it’s just like brainstorming, except that we all get together to choose a villain to pin the blame on. The villain it chose was management at the National Aeronautics and Space Administration, which had gone forward with the launch despite the warnings.

To be clear, the public was right to blame the NASA managers, who should have delayed the launch. It was wrong, however, about what motivated those managers. The picture that emerged from the news media and congressional hearings was of management that pushed forward with the launch just to keep on schedule, in reckless indifference to the potential loss of human life. In "The Challenger Launch Decision," Diane Vaughan argues pretty convincingly that this makes no sense.

Let’s say that management was aware of a high risk of failure. Why on earth wouldn’t it delay the launch? We’re not talking about a delay of years; we’re talking about a delay of hours, until it got a bit warmer. Management had authorized such delays many times before.

You can’t chalk this up, as people often do, to some kind of greed; the managers were paid at NASA scale. You could chalk it up to overdedication to the space shuttle program, at the expense of human life -- except that any fatal disaster would have endangered the program, and their jobs, and those managers surely knew that. As our professor, Steven Teles, put it: “What’s the probability of a serious problem being detected? P=1.” A failure of that magnitude announces itself. Yet much of the response to the disaster consisted of calls for more oversight.

Vaughan locates the problem not in inhumane managers, but in the culture of NASA. Over time, as things went wrong, NASA had responded to unexpected developments by giving itself permission to ignore them. As physicist Richard Feynman, who sat on the presidential commission investigating the disaster, noted, the shuttle had been experiencing unexpected O-ring blowby for a while. That should have raised red flags: Something is going wrong, and we don’t understand it! Instead, NASA decided that unexpected blowby must be OK, because it had happened before, and the shuttle hadn’t blown up.

This is a very common pattern in organizations -- one that I explore a lot in the book. Humans are prone to something called “normalcy bias.” In movies, the biggest risk of a natural disaster is getting trampled in the panic. But in fact, people often under-react to dangerous situations if they aren’t actually missing limbs -- sitting in a crashed plane, or stopping to grab their luggage, rather than expeditiously getting the heck out of Dodge.

At NASA, this tendency was exacerbated by collective decision-making, a phenomenon I call “groupidity.” Groups diffuse responsibility and can make it easier to do something stupid. They also tend to discourage revisiting questions the group has already settled, such as “Is blowby on the O-rings dangerous?” And people may suppress their concerns because no one else seems to share them -- and humans, being social animals, take cues from others about what is dangerous and what is not. (That’s why my generation is horrified when we see people smoking while pregnant, or letting young kids ride in the front seat, as if … well, as if we were their parents.)

NASA’s groupidity ended up being fatal. But few of the remedies proposed actually fixed the groupidity. Instead, they tried to fix a nonexistent problem of evil men deliberately doing evil things -- another common instinct that I discuss in the book.
