On a conference call the night before, Boisjoly and his colleagues at NASA contractor Morton Thiokol, speaking from the company's headquarters in Utah, described the risk posed by low temperatures and urged NASA managers to postpone the launch.

"It isn't what they wanted to hear," Allan McDonald, one of the Thiokol engineers on the call, told the producers of "Major Malfunction," a short documentary produced by Retro Report and the New York Times (it's embedded below).

"My God, Thiokol, when do you want me to launch — next April?" Larry Mulloy, a NASA manager, shot back.

All eyes were on the Shuttle. NASA was five years and twenty-four missions into the program, which, in spite of rising costs and complexities, was meant to make space travel more routine with the help of a reusable spacecraft. The launch of Challenger, carrying the first civilian astronaut, "teacher in space" Christa McAuliffe, would be broadcast to thousands of schools across the country.

But inside NASA, problems with the Shuttle had quietly piled up. The presidential commission's report on the accident later found that as early as 1977, NASA managers had known that the O-rings performed poorly at low temperatures, and that they wouldn't properly form a seal in the cold. In earlier launches, engineers had found that seals were damaged, though not enough to cause catastrophe. Rather than redesigning the part, however, managers at NASA and Thiokol had filed the problem away as "an acceptable flight risk."

"I did the smartest thing I ever did in my lifetime. I refused to sign [that document]. I just felt it was too much risk to take."
See something, say something (and do something)
GM: Groupthink and speaking up
Prior to Challenger's launch, Thiokol engineers had gone through official NASA channels to air their complaints and were ignored; after they took their protest public in front of the Rogers commission, they were ostracized. In Truth, Lies and O-Rings, Allan McDonald wrote that "Roger and I already felt like lepers, but when we returned to Utah following the [Rogers commission interview] our colleagues treated us as if we had just been arrested for child sexual abuse." Boisjoly was shunned by colleagues and taken off "space work" by his employer. "Managers isolated him in his position and 'made life a living hell on a day-to-day basis.'"

"We are never ever going to say that there is nothing we can do."
Columbia
Even in organizations where autonomy is considered paramount, stronger forces are at play. When a problem is ignored long enough, it can go from being an acceptable risk to a disaster in an instant. And if and when someone within the group sounds an alarm, that person is going up against organizational inertia, chains of command, and cultures that derive at least some of their strength from a sense of obedience.

We all make mistakes. It's very likely I've made a few in this article, which is why I've relied on others to let me know if I have and to offer suggestions. Paradoxically, however, this system can lead to complacency and a false sense of security: if I rely too much on others to catch my mistakes, my own ability to recognize them might atrophy. If an editor sees a problem but doesn't mention it, perhaps thinking that it's not important enough or that I've already seen it, or if he or she is simply overwhelmed by a host of other little concerns, it's easy to see how an error can slip through a system meant to prevent errors. This is only an article, not a Space Shuttle, but if it were, it's also easy to see how a tiny error could lead to catastrophe.

MR. McCORMACK -- Well it could be down to the, we could lose an entire tile, I mean, and then the ramp into and out of that. It could be a significant area of tile damage down to the S.I.P. [strain isolation panel]. Perhaps it could be a significant piece missing but----
MS. HAM -- Would be a turnaround issue only?
MR. McCORMACK -- Right.
MS. HAM -- Right, O.K., same thing that you told me about the other day in my office, we've seen pieces of this size before, haven't we?
MR. LEINBACH -- Hey, Linda, we are missing part of that conversation.
MS. HAM -- Right . . . He was just reiterating, it was Calvin [Schomburg], that he does not believe that there is any uh burnthroughs so no safety of flight kind of issue, it's more of a turn around issue similar to what we have had on other flights. That's it? All right, any questions on that? O.K. . . .
The trick is knowing which errors must be addressed and which can be accepted, and which are being accepted simply because we fail to see how dangerous they are. Hank Paulson, who presided over the 2008 financial crisis as Secretary of the Treasury, laments in a recent op-ed that "we're making the same mistake today with climate change" as we did with the financial markets: building up excesses without providing powerful solutions.

"The warning signs are clear and growing more urgent as the risks go unchecked," he wrote. "This is a crisis we can't afford to ignore. I feel as if I'm watching as we fly in slow motion on a collision course toward a giant mountain. We can see the crash coming, and yet we're sitting on our hands rather than altering course."

Like most crises, the financial calamity of 2008 resulted in some major corrections. But it also demonstrated a paradox of the large, complex systems that increasingly determine our daily lives. When risks balloon into crisis, these systems can become too big to manage. But if we're unable to imagine that they could fail in the first place, simply because they're too important to society—think of the banks bailed out during Paulson's tenure—then we might overlook problems. "Too big to fail," in a sense, makes failure even harder to avoid. (Despite the protestations of Alan Greenspan and others that "if they're too big to fail, they're too big," a survey by the International Monetary Fund this year warned that the problem still exists.)

Among other flaws, the financial crisis exposed some of the new methods companies use to shield themselves from risk and to reduce the moral hazard involved. One of the interesting aspects of the so-called "sharing economy" is the way some of its companies engineer not just new software but new methods for avoiding the liabilities that companies used to carry.
Insurance companies flourish.

Despite its shortcomings, NASA's Space Shuttle left a positive legacy for spaceflight, and for everyone. Part of that legacy was the lesson that not making mistakes requires the brute force of computers, torrents of data, and an understanding of laws, both the physics kind and the government kind. It demands focus. But it also needs human doubt and dissent.

Of course, speaking up or speaking out carries its own risks: as the experiences of Roger Boisjoly, Allan McDonald, and others showed, it can be a mammoth and expensive undertaking, and you could end up looking like Chicken Little, branded as disobedient, or made the target of a government investigation.

Or, worst of all perhaps, you could simply be ignored.