The U.S. is no stranger to the collapse of complex systems. But two decades before the break-up of Space Shuttle Columbia, the Deepwater Horizon blowout, and Fukushima Daiichi, America witnessed the Challenger disaster on January 28, 1986, and saw what happens when enormously complicated high-tech systems meet the smallest of human errors.
The horror of the catastrophe, 26 years ago Saturday, overshadows the story of how it happened. The night before the disaster, engineers at NASA’s rocket contractor Morton Thiokol caught what they thought was a potentially catastrophic fault: the O-rings that sealed the joints of the Space Shuttle’s solid rocket boosters would fail in the cold temperatures predicted for the next morning’s flight. The problem had been known for years. In an unusually sobering account, the otherwise light-hearted physicist Richard Feynman would later lambast NASA’s executives for fundamental misunderstandings and hubris. “For a successful technology, reality must take precedence over public relations,” he wrote, “for Nature cannot be fooled.”
The crew of STS-51L
CNN coverage of Challenger
That night, the Thiokol engineers tried hard to persuade their superiors to postpone the launch, raising voices and pounding tables. They were ultimately overruled. The launch countdown proceeded as scheduled. On that unusually cold morning, at 11:38 AM, Space Shuttle Challenger broke apart 73 seconds into its flight, killing all seven crew members.
Years later, one unlikely question lingered: could the engineers have succeeded in stopping the launch if they had had better slides?
Statistician and data visualization legend Edward Tufte argues in his book Visual Explanations that the engineers failed to communicate the danger because the data wasn’t presented in an easily digestible form. Witness the two charts the engineers used to describe the erosion of O-rings – the cause of the catastrophe – before the launch:
The typography was sloppy. Unnecessary icons of rockets obscured key numbers. Worst of all, the O-rings’ performance data was arranged by launch date rather than by the critical factor: temperature. That, says Tufte, made it all but impossible for decision makers to see that a launch in weather below 66 degrees would probably involve O-ring failure.
When Tufte graphs the same data on a chart with a horizontal temperature axis, we get a much more vivid illustration of the correlation between cooler temperatures and an increased chance of damage:
The launch proceeded at an ambient temperature of 36 degrees; the shuttle broke apart 73 seconds after liftoff.
Of course, bad slides and bad engineering don’t kill people: bad decisions do. Tufte has been taken to task for laying blame on the engineers’ presentation of the data, and as one critic points out, even his revised graph isn’t accurate. Whether or not Tufte’s charges are unfair, the fact remains that information may be our most vital tool, and to know how to use it, we have to understand it first.
Richard Feynman explains the O-ring problem.
After the Space Shuttle Columbia disaster in 2003, Tufte picked up his microscope again and found more visualization failure: this time, he fingered a key PowerPoint presentation NASA officials used to analyze damage to the shuttle’s protective tiles sustained during launch. The presentation, he said, obscured a crucial lack of data, tragically reassuring NASA officials that no rescue was needed.
Even if, as he points out, PowerPoint carries a bias towards form over content and style over evidence, it remains a major data visualization tool for everyone from teachers to companies to the U.S. military, which has even been warned internally about its dangers. A tool for sure, but perhaps an accidental weapon too.