In 2003, the world watched, shocked, as the Space Shuttle Columbia broke apart while re-entering the Earth’s atmosphere. NASA investigated and concluded that during launch a piece of foam had broken off the external fuel tank and struck the wing. The shuttle made it to orbit safely, but on re-entry the damage from the foam strike was enough to destroy the vehicle.
The astonishing thing about the Columbia tragedy is that NASA engineers not only noticed the foam strike in the launch video, they even realized that it could cause problems on re-entry. Indeed, while the space shuttle was still in orbit, NASA hired a Boeing team to conduct an analysis of the potential damage. Tragically, after hearing the Boeing team’s report, NASA decided to go ahead with the re-entry.
Edward Tufte is a legendary statistician and graphic designer whose research has focused on the challenges of clearly and accurately conveying complex empirical information. In an analysis of the Columbia disaster, he blames PowerPoint for the shuttle’s destruction. Specifically, he argues that when the Boeing team presented its findings to senior NASA officials, the limitations of the PowerPoint format became an impediment to clear communication:
In the reports, every single text-slide uses bullet-outlines with 4 to 6 levels of hierarchy. Then another multi-level list, another bureaucracy of bullets, starts afresh for a new slide. How is it that each elaborate architecture of thought always fits exactly on one slide? The rigid slide-by-slide hierarchies, indifferent to content, slice and dice the evidence into arbitrary compartments, producing an anti-narrative with choppy continuity. Medieval in its preoccupation with hierarchical distinctions, the PowerPoint format signals every bullet’s status in 4 or 5 different simultaneous ways: by the order in sequence, extent of indent, size of bullet, style of bullet, and size of type associated with various bullets. This is a lot of insecure format for a simple engineering problem.
The format reflects a common conceptual error in analytic design: information architectures mimic the hierarchical structure of large bureaucracies pitching the information. Conway’s Law again. In their report, the Columbia Accident Investigation Board found that the distinctive cognitive style of PowerPoint reinforced the hierarchical filtering and biases of the NASA bureaucracy during the crucial period when the Columbia was damaged but still functioning.
Tufte then quotes from the CAIB report:
The Mission Management Team Chair’s position in the hierarchy governed what information she would or would not receive. Information was lost as it traveled up the hierarchy. A demoralized Debris Assessment Team did not include a slide about the need for better imagery in their presentation to the Mission Evaluation Room. The presentation included the Crater analysis, which they reported as incomplete and uncertain. However, the Mission Evaluation Room manager perceived the Boeing analysis as rigorous and quantitative. The choice of headings, arrangement of information, and size of bullets on the key chart served to highlight what management already believed. The uncertainties and assumptions that signaled danger dropped out of the information chain when the Mission Evaluation Room manager condensed the Debris Assessment Team’s formal presentation to an informal verbal brief at the Mission Management Team meeting.

While Tufte blames PowerPoint, it’s clear that what he’s talking about is fundamentally a management problem. It’s true that PowerPoint bullets are a lousy way to communicate complex technical information. But it’s also not a coincidence that middle managers love them so much. A big part of the job of a middle manager is to act as a kind of information funnel: to gather information from dozens or hundreds of people below him in the hierarchy and communicate it in condensed form to the people above. The beauty of bullet points is that they make it possible to present complex information in arbitrarily truncated form while glossing over details that the presenter may not fully understand.
This “information funnel” function is essential to the role Paul Graham describes managers playing in hierarchical organizations: to make a big group of people appear to upper management as if it were a single person. A given manager might be an MBA who’s never computed a rocket trajectory in his life, but when there’s a meeting of the organization’s senior staff, he’s a stand-in for the 100 aerospace engineers who report to him. Not only is it unreasonable to expect him to accurately represent the knowledge and concerns of all 100 people, it’s not even reasonable to expect him to understand the views of all those people. Yet it would be awkward for the manager to admit that he has only a vague idea of what a lot of the people under him do. So he’s going to gravitate toward a presentation style that allows him to sound authoritative while summarizing information he may or may not understand in any detail.
A related phenomenon is a tendency toward excessive optimism within large bureaucracies. People dislike giving their bosses bad news. Given that managers have to summarize and abbreviate the information their employees give them anyway, there’s going to be a natural tendency to pay less attention to negative results than positive ones. And this tendency gets amplified when there are multiple layers of reporting: At each layer, bad news gets pruned more aggressively than the good news, and so the glasses get more and more rose-tinted. The information that bubbles up to the top of the hierarchy can be massively skewed and incomplete.
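The compounding effect is easy to see with a toy simulation. The sketch below is my own illustration, not anything from the CAIB report: it assumes, arbitrarily, a 30 percent chance that any given piece of bad news gets dropped at each of four layers of reporting, while good news always passes through.

```python
import random

# Toy model: each report from the ground is good news (False) or bad news (True).
# At every layer of management, bad news is quietly dropped with some probability
# while good news always passes, so the picture gets rosier layer by layer.

def filter_layer(reports, drop_bad_prob=0.3):
    """Simulate one layer of management summarizing what it heard."""
    passed = []
    for is_bad in reports:
        if is_bad and random.random() < drop_bad_prob:
            continue  # bad news pruned on the way up
        passed.append(is_bad)
    return passed

def share_bad(reports):
    return sum(reports) / len(reports) if reports else 0.0

random.seed(0)
reports = [random.random() < 0.5 for _ in range(10_000)]  # ground truth: ~50% bad
print(f"ground truth: {share_bad(reports):.0%} bad news")

for layer in range(1, 5):
    reports = filter_layer(reports)
    print(f"after layer {layer}: {share_bad(reports):.0%} bad news")
```

Even with that modest per-layer drop rate, the share of bad news visible at the top of the hierarchy ends up well below the share on the ground, which is the rose-tinting effect described above.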
Unfortunately, reality doesn’t respect organizational hierarchies, so when the people under you think your space shuttle is going to explode, it’s pretty important that you hear about it. There are a number of worthwhile strategies for mitigating this kind of problem. Tufte advocates that technical information be communicated in the form of written reports rather than PowerPoint bullets. But the more fundamental point is that we need to recognize that these kinds of information-flow problems are inherent to hierarchical organizations. Even in the best-run large organizations, the people at the top levels of management are going to have a partial and distorted view of what their employees know and what the organization is doing.