I suggested that we were fighting the birthrate of the nation, that the war was essentially a stalemate—but a stalemate which favored the other side, since eventually we would have to go home… [Bunker] listened politely to what I said. He had, Bunker said, spoken with his generals—he named several of them—all fine men, and they had assured him that, contrary to what I said, everything was on schedule and that there was an inevitability to the victory we sought, given the awesome force we had mounted against the North Vietnamese army and the Vietcong.
Obviously Bunker was wrong. And there’s a temptation to be smug about this. A lot of people reading a story like this would be inclined to say that Bunker and other senior officials connected with the Vietnam war effort were arrogant and blinkered, and that they surrounded themselves with “yes men.” And of course there’s something to this—on the next page Halberstam cites the example of a general who “had always gotten things right because when he went into the countryside he unpinned his stars.” The best leaders understand the danger of subordinates telling them what they want to hear, and they take these kinds of steps to stay connected to the facts on the ground.
But this isn’t the whole story. If it were, stories of oblivious managers and bureaucratic catastrophes wouldn’t be so depressingly common. Presidents, generals, and CEOs have powerful incentives to avoid this trap, and yet they fall into it over and over again. There’s something deeper going on here.
One of the most fundamental problems is that the filtering process of hierarchical organizations works not only on ideas but on people. I’ve already noted how bureaucratic reporting processes tend to filter out contrarian views. Even more important is the tendency of bureaucratic organizations to filter out contrarian people as well. I’ll look at a classic example from the Vietnam era in my next post.