The Hart-Rudman report [commissioned by U.S. President Clinton in 1997 and completed in March 2001] established the nation’s vulnerability, but even it could not say when, how, or from where that vulnerability might be tested. Its conclusions, however striking, therefore fell within the realm of the hypothetical. Press coverage was minimal, and the response of the newly installed Bush administration–like that of the outgoing Clinton administration–to the commission’s preliminary findings was little more than polite thanks. That the foundations of national security were about to suffer a seismic jolt was still by no means clear.
There was yet a third reason for the surprise, though, which went beyond the concerns of Hart-Rudman: it had to do with a widespread sense in the academic and policy communities during the 1990s that the international system had become so benign that the United States no longer faced serious security threats of any kind. Paradoxically, the success of American grand strategy during the Cold War encouraged this view.
The record was indeed impressive. The United States had used military occupations to transform Germany and Japan into thriving capitalist democracies, and the Marshall Plan had secured similar results elsewhere in Europe. Over the next four decades democracy and capitalism spread much more widely, even tentatively into the Soviet Union itself. Meanwhile the world’s other great communist state, China, was pulling off a dialectical transformation that neither Marx nor Mao could ever have imagined, becoming a hotbed of capitalism, if not yet of democracy. By the time the Cold War ended, no other models for organizing human society seemed viable: Americans were remaking the world, or so it appeared, to resemble themselves. And the world, it also seemed, was not resisting.
Certain theorists concluded from this that the movement toward democracy and capitalism was irreversible, and that “history” therefore was coming to an end. It might have been an innocuous enough argument, given the care social scientists had taken in recent years to ensure that their theories bore little connection to reality; but this particular theory–associated most closely with the political scientist Francis Fukuyama–did wind up shaping the course of events. The Clinton administration drew from it the idea that if progress toward political self-determination and economic integration was assured, then the United States need only, as national security adviser Anthony Lake put it, “engage” with the rest of the world in order to “enlarge” those processes. The hegemony by consent the United States had won during the Cold War would simply become the post-Cold War international system. President Clinton himself saw little need for a grand strategy under these circumstances. Neither Roosevelt nor Truman had had one, he told a top adviser early in 1994: “they just made it up as they went along.”
There were several problems with this position, quite apart from the chief executive’s shaky knowledge of World War II and early Cold War strategy. It encouraged a tendency to view history in linear terms, and to ignore the feedback effects that can cause successes to breed failures by inducing complacency–just as failures can breed successes by shattering complacency. It sought coherence through alignment with vague processes rather than through the specification of clear objectives. It brought the Clinton team closer to the examples of Harding and Coolidge than to those of Roosevelt and Truman, for those presidents of the 1920s had also allowed an illusion of safety to produce a laissez-faire foreign and national security policy. Finally, Clinton and his advisers assumed the continued primacy of states within the international system. If you could make most of them democratic, if you could bind them together by removing restrictions on trade and investment as well as on the movement of people and ideas, then the causes of violence and the insecurity it breeds would drop away. The argument was well intentioned but shallow.
For what if the power of states themselves was diminishing? What if the very remedies the Clinton model prescribed–political self-determination and economic integration–were slowly undermining the authority of those for whom the prescription had been intended? What if the hidden history of the Cold War was one in which the great powers, under American tutelage, ultimately resolved most of their differences, only to find that their own power was no longer as great as it had once been? It doesn’t take a rocket scientist to see how this might have happened.
Self-determination certainly enhances legitimacy: that’s why democracies during the Cold War proved more durable than autocracies. But it can also expose an absence of legitimacy, which is what led to the breakup of the Soviet Union, Yugoslavia, and Czechoslovakia after the Cold War. There are now more independent states than ever before–almost 200, as compared to about 50 at the end of World War II–but that doesn’t mean that the international state system is stronger. It means just the opposite: that there are more “failed” or “derelict” states than ever before.
Integration certainly enhances prosperity: that’s why so many people benefited from the liberalization of trade and investment that took place during and after the Cold War. But the resulting global market has also constrained the ability of states to determine the conditions under which their citizens live. Marx was right in pointing out that although capitalism generates great wealth, it distributes that wealth unevenly. States used to have the capacity to cushion that process, thereby minimizing the resentment it generated: progressivism and the New Deal in the United States, social democracy in Europe, and their equivalents elsewhere provided the social safety nets that saved capitalism from the self-destruction Marx had forecast for it. Now, though, in an unregulated global economy, those nets are sagging and becoming frayed.
It’s also the case that states–even democracies–used to have some control over movements of people and exchanges of ideas. We tend to celebrate the fact that it’s more difficult to impose such restrictions in a world of cheap air travel, liberal immigration policies, fax machines, satellite television transmitters, cell phones, and the internet. But there’s also a price, which is that it’s harder than it used to be for states to monitor the activities of those individuals, gangs, and networks who are their enemies.
The bottom line, then, is that states are more peaceful these days–that’s a major accomplishment of the Cold War–but they’re also weaker than they used to be. That situation too contributed to the events of September 11th, and it’s certainly shaping the era that has followed. The most important failure of strategic vision in Washington, therefore, lay in the inability of American leaders to look beyond their Cold War victory to the circumstances that might undermine its benefits. As after World War I, they allowed the absence of visible danger to convince them that nothing invisible could pose a threat. They assumed that it was enough simply to have won the game. It did not occur to them that the arena within which the game was being played–together with the rules by which the United States, its allies, and its defeated adversaries had played it–might now be at risk.
It was not just the Twin Towers that collapsed on the morning of September 11, 2001: so too did some of our most fundamental assumptions about international, national, and personal security.
SOURCE: Surprise, Security, and the American Experience, by John Lewis Gaddis (Harvard U. Press, 2004), pp. 74-80