  Yet it would be absurd to deny that much of what has happened in the past year—to say nothing of what has been revealed about earlier events—has tended to undermine the legitimacy of the Bush administration’s policy. To put it bluntly: What went wrong? And have failures of execution fatally discredited the very notion of an American imperial strategy?

  The first seed of future troubles was the administration’s decision to treat suspected al Qa’eda personnel captured in Afghanistan and elsewhere as “unlawful enemy combatants,” beyond both American and international law. Prisoners were held incommunicado and indefinitely at Guantánamo Bay in Cuba. As the rules governing interrogation were chopped and changed, many of these prisoners were subjected to forms of mental and physical intimidation that in some cases amounted to torture.16 Indeed, Justice Department memoranda were written to rationalize the use of torture as a matter for presidential discretion in time of war. Evidently, some members of the administration felt that extreme measures were at once justified by the shadowy nature of the foe they faced, and at the same time legitimized by the public appetite for retribution after the terrorist attacks of September 11, 2001. All of this the Supreme Court rightly denounced in its stinging judgment, delivered in June of last year [2004]. As the justices put it, not even the imperatives of resisting “an assault by the forces of tyranny” could justify the use by an American president of “the tools of tyrants.” Yet power corrupts, and even small amounts of power can corrupt a very great deal. It may not have been official policy to flout the Geneva Conventions in Iraq, but not enough was done by senior officers to protect prisoners held at Abu Ghraib from gratuitous abuse—what the inquiry chaired by James Schlesinger called “freelance activities on the part of the night shift.”17 The photographic evidence of these “activities” has done more than anything else to discredit the claim of the United States and its allies to stand not merely for an abstract liberty but also for the effective rule of law.

  Second, it was more than mere exaggeration on the part of Vice President Cheney, the former CIA chief George Tenet, and, ultimately, President Bush himself—to say nothing of Prime Minister Tony Blair—to claim they knew for certain that Saddam Hussein possessed weapons of mass destruction. This was, we now know, a downright lie that went far beyond what the available intelligence indicated. What they could legitimately have said was this: “After all his evasions, we simply can’t be sure whether or not Saddam Hussein has any weapons of mass destruction. So, on the precautionary principle, we just can’t leave him in power indefinitely. Better safe than sorry.” But that was not enough for Dick Cheney, who felt compelled to make the bald assertion: “Saddam Hussein possesses weapons of mass destruction.” Bush himself had his doubts, but was reassured by Tenet that it was a “slam-dunk case.”18 Other doubters soon fell into line. Still more misleading was the administration’s allegation that Saddam was “teaming up with al Qa’eda.” Sketchy evidence of contacts between the two was used to insinuate Iraqi complicity in the 9/11 attacks, for which not a shred of proof has been found.

  Third, it was a near disaster that responsibility for the postwar occupation of Iraq was seized by the Defense Department, intoxicated as its principals became in the heat of their blitzkrieg. The State Department had spent long hours preparing a plan for the aftermath of a successful invasion. That plan was simply junked by Secretary Rumsfeld and his close advisors, who were convinced that once Saddam had gone, Iraq would magically reconstruct itself (after a period of suitably ecstatic celebration at the advent of freedom). As one official told the Financial Times last year, Undersecretary Douglas Feith led

  a group in the Pentagon who all along felt that this was going to be not just a cakewalk, it was going to be 60–90 days, a flip-over and hand-off, a lateral or whatever to … the INC [Iraqi National Congress]. The DoD [Department of Defense] could then wash its hands of the whole affair and depart quickly, smoothly and swiftly. And there would be a democratic Iraq that was amenable to our wishes and desires left in its wake. And that’s all there was to it.19

  When General Eric Shinseki, the army chief of staff, stated in late February 2003 that “something on the order of several hundred thousand soldiers” would be required to stabilize postwar Iraq, he was brusquely put down by Deputy Secretary Wolfowitz as “wildly off the mark.” Wolfowitz professed himself “reasonably certain” that the Iraqi people would “greet us as liberators.” Such illusions were not, it should be remembered, confined to neoconservatives in the Pentagon. Even General Tommy Franks was under the impression that it would be possible to reduce troop levels to just 50,000 after eighteen months. It was left to Colin Powell to point out to the president that “regime change” had serious—not to say imperial—implications. The “Pottery Barn rule,” he suggested to Bush, was bound to be applicable to Iraq: “You break it, you own it.”20

  Fourth: American diplomacy in 2003 was like the two-headed pushmi-pullyu in Doctor Dolittle—it faced in opposite directions. On one side was Cheney, dismissing the United Nations as a negligible factor. On the other was Powell, insisting that any action would require some form of UN authorization to be legitimate. It is possible that one of these approaches might have worked. It was, however, a mistake to try both at once. Europe was in fact coming around as a consequence of some fairly successful diplomatic browbeating. No fewer than eighteen European governments signed letters expressing support for the impending war against Saddam. Yet the decision to seek a second UN resolution—on the grounds that the language of Resolution 1441 was not strong enough to justify all-out war—was a blunder that allowed the French government, by virtue of its permanent seat on the UN Security Council, to regain the diplomatic initiative. Despite the fact that more than forty countries declared their support for the invasion of Iraq and three (Britain, Australia, and Poland) sent significant numbers of troops, the threat of a French veto, delivered with a Gallic flourish, created the indelible impression that the United States was acting unilaterally—perhaps even illegally.21

  All of these mistakes had one thing in common. They sprang from a failure to learn from history. For among the most obvious lessons of history is that an empire cannot rule by coercion alone. It needs above all legitimacy—in the eyes of the subject people, in the eyes of the other great powers and, not least, in the eyes of the people back home. Did those concerned know no history? We are told that President Bush was reading Edmund Morris’s Theodore Rex as the war in Iraq was being planned; presumably he had not reached the part where the American occupation of the Philippines sparked off an insurrection. Before the invasion of Iraq, Deputy National Security Advisor Stephen Hadley was heard to refer to a purely unilateral American invasion as “the imperial option.” Did no one else grasp that occupying and trying to transform Iraq (with or without allies) was a quintessentially imperial undertaking—and one that would not only cost money but would also take many years to succeed?

  Had policy makers troubled to consider what befell the last Anglophone occupation of Iraq, they might have been less surprised by the persistent resistance they encountered in certain parts of the country during 2004. For in May 1920 there was a major anti-British revolt there. This happened six months after a referendum (in practice, a round of consultation with tribal leaders) on the country’s future, and just after the announcement that Iraq would become a League of Nations “mandate” under British trusteeship rather than continue under colonial rule. Strikingly, neither consultation with Iraqis nor the promise of internationalization sufficed to avert an uprising.

  In 1920, as in 2004, the insurrection had religious origins and leaders, but it soon transcended the country’s ancient ethnic and sectarian divisions. The first anti-British demonstrations were in the mosques of Baghdad, but the violence quickly spread to the Shiite holy city of Karbala, where British rule was denounced by Ayatollah Muhammad Taqi al-Shirazi, the historical counterpart of today’s Shiite firebrand, Moktada al-Sadr. At its height, the revolt stretched as far north as the Kurdish city of Kirkuk and as far south as Samawah. Then, as in 2004, much of the violence was more symbolic than strategically significant—British bodies were mutilated, much as American bodies were at Fallujah. Still, there was a real threat to the British position. The rebels systematically sought to disrupt the occupiers’ infrastructure, attacking railways and telegraph lines. In some places, British troops and civilians were cut off and besieged. By August 1920 the situation in Iraq was so desperate that the general in charge appealed to London not only for reinforcements but also for chemical weapons (mustard gas bombs or shells), though, contrary to historical legend, these turned out to be unavailable and so were never used.22

  This brings us to the second lesson the United States might have learned from the British experience. Reestablishing order is no easy task. In 1920 the British eventually ended the rebellion through a combination of aerial bombardment and punitive village-burning expeditions. Even Winston Churchill, then the minister responsible for the Royal Air Force, was shocked by the actions of some trigger-happy pilots and vengeful ground troops. And despite their overwhelming technological superiority, British forces still suffered more than 2,000 dead and wounded. Moreover, the British had to keep troops in Iraq long after the country was granted “full sovereignty.” Although Iraq was declared formally independent in 1932, British troops remained there until the 1950s (see chapter six).

  Is history repeating itself? For all of the talk in 2004 of restoring “full sovereignty” to an interim Iraqi government, President Bush made it clear that he intended to “maintain our troop level … as long as necessary” and that U.S. troops would continue to operate “under American command.” This in itself implied something significantly less than full sovereignty. For if the new interim Iraqi government did not have control over a well-armed foreign army in its own territory, then it lacked one of the defining characteristics of a sovereign state: a monopoly over the legitimate use of violence. That was precisely the point made in April of 2004 by Marc Grossman, undersecretary of state for political affairs, during congressional hearings on the future of Iraq. In Grossman’s words: “The arrangement would be, I think as we are doing today, that we would do our very best to consult with that interim government and take their views into account.” But American commanders would still “have the right, and the power, and the obligation” to decide on the appropriate role for their troops.23

  There is, in principle, nothing inherently wrong with “limited sovereignty”; in both West Germany and Japan, as chapter two shows, sovereignty was limited for some years after 1945. Sovereignty is not an absolute but a relative concept. Indeed, it is a common characteristic of empires that they consist of multiple tiers of sovereignty. In what Charles Maier has called the “fractal geometry of empire,” the imperial hierarchy of power contains within it multiple scaled-down versions of itself, none fully sovereign. Again, however, there is a need for American policy makers and voters to understand the imperial business they are now in. For this business can have costly overheads.

  The problem is that for indirect rule—or “limited sovereignty”—to be successful in Iraq, Americans must be willing to foot a substantial bill for the occupation and reconstruction of the country. Unfortunately, in the absence of a radical change in the direction of U.S. fiscal policy, their ability to do so is set to diminish, if not to disappear—the bottom line of chapter eight.

  Since President Bush’s election in 2000, total federal outlays have risen by an estimated $530 billion, an increase of nearly a third. This increase can only be partly attributed to the wars the administration has fought; higher defense expenditures account for just 30 percent of the total increment, whereas increased spending on health care accounts for 17 percent, that on Social Security and that on income security for 16 percent apiece, and that on Medicare for 14 percent.24 The reality is that the Bush administration has increased spending on welfare by rather more than spending on warfare. Meanwhile, even as expenditure has risen, there has been a steep reduction in the federal government’s revenues, which slumped from 21 percent of gross domestic product in 2000 to less than 16 percent in 2004.25 The recession of 2001 played only a minor role in creating this shortfall of receipts. More important were the three successive tax cuts enacted by the administration with the support of the Republican-led Congress, beginning with the initial $1.35 trillion tax cut over ten years and the $38 billion tax rebate of the Economic Growth and Tax Relief Reconciliation Act in 2001, continuing with the Job Creation and Worker Assistance Act in 2002, and concluding with the reform of the double taxation of dividend income in 2003. With a combined value of $188 billion—equivalent to around 2 percent of the 2003 national income—these tax cuts were significantly larger than those passed in Ronald Reagan’s Economic Recovery Tax Act of 1981.26 The effect of this combination of increased spending and reduced revenue has been a dramatic growth in the federal deficit. President Bush inherited a surplus of around $236 billion from the fiscal year 2000. At the time of writing, the projected deficit for 2004 was $413 billion, representing a swing from the black into the red of two-thirds of a trillion dollars.27
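
  To spell out the arithmetic behind that swing (a back-of-the-envelope restatement using only the two figures just cited):

\[
\underbrace{\$236\ \text{billion}}_{\text{FY 2000 surplus}} \;+\; \underbrace{\$413\ \text{billion}}_{\text{projected FY 2004 deficit}} \;=\; \$649\ \text{billion} \;\approx\; \tfrac{2}{3}\ \text{trillion dollars.}
\]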

  Government spokesmen have sometimes defended this borrowing spree as a stimulus to economic activity. There are good reasons to be skeptical about this, however, not least because the principal beneficiaries of these tax cuts have, notoriously, been the very wealthy. (Vice President Cheney belied the macroeconomic argument when he justified the third tax cut in the following candid terms: “We won the midterms. This is our due.”28) Another Cheney aphorism that is bound to be quoted by future historians was his assertion that “Reagan proved deficits don’t matter.”29 But Reagan did nothing of the kind. The need to raise taxes to bring the deficit back under control was one of the key factors in George H. W. Bush’s defeat in 1992; in turn, the systematic reduction of the deficit under Bill Clinton was one of the reasons long-term interest rates declined and the economy boomed in the later 1990s. The only reason that, under Bush junior, deficits have not seemed to matter is the persistence of low interest rates over the past four years, which has allowed Bush—in common with many American households—to borrow more while paying less in debt service. Net interest payments on the federal debt amounted to just 1.4 percent of GDP last year, whereas the figure was 2.3 percent in 2000 and 3.2 percent in 1995.30

  Yet this persistence of low long-term rates is not a result of ingenuity on the part of the U.S. Treasury. It is in part a consequence of the willingness of the Asian central banks to buy vast quantities of dollar-denominated securities such as ten-year Treasury bonds, with the primary motivation of keeping their currencies pegged to the dollar, and the secondary consequence of funding the Bush deficits.31 It is no coincidence that just under half the publicly held federal debt is now in foreign hands, more than double the proportion ten years ago.32 Not since the days of tsarist Russia has a great empire relied so heavily on lending from abroad. The trouble is that these flows of foreign capital into the United States cannot be relied on to last indefinitely, especially if there is a likelihood of rising deficits in the future. And that is why the Bush administration’s failure to address the fundamental question of fiscal reform is so important. The reality is that the official figures for both the deficit and the accumulated federal debt understate the magnitude of the country’s impending fiscal problems because they leave out of account the huge and unfunded liabilities of the Medicare and Social Security systems.33 The United States derives a significant benefit from the status of the dollar as the world’s principal reserve currency; it is one reason why foreign investors are prepared to hold such large volumes of dollar-denominated assets. But reserve-currency status is not divinely ordained. It could be undermined if international markets took fright at the magnitude of America’s still latent fiscal crisis.34 A decline in the dollar would certainly hurt foreign holders of U.S. currency more than it would hurt Americans. But a shift in international expectations about U.S. finances might also bring about a sharp increase in long-term interest rates, which would have immediate and negative feedback effects on the federal deficit by pushing up the cost of debt service.35 It would also hurt highly geared American households, especially the rising proportion of them with adjustable-rate mortgages.36

  Empires need not be a burden on the taxpayers of the metropolis; indeed, many empires have arisen precisely in order to shift tax burdens from the center to the periphery. Yet there is little sign that the United States will be able to achieve even a modest amount of “burden sharing” in the foreseeable future. During the Cold War, American allies contributed at least some money and considerable manpower to the maintenance of the West’s collective security. But those days are gone. At the Democratic Party convention in Boston in July 2004, and again in the presidential debate on foreign policy two months later, John Kerry pledged to “bring our allies to our side and share the burden, reduce the cost to American taxpayers, and reduce the risk to American soldiers,” in order to “get the job done and bring our troops home.” “We don’t have to go it alone in the world,” he declared. “And we need to rebuild our alliances.”37 Yet it is far from clear that any American president would be able to persuade Europeans today to commit additional troops to Iraq, or even to subsidize the American presence there. In accepting his party’s nomination, Kerry recalled how, as a boy, he watched “British, French, and American troops” working together in postwar Berlin. In those days, however, there was a much bigger incentive, symbolized by the Red Army units that surrounded West Berlin, for European states to support American foreign policy. It is not that the French or the Germans (or for that matter the British) were passionately pro-American during the Cold War; on the contrary, U.S. diplomats constantly fretted about anti-Americanism in Europe, on both the left and the right. Nevertheless, as long as there was a Soviet Union to the East, there was one overwhelming argument for the unity of “the West.” That ceased to be the case fifteen years ago, when the reforms of Mikhail Gorbachev caused the Russian empire to crumble. And ever since then, the incentives for transatlantic harmony have grown steadily weaker. For whatever reason, Europeans do not regard the threat posed by Islamist terrorism as sufficiently serious to justify unconditional solidarity with the United States. On the contrary, since the Spanish general election, they have acted as if the optimal response to the growing threat of Islamist terrorism is to distance themselves from the United States. An astonishingly large number of Europeans see the United States as itself a threat to international stability. In a recent Gallup poll, 61 percent of Europeans said they thought the European Union plays “a positive role with regard to peace in the world”; just 8 percent said its role was negative. No fewer than 50 percent of those polled took the view that the United States now plays a negative role.38