Why We Should Call It the “War for Terror”
Throughout history, the US has defined itself by its capacity to make war and its ability to convince itself that it was doing so in the name of democracy and progress. This is the first of a two-part series.
The numbers are in, though they are provisional and incomplete. Brown University’s research center, the Watson Institute for International and Public Affairs, has issued a study that analyzes the human cost of the US wars in the Middle East conducted under the aegis of the war on terror since 2001. They calculate the death toll at between 480,000 and 507,000, and counting.
According to the political logic inaugurated by US president George W. Bush when he vowed to avenge the first attack on US soil since Pearl Harbor, the tally of deaths cited by the study is the payback for the 3,000 Americans who died on 9/11. If we analyze those numbers to calculate the price of vengeance, each death in September 2001 has now been repaid at a rate of more than 160 to 1. On the basis of those figures alone, trading deaths for deaths, some number-crunchers — those who believed that it was all about “teaching them a lesson they’ll never forget” — would call that a pretty good return on investment.
After a little reflection, however, they may balk at the idea that the initial death toll on that fateful day 17 years ago should be called an “investment” or that the hundreds of thousands who have died since should be called a “return.”
Or do they? It may sound extreme, but that is the question no one dares to ask. It violates our traditional ideas of morality as well as elementary notions of accounting, even though it would be perfectly consistent with some of our more modern business practices. Strategic positioning, for example. It doesn’t matter what damage you do, even to yourself, if your action allows your business to establish a solid competitive advantage. If the point of the Middle East wars was to demonstrate the extent of US military power and its ability to endure long wars, Bush might say today, more justifiably than in 2003, “mission accomplished.”
Can this be the way the military strategists have been thinking all along? The Watson Institute’s study tells us, for example, that there has been “a more than 110,000 increase over the last count, issued just two years ago in August 2016.” Why would a rational manager of military operations continue such a monumental effort on such a scale for so long if there wasn’t some business sense to it? Isn’t it all about cost and payback? It may simply be that death alone is not the best metric to measure success, though it remains an essential metric to measure the impact on people, institutions, the economy and geopolitical power relationships.
War and the American Public
Some will say that it has nothing to do with the quantity of human suffering. It is about honor and respect, which must be defended whatever the price. But there are few human activities left to which such “intangible” criteria apply, and in this era of rational management those relics of an outdated aristocratic code of behavior receive short shrift in strategic planning sessions. A code of honor isn’t the same thing as a moral code, but in today’s civilization both have given way to the notion of business acumen. Even the perception of honor by the outside world has lost its value in a diplomatic and business culture that regards a show of strength as the factor that differentiates the successful competitors from the losers.
Quoting the Watson Institute’s study, Al Jazeera makes the observation that, “Though the war on terror is often overlooked by the American public, press and lawmakers, the increased body count signals that, far from diminishing, this war remains intense.” From a politician’s point of view, that amounts to a monumental achievement, highlighting a long historical trend. Lyndon Johnson and Richard Nixon dreamt of being able to intensify a war that they wished might be “overlooked” by the American public. Nixon took the first bold and effective step when he abolished the draft, replacing a citizen army with a professional army of volunteers. Instantaneously, he eliminated the deepening anguish shared by young men and their mothers, who feared being plucked away by the government to die in foreign lands for a cause that made no sense.
Removing the threat of the draft made overlooking easier for most citizens. America was preparing for the Reagan years, when unconcerned patriots could sit back and watch a trained actor describe America’s noble conflict with “the evil empire” that would take place in the stratosphere with the latest technology. With a sense of relief, war could for once appear as a fundamentally rhetorical and psychological conflict that would require no boots on the ground, sacrifice no unwilling youngster’s life and presumably be good for the economy.
In terms of business acumen, it was also the most efficient way of consolidating America’s unassailable leadership in high-powered technology. With the end of the draft, the drama of the Vietnam years was over, but not the drama of overseas military and aggressive intelligence operations, which continued discreetly, without the fireworks of Apocalypse Now or any direct impact on the lives of American families. The protests of the 1960s, the subversive hippie movement and the organized opposition to an aggressive foreign policy all vanished. Average Americans no longer felt their life and future were at risk. Young adults could, for the first time in decades, plan their professional lives without the inconvenience of two years of military service. Communism was still the enemy, but in some ways, the Vietnam fiasco had the merit of proving that war wasn’t needed to stop its expansion. There were no post-Vietnam dominos.
After the heat of a conflict in the tropics of Indochina, the Cold War could go back to being cold. It impelled Nixon and Kissinger to move in a different direction, dramatically opening a dialogue with China. This produced the unintended but beneficial effect of calming the post-colonial troubles in Southeast Asia. Killing and destruction would no longer require the services of the US military; they could be carried out by local puppets, such as General Suharto in Indonesia. The US could concentrate on undermining governments in Latin America, most spectacularly in Bolivia and Chile, without deploying troops and in a part of the world suitably far removed from the influence of Soviet Russia and China.
Jimmy Carter Plays Hamlet
Then came a new drama. In 1979, Jimmy Carter’s administration had to suffer the slings and arrows of an outrageous Iranian revolution, the delayed reaction to the 1953 coup fomented by the concerted intelligence operations of the UK and US to oust a democratic government that had nationalized the Iranian oil industry. The democratic powers of the West had imposed a quarter century of rule by the despotic Shah Mohammad Reza Pahlavi and were baffled when a fundamentalist cleric, Ruhollah Khomeini, broke the spell, galvanized the population, overturned the corrupt government and declared the United States the enemy.
After the ignominious loss of a helicopter in the desert during the failed mission to rescue 53 American hostages, Carter refrained from taking arms against what he correctly saw as a sea of Islamic troubles. Initially praised at home for his “measured response” to the hostage crisis, “in the following months, [his] restraint had begun to smell like weakness and indecision.” War was avoided; there would be no repeat of Vietnam. But Carter’s apparent pusillanimity would eventually undermine his bid for reelection, paving the way for the first Hollywood president, Ronald Reagan. Traumatized by defeat in Vietnam and humiliation in Iran, America sought the reassurance of a scripted version of foreign policy that might contain the kind of satisfying Hollywood ending that Carter was incapable of providing.
Paradoxically, at the end of the 1970s the US was undergoing serious withdrawal symptoms, deprived of occasions to reaffirm its military prowess and beset by growing doubts about its capacity to solve the world’s problems through its forceful and uncontested leadership of what had become known as the “free world.” Those doubts had been magnified by the crisis of authority brought about by the Watergate affair. When Reagan won the election in 1980, no one knew what to expect. Carter’s hesitations and the nation’s doubts set the scene for a period of experimentation and the eventual elaboration of a new type of global conflict management.
Reagan stepped into the role accompanied by a team that included — alongside former CIA director and now vice president George H.W. Bush — Dick Cheney and Donald Rumsfeld, two men who would later play important roles in the next phase of innovative war policy that would take place two decades later under George W. Bush.
An actor’s capacity to bluff on a stratospheric level, with a missile defense program appropriately called Star Wars, set the tone for the next eight years. The Reagan administration avoided major military campaigns while expanding and strengthening clandestine intelligence operations hiding under diplomatic cover. This meant that the messy boots-on-the-ground engagements of the 1950s (Korea) and 1960s (Vietnam) were off the agenda during the Reagan years. Following Henry Kissinger’s lead during the Nixon years, the State Department focused on supporting strong-arm leaders across the globe who put down rebel movements with US support on the pretext that they were led by communists, even when they weren’t.
With its new non-communist enemy, Iran, the US could apply a similar strategy. The Reagan administration egged on their puppet in Iraq, Saddam Hussein, to invest in a brutal war against Iran — a war that ended up killing half a million people. Although the Iranians sacrificed more lives than the Iraqis, who benefited from American logistical and intelligence support, the war ended in 1988 as an expensive stalemate for both countries. In what may have appeared at least locally as a bizarre twist of traditional diplomatic logic, the end of the Iran-Iraq conflict set the stage for the first operation resembling a full-scale war initiated by the US since Vietnam.
Believing the Americans would continue to support him in his effort to expand strategically to secure an Iraqi access to the Persian Gulf, Saddam Hussein invaded and occupied Kuwait. The US, under its new president, the recently departed George H.W. Bush, saw this as an opportunity to shift the game of alliances in the Middle East. From Washington’s point of view, Saddam Hussein had failed in his mission. That made him dispensable. Emboldened by the collapse of the Soviet Union, the US reassessed its position and began feeling it could dictate its will with little resistance.
With communism neutralized, it saw its new mission as that of controlling and reshaping the global economy. The conditions for military failure that marked both Korea and Vietnam had disappeared, in particular the influence of the Soviet Union on other nations, coupled with its ability to provide supplies and logistical support to “freedom fighters” opposed to local dictators and motivated by the idea of resisting American imperialism. From this comfortable position, president George H.W. Bush launched the first Gulf War in January 1991 and, after mobilizing an international coalition under the authority of the United Nations, humiliated Saddam Hussein, who capitulated within weeks.
Bush Sr. had restored the honor of the US. The glory of military victory, whose every strategic move was loyally documented, amplified and transmitted to an eager public by CNN, made it possible for Americans to believe again that the US could dominate entire regions of the world through direct military action whenever the need should arise.
Making War Great Again
The Cold War ended with the fall of the Berlin Wall in 1989, the definitive collapse of the Soviet Union in the ensuing years and the extraordinary friendship between Boris Yeltsin and Bill Clinton that confusingly turned Russia, at least temporarily, into a US ally before allowing it to drift back into the enemy many now appear to want it to be. Francis Fukuyama had already declared the end of history, positing the end of a need for wars to settle international differences. He failed to appreciate the fact that the rehabilitation process some believed to have begun after Vietnam, with the effect of weaning the US from its psychological dependence on the regular exercise of military might, could never be complete.
At least since Andrew Jackson’s presidency in the early 19th century, the US has defined itself by its capacity to make war and its ability to convince itself that it was doing so in the name of democracy and progress. The recently declassified transcripts of conversations between Bill Clinton and his friend, Boris Yeltsin, revealed just how close their relationship was and how strongly they both claimed to believe that everlasting peace between the two nations was at hand. They were surprisingly familiar and frank with each other and committed to helping the other achieve his goals. Nevertheless, in 1999 Clinton, ignoring the very personal pleas of his bosom buddy Yeltsin, and without the authority of the United Nations, launched the war against Serbia that turned Russia into an adversary, paving the way for its more recently perceived status (at least in the media) as the perennial enemy.
During the Cold War, the US defined itself and shaped its identity as the nation leading humanity’s opposition to an evil, expansionist ideology: communism. Committed to this goal in the context of the nuclear threat, the nation began mobilizing its entire economy to that end, as president Dwight D. Eisenhower acknowledged just before leaving office, when he warned of the ever-encroaching influence of the military-industrial complex. The war in Vietnam accelerated the trend, which became unstoppable.
In the early 1990s, the shift to a post-Soviet world where everyone could, as Fukuyama envisaged, share the same values, turned out to be psychologically uncomfortable for a nation so dependent on its belief in its own military might. Conversion to a peace economy, with a scaled-down defense budget, proved impossible to manage as no one dared to challenge the goose that had laid so many golden eggs. The economy had become structured around the military-industrial complex, essentially a socialistic system funded by taxpayers, in which added value increasingly resided in the development of new generations of military technology, exploited by private industry in multiple ways and, through home computing and the internet, increasingly consumed by the public.
At the dawn of what was believed to be an era of universal peace, reasonable people (such as Fukuyama) expected that with the play of free markets the centralized, socialistic side of the economy that revolved around the military would be gradually reduced to a function of basic security. Like Yeltsin, they also assumed that NATO, initially designed to confront the Soviet threat, if it didn’t disappear, would at least reduce its scope and redefine its purpose to become what Fukuyama called a kind of “league of nations according to Kant’s own precepts.” Instead, Clinton betrayed his own promises to Yeltsin and promoted a policy of NATO expansion into Eastern Europe that the Russians to this day see as a stab in the back after dutifully converting to capitalism.
No one better (or more inimitably) expressed the culture shock that the sudden lack of an ideological enemy represented for the US than aspiring presidential candidate George W. Bush in January 2000: “When I was coming up, it was a dangerous world, and you knew exactly who they were. It was us vs. them, and it was clear who them was. Today, we are not so sure who the they are, but we know they’re there.” A year later Bush would be inaugurated on the steps of the Capitol. Eight months later, the events that enabled him to identify “who them was” began a new period in which the US not only went to war, but vowed to stay at war. The ambiguity about why the nation hadn’t managed to reduce its dependence on a military economy was definitively removed from any “serious” discussion.
The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.