The end-of-month figures tell the story not just of this particular moment of history but reveal much about the entire epoch we are living in. On the last day of April, The New York Times ran a headline that fittingly summarizes the central paradox of living in a financialized world: “U.S. Stocks Have Their Best Month Since 1987.” The subheading read: “The news is terrible, but Wall Street had its best month in decades.”
The Times could call the news terrible, even as markets celebrated, because in that same month 30 million people in the US filed for unemployment benefits. Was this a recession, as many feared, or even the beginning of a second Great Depression? The buoyancy of the stock market has analysts and investors hesitating: with so little asset deflation, can this be called a depression at all?
“The rally, even in the face of crushing economic data, highlights investors’ confidence that things will return to normal sooner than they thought when stocks were collapsing in late February and early March,” The Times reports.
Here is today’s 3D definition of “normal”:

An imaginary state of equilibrium believed to have existed in an identifiable past, one that only seems like equilibrium to those who comfortably survived its severest consequences
In times of radical uncertainty, societies tend to invent a fictional universe they call “normal.” It can even take on the status of a mass hallucination. This type of psychological reaction is certainly predictable, but the idea it conveys tends toward the absurd. Situations of radical uncertainty can evolve in one of two directions: either toward ever-increasing chaos or — thanks to well-planned adjustments — a new state of equilibrium that will always be, in some significant way, qualitatively different from the normal everyone imagines.
The first in a succession of stimulus bills passed by the US Congress was inspired by the idea that the return to normal was only two or three months away following a lockdown due to the coronavirus pandemic. That timescale encouraged the belief on the part of lawmakers and economic thinkers that their job was to implement emergency measures permitting businesses and people to prepare for a rapid return to all their former habits and rituals. Today, the reasons to consider that assumption as totally unrealistic are rapidly multiplying.
The first contradiction concerns the US economy itself. For Wall Street, the idea of “too big to fail” has become a dogma, validated in 2009 and again by the first coronavirus stimulus plan in March 2020, which focused on supporting big corporations. For lawmakers, the logic was simple. Allowing an airline, for example, to fail would mean permanent unemployment for tens of thousands of people, including those who work upstream and downstream of it. It would also hit the stock market in an election year, something President Donald Trump and the Republicans couldn’t possibly allow. More than one isolated failure of a major corporation would in all probability lead to a definitive crash of the stock market. The priority was clearly the Dow Jones.
Another key factor that the economists who created and managed the consumer society appear utterly incapable of appreciating is the psychology of consumers. For the past century, the whole game has turned on the strategy of branding and pricing. Consumers could be counted on to behave in predictable ways. The brand or product image managed their desires, and the price created the possibility of generating waves of consumption.
That was then. This is now. The traumatic experience of the coronavirus shutdown and its lingering consequences over the next year or two will subtly, or possibly even boldly, remodel the idea people have of themselves as consumers. Unless all traces of the current pandemic miraculously disappear by September of this year, people will no longer think the same way about their roles as producers and consumers. The debate has already begun on social media and, to some extent, even in that pocket of resistance we call the corporate media. It will accelerate as the accepted ideas about how society should function undergo a massive rethink.
Then there is the world of politics. It’s difficult to imagine anything quite as sclerotic as what passes for political thinking today. Nothing is likely to change on the initiative of politicians alone. At the same time, the sudden instability of all our institutions has produced a potentially explosive situation, given the sometimes desperate political reflexes on display in the US, especially in this presidential election year. The situation isn’t much better in Europe, which has been living through a serious identity crisis at least since the Brexit referendum in 2016.
Finally, once some kind of relative calm returns as the pandemic cools down (whenever that may be), the world will remain acutely aware of the drama of climate change and the need to address this massive issue. The collapse of oil prices has contributed to that awareness. Politicians and economists may want to continue promoting the fiction that these phenomena are unrelated, but everyone can see the connection.
After the financial crisis of 2007-08, enterprising governments found various fixes to create the illusion of a return to normal. It turned out to be a clever instance of politically managed hyperreality, built on a dynamic that could only be described as the opposite of equilibrium. Accelerating the trends that had been operative since the Reagan-Thatcher orthodoxy that dominated the 1980s, the course toward ever-increasing economic inequality came to be seen as definitively immunized against risk.
The 2009 financial bailout taught the markets that assets would be supported by artificial means even if everything else in the economy became unstructured. The perception of what is normal came to be defined by the idea and practice of quantitative easing. Whatever didn’t work at the macro level would be supported by hitherto nonexistent resources, just to keep the machine going. The macro level — dominated by financial institutions, hedge funds and monopolistic enterprises — had less and less to do with the real economy that nevertheless depended on its health.
And the trend kept accelerating. All of this was possible because, at the end of the day, everything could be decided in Washington, the global capital of quantitative easing. Other central banks followed the trend, but the partnership between Wall Street and Washington pulled all the strings.
That appears to be changing. In the age of cryptocurrency, the very status of the dollar as the universal reserve currency is already being challenged. As Oscar Jonsson, writing for Forbes in 2017, observed: “When trust is guaranteed by a protocol instead of financial institutions, mostly based in the West, the capability of the West to leverage economic power is reduced, which has been a key component of its grand strategy since the Second World War.”
The inevitable but still unforeseeable economic consequences of the pandemic will accelerate a trend linked to the ever-weakening geopolitical authority of the United States. A world in which the trust that underlies economic transactions is distributed differently will no longer resemble the “normal” of the past. Cryptocurrencies, precisely because they are not controlled by governments, promise a new “new world order” — different from the one George H.W. Bush imagined after the fall of the Soviet Union in 1991. No one can predict how that will play out, and it is only one factor in a panoply of historical forces tending toward radical transformations.
The lesson of the whole story is that no one can predict anything anymore, especially what the “new normal” may look like or even whether there will be one. We can be certain only that the current pandemic will, in the short term, slightly reduce the world’s population. But it has already changed the mood and the level of understanding of citizens across the globe. And that trend is likely to continue.
*[In the age of Oscar Wilde and Mark Twain, another American wit, the journalist Ambrose Bierce, produced a series of satirical definitions of commonly used terms, throwing light on their hidden meanings in real discourse. Bierce eventually collected and published them as a book, The Devil’s Dictionary, in 1911. We have shamelessly appropriated his title in the interest of continuing his wholesome pedagogical effort to enlighten generations of readers of the news.]*
The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.