After the disease, the debt. After the plague, the pile of IOUs. It is a veritable mountain — a reminder that the original public debt in medieval Venice went by the name monte. According to the International Monetary Fund’s October Fiscal Monitor, the Covid-19 pandemic and associated lockdowns have prompted a plethora of fiscal measures amounting to $11.7 trillion, around 12% of global GDP — and that number has probably risen since it was calculated on Sept. 11. “In 2020,” according to the Fund, “government deficits are set to surge by an average of 9 percent of GDP, and global public debt is projected to approach 100 percent of GDP, a record high.”
In advanced economies, public debt relative to output has increased as much since 2007 as it did between 1914 and 1945. Together, the global financial crisis and the pandemic have had roughly the same doubling effect as World War II. While Covid-19 will not kill as many people globally as history’s biggest war, the ultimate U.S. death toll is very likely to exceed America’s World War II death toll. The pandemic’s financial cost, too, looks similar to that of a world war.
The IMF’s global averages conceal huge variations between countries. The deficits of seven major economies — Canada, the U.K., the U.S., Brazil, Italy, Spain and Japan — have each risen by more than 10% of GDP. In all these countries, gross public debt will exceed 100% of GDP this year, with Japan’s reaching 266%. The IMF’s projection for U.S. gross debt is 131%.
In many advanced countries, public debt will exceed 100% of GDP this year
Source: International Monetary Fund
Because the Federal Reserve has purchased most of the new Treasury bonds created this year, the increase in the federal debt held by the public is not so daunting.[1] The Congressional Budget Office projects that it will be just under 100% of GDP this year (98.2%), but that is still nearly three times what it was at the beginning of this century. Next year it is forecast to surpass the 1945 level.
But the story doesn’t end there, because the federal government is expected to keep borrowing as far as the eye can see, with the deficit rising inexorably from 4% of GDP in the late 2020s to over 12% by mid-century. The CBO’s baseline projection is for the federal debt in public hands to reach 195% of GDP by 2050, nearly twice as high as at the end of World War II.
U.S. debt in public hands could near 200% by 2050, nearly twice as high as at the end of World War II
Source: U.S. Congressional Budget Office
Historically, large public debts have had a terrible reputation. In his “Rural Rides,” which he began in 1822 and published in 1830, the English radical William Cobbett inveighed against the vast national debt incurred during the Napoleonic Wars. The political purpose of the debt, Cobbett argued, had been “to crush liberty in France and to keep down the reformers in England” but its principal effect after the war was redistributive. “A national debt, and all the taxation and gambling belonging to it, have a natural tendency to draw wealth into great masses … for the gain of a few.”
“The Debt, the blessed Debt,” he continued, was “hanging round the neck of this nation like a millstone.” It was a “vortex,” sucking money from the poor to a new plutocracy.
Cobbett was not wrong. According to the best estimates from the Bank of England, Britain’s wars between 1776 and 1815 drove up the debt/GDP ratio from 86% to more than 172% by 1822. In those days, government bonds were almost entirely owned by a tiny wealthy elite, while taxation was largely indirect, on imports and consumption, and therefore highly regressive. Moreover, real (inflation-adjusted) long-term interest rates were strongly positive, averaging 5.27% in the 1820s.
The “Blessed Debt”
In the 19th century, Britain’s war debts and high real rates enriched a new plutocracy
Source: Bank of England
The good bad news is that today’s public debts have a very different character. Ownership is more evenly distributed, as most bonds are held institutionally by insurance and pension funds and other financial institutions. Taxation everywhere is far more progressive than it was in the early 19th century, a time when income taxes were regarded as wartime expedients. And, as we have seen, a significant portion of today’s public debts are held by central banks, meaning that one part of the government owes money to another.
In an important new paper, economists Jason Furman and Lawrence Summers argue that public borrowing today offers something very like that rarity in economics: a free lunch. The key is the historically low level of nominal and real interest rates. On the one hand, low rates mean that “monetary policy cannot be relied on to stabilize the economy.” On the other hand, low rates also mean that “fiscal expansions themselves can improve fiscal sustainability by raising GDP more than they raise debt and interest payments.”
Today’s interest rates mean that debt/GDP ratios are a bad way to measure debt burdens. After all, the accumulated public debt is a stock, whereas GDP is a flow. If debt is measured relative to estimates of the present value of GDP or prospective tax receipts, then “current debt levels are at low rather than high levels.”
Furman and Summers are not saying — as the proponents of modern monetary theory do — that debt doesn’t matter and the sky is the limit. They are simply arguing that traditional ideas of a cyclically balanced budget are anachronistic, on the grounds that adhering to them “would likely lead to inadequate growth and excessive financial instability.” Fiscal policy can support growth with ongoing deficits so long as real debt service (i.e., interest payments adjusted for inflation) does not rise above 2% of GDP over the coming decade.
If governments borrow to finance investment, then, so much the better because “many public investments pay for themselves, or come close to paying for themselves, and the risk of not undertaking these investments is larger than the risk of doing too little deficit reduction.”
“Currently,” Furman and Summers conclude, “the primary worry for policy in the United States and several other countries is doing too little to expand the debt, not doing too much.” Democrats hoping for dual victories in next month’s Senate run-off elections in Georgia will read these words with tears of joy in their eyes. If they can only replace Sen. Mitch McConnell with Vice President-elect Kamala Harris as master/mistress of the Senate, they can fulfill their campaign pledges of spending up to $4 trillion, with nothing to fear from the bond vigilantes that terrorized former President Bill Clinton’s administration in its early days.
Furman and Summers are by no means the first to argue that debt-to-GDP is the wrong way to measure fiscal sustainability. In 2001’s “The Cash Nexus,” for example, I made a similar point: What really matters is keeping the real growth rate above the real debt service rate. Like Furman and Summers, I cited the pioneering work of Laurence Kotlikoff, who focuses on the present value of projected spending and revenues, as well as on the distributional effects of fiscal policy between generations.
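The arithmetic behind that point can be sketched in a few lines. A minimal simulation, using the standard debt-dynamics identity and illustrative parameters of my own choosing (not Furman and Summers’s or the CBO’s figures), shows why the gap between the real interest rate and the real growth rate matters more than the debt ratio itself:

```python
# Standard debt-dynamics identity: next year's debt/GDP ratio is
#   d' = d * (1 + r) / (1 + g) + p
# where r = real interest rate, g = real growth rate, and
# p = primary deficit as a share of GDP.

def debt_path(d0, r, g, p, years):
    """Project the debt/GDP ratio forward year by year."""
    d = d0
    path = [d]
    for _ in range(years):
        d = d * (1 + r) / (1 + g) + p
        path.append(d)
    return path

# With real rates below real growth (r < g), even a persistent
# primary deficit of 1% of GDP leaves the ratio drifting down
# from a starting point of 100% of GDP...
low_r = debt_path(d0=1.00, r=-0.005, g=0.02, p=0.01, years=30)

# ...whereas with r > g, the same deficit snowballs.
high_r = debt_path(d0=1.00, r=0.03, g=0.02, p=0.01, years=30)

print(f"r < g: debt/GDP after 30 years = {low_r[-1]:.0%}")
print(f"r > g: debt/GDP after 30 years = {high_r[-1]:.0%}")
```

Under the first set of parameters the ratio melts toward a stable level well below 100%; under the second it climbs without limit. The whole "free lunch" case rests on which regime prevails.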
And credit where credit is due. In the debates on interest rates and inflation that followed the global financial crisis, Summers was the winner. His 2014 lecture on “secular stagnation” — which argued that for a variety of structural reasons (e.g. aging populations and inequality) interest rates would remain stuck close to zero for the foreseeable future — proved prescient. Those of us who worried (as I did briefly) that Fed purchases of bonds (quantitative easing) might be inflationary were wrong. So were the Fed economists who wanted to normalize monetary policy by raising rates preemptively. (Remember how well that went two years ago?)
The problem is that for there to be a free lunch, financed by borrowing that pays for itself, secular stagnation has to continue: In other words, interest rates have to stay at their present low levels, which isn’t what the CBO expects. In its most recent long-run forecasts, nominal and real rates rise over the course of the 2020s. That means that net interest payments would rise above 2% of GDP from 2030 onward and hit 8.1% in 2050.
No More Free Lunch
Rising rates would drive up the government’s net interest payments
Source: U.S. Congressional Budget Office
True, as Furman and Summers point out, the CBO has been consistently wrong about the future path of interest rates, overshooting repeatedly since 1990. True, any forecasts beyond a ten-year time horizon are subject to great uncertainty.
Even so, my Hoover Institution colleague John Cochrane has history on his side when he expresses concern. As he argued in National Review last week, the situation is very different today from the situation in 1945, the last time the U.S. had a debt mountain this big. “By 1945, the war and its spending were over. For the next 20 years, the U.S. government posted steady small primary surpluses, not additional huge deficits. Until the 1970s, the country experienced unprecedented supply-side growth in a far less regulated economy with small and solvent social programs. … [Today] we are starting a spending binge with the same debt relative to GDP with which we ended WWII.”
In any case, as noted above, around three-quarters of this year’s deficits have been financed by Fed money creation in the form of excess bank reserves. “When the economy recovers,” Cochrane argues, “people may want to invest in better opportunities than trillions of dollars of bank deposits. The Fed will have to sell its holdings of Treasury securities to mop up the money. We will see if the once-insatiable desire for super low-rate Treasury securities is really still there. If not, the Fed will have to raise rates much faster than their current promises.”
This goes to the heart of the matter. A debt mountain doesn’t matter so long as interest rates remain low — which means monetary policy will be pivotal if market participants begin to anticipate higher inflation and start selling their holdings of Treasury bonds. The Fed has a new framework now, which states that, after 12 years of inflation mostly below its 2% target, inflation above that level is just fine, as long as it averages out around 2%. But that clearly means a prolonged period of negative real returns on government bonds, made worse for foreign investors if the dollar continues to slide against other major currencies.
If market rates start to rise, the Fed will be put to the test. Will it behave as it did in World War II, intervening to keep rates low in order to avoid a rapid rise in government debt-servicing costs? There is a widespread belief that it will and that Japanese-style “yield curve control” lies ahead. But that was a wartime expedient, and it was ended by the Fed-Treasury Accord of 1951, which restored the separation of monetary policy from debt management.
Another way of thinking about this is to contrast the likely trajectory of the post-pandemic economy with the sluggish path of recovery after 2008-9. A financial crisis originates in overstretched balance sheets — in the case of 2008-9, those of banks, shadow banks and subprime mortgage borrowers. It took the better part of a decade for those balance sheets to be repaired, which was one reason for the slow pace of recovery in the Obama years — the background against which secular stagnation seemed the right diagnosis.
The post-pandemic economy will be very different — and this is the bad good news. This year, thanks to Covid-19, the U.S. household savings rate has had its most volatile year since modern data began in 1948. In the second quarter, it jumped to an unprecedented 26%, compared to 7.3% a year before. As lockdowns and other restrictions were relaxed, the rate declined to 16% in the third quarter.
To expect such high rates to persist into 2021, as the OECD does in its latest Economic Outlook, is surely wrong. This was forced saving of income boosted by government handouts, prompted by a supply-led shock (lockdowns), not balance-sheet repair as after 2008-9. According to our estimates at Greenmantle, U.S. households are now sitting on roughly $1 trillion of excess savings as a result. Many are itching to spend a large chunk of that money as soon as they can.
The best analogy for the Covid-induced economic slump is not a normal recession but a war. With vaccine distribution in sight, society is now preparing to demobilize. As World War II wound down, many esteemed economists — notably Alvin Hansen, who coined the term “secular stagnation” — wrongly predicted an enduring economic crisis. Instead, the gradual removal of wartime restrictions led to a boom in consumption. Something similar seems in prospect next year.
The key question is how inflationary that post-pandemic boom will be. Most economists seem to agree with Furman and Summers that secular stagnation is here to stay. Charles Goodhart of the London School of Economics is one of the few to predict a “surge of inflation” as soon as next year. If he is right, the promised debt-funded free lunch could turn out to be a very expensive dinner.
While I doubt Goodhart’s prediction that inflation might rise to 5% next year, inflation can come at you fast, as my Bloomberg colleague John Authers pointed out last week. The U.S. housing market has roared back. Home equity withdrawals have soared. Bank deposits are way up and household debt-service ratios are at all-time lows. We are heading for a roaring 2021, if not the full Roaring Twenties. With a weak dollar and rising commodity prices, inflation might just give the Fed a fright.
I am not a macroeconomist; I am a mere economic historian. To me, past experience is more compelling than any model. The lesson of history is indeed that there is no correlation between debt-to-GDP ratios and long-term interest rates, just as there is no simple relationship between the size of central bank balance sheets and inflation.
But history also teaches us that debt and power are connected: A great power or empire that accumulates too high a mountain of debt and fails to keep growth ahead of debt service is destined to decline. The Bourbons, the Ottomans and the British all learned this hard lesson. So the post-pandemic debt dynamics matter not just for markets but for geopolitics.
In a new book published in online installments, “The Changing World Order,” Bridgewater Associates LP founder Ray Dalio argues that the U.S. is in the wrong stage of the classic debt cycle. “When the government runs out of money (by running a big deficit, having large debts, and not having access to adequate credit) it has limited options,” he writes in chapter 9:
It can either 1) raise taxes and cut spending a lot or 2) print a lot of money, which depreciates its value. Those governments that have the option to print money always do so because that is the much less painful path, but it leads investors to run out of the money and debt that is being printed. Those governments that can’t print money have to raise taxes and cut spending, which drives those with money to run out of the country, state, or other jurisdiction because paying more taxes and losing services is intolerable. If these entities that can’t print money have large wealth gaps among their constituents, these moves typically lead to some form of civil war/revolution. This late-cycle debt dynamic is now playing out in the United States.
Scary stuff. And, to judge by an essay written by Guo Shuqing, chair of the China Banking and Insurance Regulatory Commission and party secretary of the Chinese central bank, Dalio has influential readers in China, the inexorable rise of which is the other big theme of his book.
Still, I am old enough to remember Paul Kennedy’s argument in “The Rise and Fall of the Great Powers” that the total U.S. federal debt in 1985 (then a mere 35% of GDP) was a sign of impending American overstretch reminiscent of “France in the 1780s, where the fiscal crisis contributed to the domestic political crisis.” What followed instead was the collapse of the Soviet empire and American triumph in the Cold War, not to mention Japan’s lost decades.
Most commentators are ending the year bullish on China — the only major economy that grew this year, and forecast by the OECD to grow by 8% next year. China’s gross public debt will be just 62% of GDP this year, less than half the U.S. figure.
But it is private debt that worries Chinese officials such as Guo and Vice Premier Liu He, not public debt. Since President Xi Jinping came to power in November 2012, according to the Bank for International Settlements, credit to households has doubled as a share of GDP to 59%, while credit to non-financial corporations has jumped by 38 percentage points to 162%.
China’s “Gray Rhino” Risk
The government is more worried about private debt as a share of GDP than public debt
Source: Bank for International Settlements
Guo’s fear is that excess leverage in the property sector — which accounts for about 39% of total outstanding bank lending — is the “biggest gray rhino risk” facing China’s financial system. The enduring impact of the pandemic has created a growing problem of non-performing loans, driving smaller lenders to insolvency. Last month saw a series of defaults by state-owned companies in China. And last week S&P Global Ratings warned that local government financing vehicles could be the next casualties as the authorities clean house.
After the disease, the debt. But it’s important to look not just at public debt but also at private debt when trying to see which great power has the steeper mountain to climb.
[1] The federal debt held by the public excludes bonds held by federal trust funds such as Medicare and Social Security, but includes bonds bought by the Federal Reserve.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
In “Shuggie Bain,” Douglas Stuart’s award-winning and harrowing depiction of alcoholism, sectarianism and deprivation in post-industrial Scotland, money is always scarce and often dirty. Deserted by her second husband and unable to hold down a job, Shuggie’s mother, Agnes, relies on her twice-a-week child benefit to feed her children — or her booze habit. As the latter nearly always wins, she and Shuggie are regularly reduced to desperate expedients to fend off starvation: Extracting coins from electricity and television meters, pawning their few valuable possessions, and ultimately selling their bodies for brutal sexual favors.
Stuart vividly captures the miseries of a Glasgow of greasy coins and filthy banknotes. After one of many wretched copulations in the back of a taxi, one of Agnes’s lovers inadvertently showers her with coins from his pocket. Shuggie’s father briefly reappears at one point, handing his son two 20-pence pieces from his taxi’s change dispenser by way of a gift, grudgingly adding four 50-pence pieces when the boy looks nonplused. (“Don’t ask for mair!”) The “rag-and-bone man,” who goes from house to house buying old clothes and junk, pays “with a roll of grubby pound notes” bound by an old Band-Aid. The image is especially startling because banknotes have so rarely featured in the narrative. The only credit in this world is from rent-to-own catalogues, the Provident doorstep lender, and a few hard-pressed shopkeepers.
I grew up in middle-class, mostly sober Glasgow, but I still remember the tyranny of those damned coins: the nightmare of having too few for a bus fare or the wrong sort for a phone box. To my children, all this is as much a part of ancient lore as pirate chests of doubloons once were to me. Coins are fast fading from their lives, soon to be followed by banknotes. In some parts of the world — not only China but also Sweden — nearly all payments are now electronic. In the U.S., debit card transactions have exceeded cash transactions since 2017. Even in Latin America and parts of Africa, cash is yielding to cards and a growing number of people manage their money through their phones.
We are living through a monetary revolution so multifaceted that few of us comprehend its full extent. The technological transformation of the internet is driving this revolution. The pandemic of 2020 has accelerated it. To illustrate the extent of our confusion, consider the divergent performance of three forms of money this year: the U.S. dollar, gold and Bitcoin.
The dollar is the world’s favorite money, not only dominant in central bank reserves but in international transactions. It is a fiat currency, its supply determined by the Federal Reserve and U.S. banks. We can compute its value relative to the goods consumers buy, according to which measure it has scarcely depreciated this year (inflation is running at 1.2%), or relative to other fiat currencies. On the latter basis, according to Bloomberg’s dollar spot index, it is down 4% since Jan. 1. Gold, by contrast, is up 15% in dollar terms. But the dollar price of a bitcoin has risen 139% year-to-date.
This year’s Bitcoin rally has caught many smart people by surprise. Last week’s high was just below the peak of the last rally ($19,892 according to the exchange Coinbase) in December 2017. When Bitcoin subsequently sold off, the New York University economist Nouriel Roubini didn’t hold back. Bitcoin, he told CNBC in February 2018, had been the “biggest bubble in human history.” Its price would now “crash to zero.” Eight months later, Roubini returned to the fray in congressional testimony, denouncing Bitcoin as the “mother of all scams.” In tweets, he referred to it as “Shitcoin.”
Fast forward to November 2020, and Roubini has been forced to change his tune. Bitcoin, he conceded in an interview with Yahoo Finance, was “maybe a partial store of value, because … it cannot be so easily debased because there is at least an algorithm that decides how much the supply of bitcoin raises over time.” If I were as fond of hyperbole as he is, I would call this the biggest conversion since St. Paul.
Roubini is not the only one who has been forced to reassess Bitcoin this year. Among the big-name investors who have turned bullish are Paul Tudor Jones, Stan Druckenmiller and Bill Miller. Even Ray Dalio admitted the other day that he “might be missing something” about Bitcoin.
Financial journalists, too, are capitulating: On Tuesday, the Financial Times’s Izabella Kaminska, a long-time cryptocurrency skeptic, conceded that Bitcoin had a valid use-case as a hedge against a dystopian future “in which the world slips towards authoritarianism and civil liberties cannot be taken for granted.” She is on to something there, as we shall see.
So what is going on?
First, we should not be surprised that a pandemic has quickened the pace of monetary evolution. In the wake of the Black Death, as the historian Mark Bailey noted in his masterful 2019 Ford Lectures at Oxford, there was an increased monetization of the English economy. Prior to the ravages of bubonic plague, the feudal system had bound peasants to the land and required them to pay rent in kind, handing over a share of all produce to their lord. With chronic labor shortages came a shift toward fixed, yearly tenant rents paid in cash. In Italy, too, the economy after the 1340s became more monetized: It was no accident that the most powerful Italian family of the 15th and 16th centuries were the Medici, who made their fortune as Florentine moneychangers.
In a similar way, Covid-19 has been good for Bitcoin and for cryptocurrency generally. First, the pandemic accelerated our advance into a more digital world: What might have taken 10 years has been achieved in 10 months. People who had never before risked an online transaction were forced to try, for the simple reason that banks were closed. Second, and as a result, the pandemic significantly increased our exposure to financial surveillance as well as financial fraud. Both these trends have been good for Bitcoin.
I never subscribed to the thesis that Bitcoin would go to zero after it plunged in price in late 2017 and 2018. In the updated 2018 edition of my book, “The Ascent of Money” — the first edition of which appeared more or less simultaneously with the foundational Bitcoin paper by the pseudonymous Satoshi Nakamoto — I argued that Bitcoin had established itself as “a new store of value and investment asset — a type of ‘digital gold’ that provides investors with guaranteed scarcity and high mobility, as well as low correlation with other asset classes.”
“Satoshi’s goal,” I argued, “was not to create a new money but rather to create the ultimate safe asset, capable of protecting wealth from confiscation in jurisdictions with poor investor protection as well as from the near-universal scourge of currency depreciation … Bitcoin is portable, liquid, anonymous and scarce … A simple thought experiment would imply that $6,000 is therefore a cheap price for this new store of value.”
Two years ago, I estimated that around 17 million bitcoins had been mined. The number of millionaires in the world, according to Credit Suisse, was then 36 million, with total wealth of $128.7 trillion. “If millionaires collectively decided to hold just 1% of their wealth as Bitcoin,” I argued, “the price would be above $75,000 — higher, if adjustment is made for all the bitcoins that have been lost or hoarded. Even if the millionaires held just 0.2% of their assets as Bitcoin, the price would be around $15,000.” We passed $15,000 on Nov. 8.
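That thought experiment is easy to verify with the figures cited above — roughly 17 million coins mined and $128.7 trillion of millionaire wealth:

```python
# Back-of-the-envelope check of the millionaire thought experiment,
# using the figures cited in the text (as of 2018).
coins_mined = 17_000_000
millionaire_wealth = 128.7e12  # dollars, per Credit Suisse

def implied_price(share_of_wealth):
    """Bitcoin price if millionaires held this share of wealth in it."""
    return millionaire_wealth * share_of_wealth / coins_mined

print(f"1.0% allocation: ${implied_price(0.010):,.0f} per bitcoin")
print(f"0.2% allocation: ${implied_price(0.002):,.0f} per bitcoin")
```

A 1% allocation implies a price just above $75,000; a 0.2% allocation implies roughly $15,000 — the threshold crossed on Nov. 8.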
What is happening is that Bitcoin is gradually being adopted not so much as a means of payment but as a store of value. Not only high-net-worth individuals but also tech companies are investing. In July, Michael Saylor, the billionaire founder of MicroStrategy, directed his company to hold part of its cash reserves in alternative assets. By September, MicroStrategy’s corporate treasury had purchased bitcoins worth $425 million. Square, the San Francisco-based payments company, bought bitcoins worth $50 million last month. PayPal just announced that American users can buy, hold and sell bitcoins in their PayPal wallets.
This process of adoption has much further to run. In the words of Wences Casares, the Argentine-born tech investor who is one of Bitcoin’s most ardent advocates, “After 10 years of working well without interruption, with close to 100 million holders, adding more than 1 million new holders per month and moving more than $1 billion per day worldwide,” it has a 50% chance of hitting a price of $1 million per bitcoin in five to seven years’ time.
Whoever he is or was, Satoshi summed up how Bitcoin works: It is “a purely peer-to-peer version of electronic cash” that allows “online payments to be sent directly from one party to another without going through a financial institution.” In essence, Bitcoin is a public ledger shared by a network of computers. To pay with bitcoins, you send a signed message transferring ownership to a receiver’s public key. Transactions are grouped together and added to the ledger in blocks, and every node in the network has an entire copy of this blockchain at all times. A node can add a block to the chain (and receive a bitcoin reward) only by solving a cryptographic puzzle chosen by the Bitcoin protocol, which consumes processing power.
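For readers who want to see the “cryptographic puzzle” concretely, here is a deliberately simplified toy sketch. Real Bitcoin mining double-hashes an 80-byte block header with SHA-256 and compares the result against a 256-bit difficulty target; this illustration merely hunts for a hash with a few leading zeros, which captures the brute-force character of the search without replicating the actual protocol:

```python
import hashlib

def mine(block_data: str, difficulty: int = 4):
    """Find a nonce such that SHA-256(data + nonce) begins with
    `difficulty` hex zeros -- a toy stand-in for Bitcoin's
    proof-of-work, not the real block-header format."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce, digest
        nonce += 1

# Each additional zero of difficulty multiplies the expected work by 16,
# which is why real mining consumes so much processing power.
nonce, digest = mine("prev_hash|alice->bob:0.5")
print(f"nonce found: {nonce}, block hash: {digest[:16]}...")
```

Anyone can check the winning nonce with a single hash, but finding it requires trying candidates one by one — that asymmetry is what secures the ledger.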
Nodes that have solved the cryptographic puzzle — “miners,” in Bitspeak — are rewarded not only with transaction fees (5 bitcoins per day, on average), but also with additional bitcoins — 900 new bitcoins per day. This reward will get cut in half every four years until the total number of bitcoins reaches 21 million, after which no new bitcoins will be created.
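The 21 million cap follows directly from that halving schedule. In the protocol, the subsidy began at 50 bitcoins per block and halves every 210,000 blocks (roughly four years at one block per 10 minutes) until it falls below one satoshi, the smallest unit. Summing the geometric series — a continuous-math approximation, since the real protocol rounds each subsidy down to whole satoshis — shows the supply converging just under 21 million:

```python
# Sum the block-subsidy series: 50 BTC per block at first,
# halving every 210,000 blocks, until the subsidy would drop
# below one satoshi (1e-8 BTC, the smallest unit).
subsidy = 50.0
blocks_per_era = 210_000
total = 0.0
while subsidy >= 1e-8:
    total += subsidy * blocks_per_era
    subsidy /= 2

print(f"Asymptotic supply: {total:,.4f} BTC")
```

The series sums to a hair under 21,000,000 — the “guaranteed scarcity” that underpins the digital-gold argument.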
There are three obvious defects to Bitcoin. As a means of payment, it is slow. The Bitcoin blockchain can process only around 3,000 transactions every 10 minutes. Transaction costs are not trivial: Coinbase will charge a 1.49% commission if you want to buy one bitcoin.
There is also a significant negative externality: Bitcoin’s “proof-of-work” consensus algorithm requires specialized computer chips that consume a great deal of energy — 60 terawatt-hours of electricity a year, just under half the annual electricity consumption of Argentina. Aside from the environmental costs, one unforeseen consequence has been the increasing concentration of Bitcoin mining in a relatively few hands — many of them Chinese — wherever there is cheap energy.
But these disadvantages are outweighed by two unique features. First, as we have seen, Bitcoin offers built-in scarcity in a virtual world characterized by boundless abundance. Second, Bitcoin is sovereign. In the words of Casares, “No one can change a transaction in the Bitcoin blockchain and no one can keep the Bitcoin blockchain from accepting new transactions.” Bitcoin users can pay without going through intermediaries such as banks. They can transact without needing governments to enforce settlement.
The advantages of scarcity are obvious at a time when the supply of fiat money is exploding. Take M2, a measure of money that includes cash, bank accounts (including savings deposits) and money market mutual funds. Since May, U.S. M2 has been growing at a year-on-year rate above 20%, compared with an average of 5.9% since 1982. The future weakness of the dollar has been a favorite 2020 talking point for Wall Street economists such as Stephen Roach. You can see why. There really are a lot of dollars around, even if their velocity of circulation has slumped because of the pandemic.
The advantages of sovereignty are less obvious but may be more important. Bitcoin is not the only form of digital money that has flourished in 2020. China has been advancing rapidly in two different ways.
Nowhere in the world are mobile payments happening on as large a scale as in China, thanks to the spectacular growth of Alipay and WeChat Pay. Those electronic payment platforms now handle close to $40 trillion of transactions a year, more than double the volume of Visa and Mastercard combined, according to calculations by Ribbit Capital. The Chinese platforms are expanding rapidly abroad, partly through investments in local fintech companies by Ant Group and Tencent.
At the same time, the People’s Bank of China has accelerated the rollout of its digital currency. The potential for a digital yuan to be adopted for remittance payments or cross-border trade settlements is substantial, especially if — as seems likely — countries participating in the One Belt One Road program are encouraged to use it. Even governments that are resisting Chinese financial penetration, such as India, are essentially building their own versions of China’s electronic payments systems.
Some economists, such as my friend Ken Rogoff, welcome the demise of cash because it will make the management of monetary policy easier and organized crime harder. But it will be a fundamentally different world when all our payments are recorded, centrally stored, and scrutinized by artificial intelligence — regardless of whether it is Amazon’s Jeff Bezos or China’s Xi Jinping who can access our data.
In its early years, Bitcoin suffered reputational damage because it was adopted by criminals and used for illicit transactions. Such nefarious activity has not gone away, as a recent Justice Department report makes clear. Increasingly, however, Bitcoin has an appeal to respectable individuals and institutions who would like at least some part of their economic lives to be sheltered from the gaze of Big Brother.
It is not (as the term “cryptocurrency” misleadingly implies) that Bitcoin is beyond the reach of the law or the taxman. When the Federal Bureau of Investigation busted the online illegal goods market Silk Road in 2013, it showed how readily government agencies can trace the counterparties in suspect Bitcoin transactions. This is precisely because the blockchain is an indelible record of all Bitcoin transactions, complete with senders’ and receivers’ bitcoin addresses.
Moreover, the Internal Revenue Service is perfectly prepared to demand information on bitcoin accounts from exchanges, as Coinbase discovered in 2016. A rumor of new U.S. Treasury regulations requiring greater disclosures by exchanges caused a sharp crypto selloff over Thanksgiving. The point is simply that the financial data of law-abiding individuals is better protected by Bitcoin than by Alipay. As the Stanford political theorist Stephen Krasner pointed out more than 20 years ago, sovereignty is a relative concept.
Rather than seeking to create a Chinese-style digital dollar, Joe Biden’s nascent administration should recognize the benefits of integrating Bitcoin into the U.S. financial system — which, after all, was originally designed to be less centralized and more respectful of individual privacy than the systems of less-free societies.
Life in the East End of Glasgow in the 1980s was nasty, brutish and short of money. But all those transactions in grubby pounds and pence — genuine shitcoins — were, if nothing else, private. If Agnes Bain bought Special Brew instead of oven chips, it was a matter for her, the shopkeeper, and her long-suffering kids; the state was none the wiser. That was scant consolation to poor Shuggie. But, as we have learned again this year, a free society comes at a price that is not always payable in cash.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
Social mores change more than you think. If a time machine could take you back to 1981, you would be shocked by how much people smoked, for example, and how much time they spent standing outside telephone boxes that stank of urine, jingling coins in the pockets of their nasty flares or drainpipes, grinding their nasty yellow teeth. If you were a woman, you would be appalled by the overt sexism of male conversation. If you were not white, you would be revolted by the casual racism. And if you were gay . . . well, more about that later. Not that these prejudices have been eradicated, but they were worse then.
So let’s ask ourselves how much social mores are going to change as a result of the Covid-19 pandemic. In the past week, I’ve had several conversations on the topic of “the world after Covid-19”. My immediate response has been: “Why do you use the word ‘after’? Why not ‘with’?”
Yes, there is undoubtedly a benign scenario in which one of the more than 70 teams working on a vaccine against Sars-CoV-2 collects the prize. If all goes well, that vaccine could jump through all the scientific and regulatory hoops, go into mass production and be available some time in the second half of 2021.
In this same happy-ever-after scenario, there are also breakthroughs in Covid-19 therapies. New research confirms that the disease doesn’t do anything much to endanger the lives and health of younger people, and if they do get infected, they get lasting immunity. Summer comes to the northern hemisphere and the contagion recedes. As lockdowns are lifted and people return to their normal gregarious habits, there is no second wave of the pandemic. Far from devastating the southern hemisphere, the disease proves a minor event in Africa.
And — still looking on the bright side — stock markets rally and economies surge to a high-speed V-shaped recovery that makes me, and others who worry about a protracted depression, look silly.
All this is possible, and devoutly to be hoped for. But it is by no means a 100% certainty. Just consider the odds against a successful vaccine. Do we have one for malaria? No. Tuberculosis? No. HIV-Aids? No. (After nearly 40 years of toil, there have been just a handful of phase 3 clinical trials, one of which made the disease worse. The best had a success rate of just 30%.) How about rotavirus, the most common cause of diarrhoea among infants? Yes, they did find a vaccine for that — after 15 years.
Even if a vaccine is found, there will be multiple risks associated with the current rush to devise and deploy one. And even if there are no setbacks, it might turn out to be like influenza: you can get your flu shot each year, but there’s no guarantee you won’t get some other strain than the ones you were vaccinated against.
That’s why we need to give at least some thought to the not-so-nice scenario of living with Covid-19 — at best, the way we live with flu, which delivers its regular seasonal bump in the mortality rate; at worst, the way we have slowly and painfully learnt to live with HIV-Aids.
Which takes us back to being gay in 1981, the year the New York Native newspaper published the first article about gay men being treated in intensive care units for a strange new illness. (The headline was: “Disease rumours largely unfounded.”) It was more than a year later that the term Aids (acquired immune deficiency syndrome) was proposed for the all too real disease.
Here’s a thought experiment. Imagine a world in which Covid-19 — which still has a long way to go before it catches up with Aids as a killer — has the same effect on social life as Aids had on sexual life. That would be quite a different world, and more visibly so, since changes in sexual behaviour largely took place behind closed doors.
Imagine a world in which we routinely wear masks on public transport and in offices; a world in which we greet one another with a wave, not a hug or a handshake; a world in which grandparents see their grandchildren only on FaceTime; a world in which to cough or sneeze in public is as shameful as to fart; a world in which we rarely eat in restaurants or fly; a world without theatres and cinemas (other than a few retro drive-ins); and a world in which football is played in silent, empty stadiums. (Will there be canned cheering, just as there used to be canned laughter in sitcoms?)
I’m not the first person to notice that there are some lessons to be learnt from the last really lethal pandemic caused by a virus, despite the important differences between HIV and Sars-CoV-2, and between Aids and Covid-19. One nurse has recalled the similar ways the authorities responded — at first with complacency and then by stigmatising victims (for “the Chinese virus”, read “the gay plague”). Last month, The New York Times published an article asking “Are facemasks the new condoms?” — destined to become “ubiquitous, sometimes fashionable [and] promoted with public service announcements”.
Yet the lesson of HIV-Aids is not quite that it “changed everything”. The really striking feature of the history of the Aids pandemic is that behaviour only partly changed after the recognition of a new and deadly disease spread by sex, needle-sharing and blood transfusions. An early American report noted “rapid, profound but . . . incomplete alterations in the behaviour of both homosexual/bisexual males and intravenous drug users”, as well as “considerable instability or recidivism”. By 1998, just 19% of American adults reported some change in their sexual conduct in response to the threat of Aids.
The advent of antiretroviral drugs that stop HIV carriers succumbing to Aids has somewhat diminished the fear factor. Even so, one might have expected a bit more fear to persist. A 2017 paper showed that fewer than half of at-risk men had used a condom the last time they had sex. According to a recent British study, sustained campaigns of public and individual education are necessary to discourage gay men from having sex without condoms. In Africa, the “ABC” — abstain, be faithful and “condomise” — approach has had limited success.
Yes, there have been changes in sexual behaviour. According to the psychologist Jean Twenge, millennials have fewer sexual partners on average than earlier generations. Another American study concluded: “Promiscuity hit its modern peak for men born in the 1950s.” And let’s not forget the invaluable UK National Survey of Sexual Attitudes and Lifestyles, the most recent version of which revealed a marked decline in the frequency of sex in Britain.
Yet few if any of these changes can be attributed to HIV-Aids. The return of “No sex, please, we’re British” mainly affects married and cohabiting couples, and, according to the definitive analysis in the BMJ, is most likely due to “the introduction of the iPhone in 2007 and the global recession of 2008”.
Social mores change more than you think. In the face of a deadly disease, however, they also change less than you might expect.
Niall Ferguson is the Milbank Family senior fellow at the Hoover Institution, Stanford, and managing director of Greenmantle
One of my favourite cartoon characters from the 1970s was the little Japanese-Italian chick Calimero, whose constant, plaintive refrain was: “It’s an injustice, it is!” I have been hearing modern versions of Calimero’s lament a lot recently.
“It’s true that more men are dying than women from Covid-19 around the world,” wrote Ryan Heath and Renuka Rayasam in Politico, “but that’s not exactly cause for celebration.” Not exactly? Then there was the Atlantic journalist Annie Lowrey, who wanted to persuade us that the economic burdens of the pandemic were disproportionately falling on millennials.
Growing up in Glasgow, my friends and I liked to quote Calimero sarcastically at anyone who complained about their lot. “It’s an injustice, it is!”
Let’s get one thing straight: the principal losers in a pandemic are the people the contagious disease kills before their time. They are disproportionately old and (to a lesser extent) male.
The latest provisional figures for deaths registered in England and Wales show significant excess mortality, relative to five-year averages, in the first three weeks of last month. In the week ending April 17, for example, there were nearly 12,000 excess deaths, taking total deaths to more than double the five-year average. Deaths attributed to Covid-19 were equivalent to three-quarters (74%) of the excess; 88% of Covid-19 deaths were of people older than 65; and 58% of Covid-19 victims were men.
As we learn more about the excess deaths not attributed to Covid-19, we shall see that most were directly or indirectly attributable to the pandemic — people in care homes who probably did have the virus, or people dying of heart attacks because they were afraid to go to hospitals — so the basic story will not change: this is no virus for old men.
But what about the economic injustices of the pandemic? Hans Holbein’s Dance of Death series makes it clear that death in the era of plague was no respecter of rank. By contrast, cholera pandemics in the 19th century waged class war against an urban proletariat living cheek by jowl in filthy slums.
Covid-19 is different. It began with the relatively well-off jet set — the kind of people who fly to conferences in Singapore and then to ski chalets in the Alps. As soon as it got out of the airports, however, the virus went downmarket, spreading rapidly wherever people are tightly packed indoors — subways in big cities, for example. The mortality rate in poor areas of England is double that in rich areas, according to the Office for National Statistics. In Britain and America, the non-white population is being harder hit. Yet it has been the responses of government that have principally determined how the costs of the pandemic have been distributed.
Wherever you look, the economic data is the worst of our lifetimes. In America, about 30 million jobs have been lost in the space of just six weeks. Donald Trump’s economic adviser Kevin Hassett warned last week that the unemployment rate could reach between 16% and 20% next month, the highest since the early 1930s.
And yet the US stock market has rallied 30% since its nadir on March 23, ending last month just 14% below its pre-pandemic peak. In other words, investors think it’s as bad as . . . early October 2019, when the S&P 500 index was last at Thursday’s level.
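A back-of-the-envelope check makes the scale of that swing clear. The following sketch uses only the two figures quoted above (a 30% rally, a level 14% below the peak), not actual market data, to recover the implied peak-to-trough fall:

```python
# Implied drawdown from the figures quoted above (illustrative, not market data).
peak = 1.0
current = peak * (1 - 0.14)   # the index now sits 14% below its peak
nadir = current / 1.30        # the level it rallied 30% from on March 23
drawdown = 1 - nadir / peak   # the implied peak-to-trough fall
print(f"implied peak-to-trough fall: {drawdown:.1%}")  # → about 34%
```

In other words, the quoted numbers imply the market fell by roughly a third before staging its rebound.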
We are simultaneously a) suffering a public health disaster, with a second wave of infections and illness likely at some point when we go back to work and school; b) inflicting a deep and probably long recession on ourselves, with lockdowns that are the bluntest possible instrument for controlling contagion; and c) breaking the record for an equity market rally. How can we resolve this huge paradox?
The answer is that unconventional monetary policy is being used on an unprecedented scale with the principal aim of shoring up the prices of financial assets. For that is the principal effect of near-zero interest rates and quantitative easing, which has led to a 60% expansion of the Federal Reserve’s balance sheet.
Backstopping Wall Street is not in the Fed’s statutory mandate, admittedly, but it has been the Fed’s practice since the days of the “Greenspan put”, which established an implicit floor — but not a ceiling — for stocks. Since Ben Bernanke, the Fed has also done the job of shoring up the rest of the world’s financial assets, via international swap lines.
Under Jerome Powell, all restraint has been cast aside. If the market blinked, even at full employment, he cut rates. Now he is conducting repo operations as well as swaps with foreign central banks. The amount of swaps outstanding is now $446bn (£357bn). I can’t find data on the repos.
It doesn’t hurt that about 20% of the S&P 500 is made up of big tech companies that may ultimately make more money as a result of the pandemic, because we’re all now strongly incentivised to do more in their virtual world than in the real one (Amazon is up 24% year-to-date).
It also helps that we’re getting good news about therapies (remdesivir, for example) and vaccines (Moderna’s, for example), though I can’t help noticing that Wall Street screens out bad news about Covid-19, such as the story about people in their thirties and forties suffering strokes after contracting the disease. And, of course, we’ve flattened those curves of confirmed cases. So the stock market’s rally is not wholly illusory.
Really smart guys tell me that the second wave that I wrote about here last week is already “priced in”. Maybe. But what’s not priced in is the enduring effect the pandemic will have on demand as older consumers steer clear of shopping malls and anything else involving crowds even after lockdowns end. What’s not priced in is the political backlash as people see big companies getting bailed out — the airlines, notably — and the loans intended for small businesses also going to the big boys. What’s not priced in is the psychological depression that will follow when people in America and Britain go back to work without enough reliable testing or contact-tracing capacity to limit the size of the second wave.
At a mid-March press briefing, Trump was asked: “How are non-symptomatic professional athletes getting tests while others are waiting in line and can’t get them? Do the well-connected go to the front of the line?” The president replied: “No, I wouldn’t say so, but perhaps that’s been the story of life.”
Inequality is just “the story of life” — especially when it comes to US healthcare. Well, maybe so: as my father liked to tell his children, nobody said life was going to be fair. But that doesn’t sound like an election-winning slogan to me. Sometimes it’s possible to echo Calimero without being sarcastic: “It’s an injustice, it is!”