What is the money of the future? My nine-year-old son thinks it will be Robux. For those of you trapped in the human museum known as adulthood, Robux is the currency used by players of Roblox computer games. If I offer Thomas grimy dollar bills for household chores, he shows an almost complete lack of interest and motivation. But if I offer him Robux, it’s a different story.
The current exchange rate is around 80 to the dollar. So, in order to incentivize my son to do the dishes, I need to go online and buy 2,000 Robux for $24.99. This I do by entering my credit card details on a website, an act of self-exposure that never fails to make me feel sick. However, the dishes get cleaned and, later, my son blows some of his Robux on a cool new outfit and a pair of wings for his avatar, earning the admiration of his friends.
Robux is just one of the new forms of money that exist in the parallel world of online gaming. If your kids play Fortnite, then you’ve probably had to buy them V-Bucks (short for VinderBucks). And gamer money is, in turn, just a subset of the myriad means of payment that now exist on the internet.
Writers of science fiction got many things right about the future, from pandemics to flying cars to artificial intelligence. None, so far as I know, got the future of money exactly right. In William Gibson’s seminal Neuromancer (1984), paper money (the “new yen” or N¥) has survived but is used only for illicit transactions. In Neal Stephenson’s Snow Crash (1992), hyperinflation has ravaged the value of the dollar so much that, in Compton, California, “Street people push … wheelbarrows piled high with dripping clots of million- and billion-dollar bills that they have raked up out of storm sewers.” A trillion-dollar bill is known colloquially as an “Ed Meese.” A quadrillion is a “Gipper.” (Only we Boomers now get the allusions to the former attorney general and the president he served in the 1980s.) In other dystopian futures, readily available commodities such as bullets or bottle caps serve as makeshift money, rather like cigarettes in occupied Germany in the immediate aftermath of World War II. My favorite imagined currency is the “merits” in the British TV show Black Mirror, which have to be earned by pedaling on exercise bikes.
If some other author predicted the future of money accurately, I missed it. Unfortunately, this lack of foresight now seems also to afflict U.S. policymakers, leaving the world’s financial hegemon vulnerable to a potentially fatal challenge. Not only are the American monetary authorities underestimating the threat posed to dollar dominance by China’s pioneering combination of digital currency and electronic payments. They are also treating the blockchain-based financial innovations that offer the best alternative to China’s e-yuan like gatecrashers at their own exclusive party.
Let’s begin with the future of money that no one foresaw.
In 2008, in a wonkish paper that bore no relation to any sci-fi, the enigmatic Satoshi Nakamoto launched Bitcoin, “a purely peer-to-peer version of electronic cash” that allows “online payments to be sent directly from one party to another without going through a financial institution.” In essence, Bitcoin is a public ledger shared by an acephalous (leaderless) network of computers. To pay with bitcoins, you send a signed message transferring ownership to a receiver’s public key. Transactions are grouped together and added to the ledger in blocks, and every node in the network has an entire copy of this blockchain at all times. A node can add a block to the chain (and receive a bitcoin reward) only by solving a cryptographic puzzle chosen by the Bitcoin protocol, which consumes processing power.
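The puzzle-solving mechanism described above can be sketched in a few lines of Python. This is a toy, not the real protocol — Bitcoin double-hashes an 80-byte block header with SHA-256 and compares the result against a dynamically adjusted numeric target, and the function name and transaction string here are invented for illustration — but it shows why solving the puzzle consumes processing power while checking a solution is nearly free:

```python
import hashlib

def mine(block_data: str, difficulty: int = 4) -> int:
    """Brute-force a nonce whose SHA-256 hash starts with `difficulty`
    zero hex digits. A simplified stand-in for Bitcoin's proof of work."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Finding the nonce takes (on average) 16**4 = 65,536 hash attempts...
nonce = mine("alice pays bob 1 BTC", difficulty=4)
# ...but verifying a claimed solution takes exactly one hash.
assert hashlib.sha256(
    f"alice pays bob 1 BTC{nonce}".encode()
).hexdigest().startswith("0000")
```

That asymmetry — expensive to find, trivial to verify — is what lets every node in the network check each new block without redoing the miner’s work.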
Nodes that have solved the cryptographic puzzle — “miners” — are rewarded not only with transaction fees, but also with newly created bitcoins. This reward gets cut in half every 210,000 blocks (roughly every four years) until the total number of bitcoins reaches 21 million, after which no new bitcoins will be created. As I argued here last November, there were good reasons why Bitcoin left gold for dead as the pandemic was wreaking havoc last year. A little over a year ago, when just about every financial asset sold off as the full magnitude of the pandemic sank in, the dollar price of a bitcoin fell to $3,858. As I write, the price is $58,746.
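The 21 million cap is not a decree but an arithmetic consequence of the halving schedule: a subsidy of 50 bitcoins per block, halved every 210,000 blocks, is a geometric series that sums to just under 21 million. A minimal sketch (the constants are the protocol’s; the function name is my own):

```python
BLOCKS_PER_ERA = 210_000          # blocks between halvings
INITIAL_SUBSIDY = 50 * 100_000_000  # 50 BTC, in satoshis (1 BTC = 1e8 satoshis)

def total_supply() -> float:
    """Sum the block rewards across all halving eras, in BTC."""
    total = 0
    subsidy = INITIAL_SUBSIDY
    while subsidy > 0:
        total += BLOCKS_PER_ERA * subsidy
        subsidy //= 2  # halving: integer division, as in the protocol
    return total / 100_000_000

print(total_supply())  # 20999999.9769 — just under 21 million
```

The series 50 + 25 + 12.5 + … converges to 100 BTC per block-slot, and 210,000 × 100 = 21 million; integer rounding in satoshis leaves the actual maximum a sliver below that.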
The reasons for Bitcoin’s success are that it is sovereign (no one controls it, not the “whales” who own a lot, and not the miners who mine a lot), scarce (that 21 million number is final), and — above all — smart. With every day that the system works — not being hacked, not crashing — the predictions that it would prove to be a “shitcoin” look dumber, and the pressure on people to affirm their smartness by owning bitcoins grows stronger. Last year, a bunch of tech companies, including Square, PayPal and Tesla, bought a pile. Several legendary investors — Paul Tudor Jones, Stan Druckenmiller, Bill Miller — came out as long Bitcoin. Perhaps most importantly, Bitcoin began to be treated like a legitimate part of the financial system. BNY Mellon now handles Bitcoin. So does Mastercard. There are now well-functioning Bitcoin futures and options markets. This kind of adoption and integration is what has driven the price upward — a process that has much further to run. My $75,000 target price back in 2018 (assuming that every millionaire would one day want 1% of his or her portfolio in XBT) now looks a bit conservative.
Meanwhile, as Bitcoin has grown more respectable, the cool kids have moved on to decentralized finance (“DeFi”), “an open, permissionless, and highly interoperable protocol stack built on public smart contract platforms” such as the Ethereum blockchain, to quote a recent and excellent St. Louis Fed paper by Fabian Schär. Like Bitcoin, DeFi has no centralized third-party system of verification and regulation. But it is a much looser, more variegated system, with multiple coins, tokens, exchanges, debt markets, derivatives and asset management protocols. As Schär puts it:
This architecture can create an immutable and highly interoperable financial system with unprecedented transparency, equal access rights, and little need for custodians, central clearing houses, or escrow services, as most of these roles can be assumed by ‘smart contracts.’ … Atomic swaps, autonomous liquidity pools, decentralized stablecoins, and flash loans are just a few of many examples that show the great potential of this ecosystem. … DeFi may lead to a paradigm shift in the financial industry and potentially contribute toward a more robust, open, and transparent financial infrastructure.
(I told you it was cool.)
For the true believers, Bitcoin and DeFi are the first steps toward a libertarian Nirvana. In a widely quoted tweet, crypto guru Naval Ravikant added steps three to seven:
Bitcoin is an exit from the Fed.
DeFi is an exit from Wall Street.
Social media is an exit from mass media.
Homeschooling is an exit from industrial education.
Remote work is an exit from 9-5.
Creator economy is an exit from employment.
Individuals are leaving institutions.
We are on our way, according to Pier Kicks, to the “Metaverse” — a “self-sovereign financial system, an open creator economy, and a universal digital representation and ownership layer via NFTs (non-fungible tokens).” Yes, even art is now on the blockchain: Witness the sale by Christie’s last month of “Everydays: the First 5000 Days,” by Mike Winkelmann, aka Beeple, for $69.3 million.
What is the right historical analogy for all this? Allen Farrington argues that Bitcoin is to the system of fiat currencies centered around the dollar what medieval Venice once was to the remnants of the western Roman Empire, as superior an economic operating system as commercial capitalism was to feudalism. Another possibility is that the advent of blockchain-based finance is as revolutionary as that of fractional reserve banking, bond and stock markets in the great Anglo-Dutch financial revolution of the late 17th and early 18th centuries.
Like all such revolutions, however, this one, too, has produced its haters. Well-known economists such as Nouriel Roubini continue to predict Bitcoin’s demise. Bridgewater founder Ray Dalio has warned that, just as the U.S. government prohibited the private ownership of gold by executive order in April 1933, so the same fate could befall Bitcoin. Perhaps most ominously, the central bankers of the western world remain sniffy. A new line of attack (highly appealing to monetary officials eager to affirm their greenness) is that the electricity consumed by Bitcoin miners makes crypto dirty money.
Are we therefore heading for a collision between the old money and the new? Perhaps. As we approach the end of the first 100 days of Joe Biden’s presidency, I am tempted to paraphrase his former boss’s jab at Mitt Romney back in 2012: “The twentieth century is calling to ask for its economic policy back.” There is something very old-school about the Biden administration.
It believes in Keynesian demand management and stimulus. It is proposing a massive infrastructure investment plan. The result is that fiscal and monetary expansion triggered by a public health emergency seems set to continue beyond the duration of the emergency. The administration’s economists tell us there is nothing to fear from inflation. Meanwhile, in foreign policy, Team Biden seems committed to Cold War II against China. All of this hinges on the enduring credibility of the U.S. dollar as the preeminent international reserve currency and U.S. Treasury bonds as the safest of all financial assets — not to mention the enduring effectiveness of financial sanctions as the ultimate economic weapon. Yet precisely these things are threatened by the rise of an alternative financial system that essentially bypasses the Federal Reserve and potentially also the U.S. Treasury.
So you can see why Ray Dalio might expect the U.S. government at some point to outlaw Bitcoin and other cryptocurrencies. The last administration occasionally muttered threats. “Cryptocurrency … provides bad actors and rogue nation states with the means to earn profits,” stated the report of Attorney General William Barr’s Cyber-Digital Task Force last year. Treasury Secretary Steven Mnuchin considered forcing U.S. exchanges to gather more information about individuals withdrawing their Bitcoin. Pro-Bitcoin politicians, such as Miami Mayor Francis Suarez, are still in a minority.
Abroad, too, there are plenty of examples of governments moving to limit cryptocurrencies or ban them altogether. “We must do everything possible to make sure the currency monopoly remains in the hands of states,” declared German Finance Minister Olaf Scholz at a G-7 finance ministers meeting in December. The European Commission shows every sign of regulating the fledgling sector with its customary zeal. In particular, the European Central Bank has stablecoins (crypto tokens pegged to fiat currencies) in its sights. China is even more stringent. In 2017, the Chinese Communist Party restricted the ability of its citizens to buy Bitcoin, though Bitcoin mining continues to thrive close to sources of cheap hydroelectricity like the Three Gorges Dam.
But is it actually true that the state should have a monopoly on money? That is a distinctly German notion, stated most explicitly in Georg Friedrich Knapp’s State Theory of Money (1905). History begs to differ. Although states have sometimes sought to monopolize money creation, and although a state monopoly on the enforcement of debt contracts is preferable, a monopoly on money is far from natural or even necessary. For most of history, states have been satisfied with determining what is legal tender — that is, what can be used to discharge contractual obligations, including tax payments. This power to specify legal tender drove the great monetization of economy and society in Ming China and in Europe after the Black Death.
Money, it is conventional to argue, is a medium of exchange, which has the advantage of eliminating inefficiencies of barter; a unit of account, which facilitates valuation and calculation; and a store of value, which allows economic transactions to be conducted over long time periods as well as geographical distances. To perform all these functions optimally, the ideal form of money has to be available, affordable, durable, fungible, portable and reliable. Because they fulfill most of these criteria, metals such as gold, silver and bronze were for millennia regarded as the ideal monetary raw material. Rulers liked to stamp coins with images (often crowned heads) that advertised their authority. But in ancient Mesopotamia, beginning around five thousand years ago, people used clay tokens to record transactions involving agricultural produce like barley or wool, or metals such as silver. Such tokens performed much the same function as a banknote. Often, through the centuries, traders have devised such tokens or bills without government involvement, especially at times when coins have been in short supply or debased and devalued.
In the modern fiat monetary system, the central bank, itself supposedly independent of the state, can influence the money supply, but it does not monopolize money creation. In addition to state-created cash — the so-called high-powered money or monetary base — most money is digital credits from commercial banks to individuals and firms. As I argued in The Ascent of Money (2008), money is trust inscribed, and it does not seem to matter much whether it is inscribed on silver, on clay, on paper — or on a liquid crystal display. All kinds of things have served as money, from the cowrie shells of the Maldives to the stone discs used on the Pacific island of Yap.
Although Bitcoin currently looks to outsiders like a speculative asset, in practice it performs at least two of the three classic functions of money quite well, or soon will, as adoption continues. It can be (like gold) both a store of value and a unit of account. And, as my Hoover Institution colleague Manny Rincon-Cruz has suggested, it may be that the three classic functions of money are in fact something of a trilemma. Most forms of money can perform at least two of the three; it’s impossible or very hard to do all three. Bitcoin is not an ideal medium of exchange precisely because its ultimate supply is fixed and not adaptive, but that’s not a fatal limitation. In many ways, it is Bitcoin’s unique advantage.
In other words, Bitcoin and Ethereum, as well as a great many other digital coins and tokens, are stateless money. And the more they can perform at least two out of three monetary functions tolerably well, the less that banning them is going to work — unless every government agrees to do so simultaneously, which seems like a stretch. The U.S. isn’t going to ban Bitcoin, just tax it whenever you convert bitcoins into dollars.
The right question to ask is therefore whether or not the state can offer comparably appealing forms of digital money. And this is where the Chinese government has been thinking a lot more creatively than its American or European counterparts. As is well known, China has led the world in electronic payments, thanks to the vision of Alibaba and Tencent in building their Alipay and WeChat Pay platforms. In 2020, some 58% of Chinese used mobile payments, up from 33% in 2016, and mobile payments accounted for nearly two-thirds of all personal consumption payments, according to the PBOC. Banknotes and credit cards have largely yielded to QR codes on smartphones. The financial subsidiary of Alibaba, Ant Group, was poised last year to become one of the world’s biggest financial companies.
Yet the Communist Party became nervous about the scale of electronic payment platforms and sought to clip their wings by cancelling Ant’s planned IPO in November and tightening regulation. At the same time, the People’s Bank of China has accelerated the implementation of its plan for a central bank digital currency (CBDC). In a fascinating article in February, former PBOC governor Zhou Xiaochuan explained the fundamentally defensive character of this initiative. “Blockchain technology features decentralization, but decentralization is not a necessity for modernizing the payments system. It even has some drawbacks,” he wrote. “The possible application of blockchain … is still being researched, but is not ready at this time.”
Last year, the PBOC seized the opportunity presented by the pandemic to rush its CBDC into the hands of Chinese consumers, conducting trials in three cities — Shenzhen, Suzhou and Chengdu — as well as the Xiong’an New Area near Beijing. Crucially, its design is two-tier, with the PBOC dealing with the existing state-owned commercial banks and other entities (including telecom and tech companies), not directly with households and firms. The abbreviation “DC/EP” (with the slash) captures this dual structure. The central bank controls the digital currency, but the electronic payment platforms can participate in the system, alongside the banks, as intermediaries to consumers and businesses. However, the easiest option for consumers will clearly be to withdraw “e-CNY” from bank ATMs onto their smartphones’ e-wallets. The system even allows transactions to happen in the absence of an internet connection via “dual offline technology.” In 2018 I predicted there would soon be “bityuan.” I only got the name wrong.
This new Chinese system not only defends the CCP against the twin threats of crypto and big tech, while ensuring that all Chinese citizens’ transactions are under surveillance; it also includes an offensive capability to challenge the U.S. dollar’s dominance in cross-border payments. And this is where the story gets seriously interesting. Today, as is well known, the dollar dominates the renminbi in foreign exchange markets, central bank reserves, trade finance and bank-to-bank payments through the Belgium-based Society for Worldwide Interbank Financial Telecommunication (SWIFT). This financial superpower, fully appreciated and utilized only after 9/11, is what makes U.S. financial sanctions so effective and far-reaching.
The Chinese are creatively exploring ways to change that. Exhibit A is the Finance Gateway Information Service, a joint venture between SWIFT and the China National Clearing Center within the PBOC, which aims to direct all cross-border yuan payments through China’s own settlement system, the Cross-Border Interbank Payment System (CIPS). Exhibit B is the Multiple CBDC (mCBDC) Bridge project by the Hong Kong Monetary Authority and the Bank of Thailand to implement a cross-border payments system based on distributed ledgers, again using a two-tier system. Exhibit C are the cross-border transfers between Hong Kong and Shenzhen currently being piloted. According to Sahil Mahtani of the South African investment manager Ninety One, the ultimate goal of Chinese policy is “to create a parallel payments network — one beyond American oversight — thereby crippling U.S. sanctions policy.” In Mahtani’s words:
The expansion of a Chinese digital currency will ultimately pry open the U.S. grip over global payments, and therefore compromise U.S. sanctions policy and a significant measure of U.S. power in the world. … It is not that China’s digital currency is going to become the dominant standard of payments … But it could become one standard, creating a parallel system with which to avoid the long arm of U.S. regulation.
What does the United States have to offer in response? When Mark Carney, then the governor of the Bank of England, argued for a “synthetic hegemonic currency” at Jackson Hole in 2019, he was politely ignored. When Mark Zuckerberg proposed a Facebook stablecoin, Libra, he was impolitely rebuffed. (Libra has been renamed “Diem,” but getting regulatory approval still looks like an uphill struggle.) According to Tyler Goodspeed, who recently left the Council of Economic Advisers to join us at Hoover, “If you’re issuing very short-term liquid liabilities that are redeemable on demand for, say, dollars or euros, and you’re backing that commitment by holding highly liquid dollar- or euro-denominated bills, then I’m sorry to say it but you are a bank.”
Other countries are exploring creating their own CBDCs — 60% of more than 60 central banks surveyed by the Bank for International Settlements last year. Cambodia and the Bahamas are already there. Even the European Central Bank has not said “non” or “nein,” though Bundesbank head Jens Weidmann is not alone in worrying that an e-euro might disintermediate Europe’s already ailing banks unless the Chinese two-tier model is adopted.
And the Fed? According to Chair Jay Powell, some of his officials are working with economists at the Massachusetts Institute of Technology to explore the feasibility of a U.S. CBDC. But, says Powell, “there is no need to rush.” Like his “What, me worry?” approach to inflation, this smacks of insouciance. China is seeking in plain sight to build an alternative international payments system to that of the U.S. dollar, and there’s no need to rush to meet this challenge? Nor any thought of actively integrating Bitcoin — a tried and tested decentralized form of “digital gold” — into the U.S. financial system, rather than treating it as a rather suspect parvenu?
If the future of money arrives as rapidly as I think it will, in the form of a widely adopted e-CNY, do not be surprised if all we can offer our kids are Robux.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
In a famous essay, the philosopher Isaiah Berlin borrowed a distinction from the ancient Greek poet Archilochus: “The fox knows many things, but the hedgehog knows one big thing.”
“There exists,” wrote Berlin, “a great chasm between those, on one side, who relate everything to … a single, universal, organizing principle in terms of which alone all that they are and say has significance” — the hedgehogs — “and, on the other side, those who pursue many ends, often unrelated and even contradictory” — the foxes.
Berlin was talking about writers. But the same distinction can be drawn in the realm of great-power politics. Today, there are two superpowers in the world, the U.S. and China. The former is a fox. American foreign policy is, to borrow Berlin’s terms, “scattered or diffused, moving on many levels.” China, by contrast, is a hedgehog: it relates everything to “one unchanging, all-embracing, sometimes self-contradictory and incomplete, at times fanatical, unitary inner vision.”
Fifty years ago this July, the arch-fox of American diplomacy, Henry Kissinger, flew to Beijing on a secret mission that would fundamentally alter the global balance of power. The strategic backdrop was the Nixon administration’s struggle to extricate the U.S. from the Vietnam War with its honor and credibility so far as possible intact.
The domestic context was dissension more profound and violent than anything we have seen in the past year. In March 1971, Lieutenant William Calley was found guilty of 22 murders in the My Lai massacre. In April, half a million people marched through Washington to protest against the war in Vietnam. In June, the New York Times began publishing the Pentagon Papers.
Kissinger’s meetings with Zhou Enlai, the Chinese premier, were perhaps the most momentous of his career. As a fox, the U.S. national security adviser had multiple objectives. The principal goal was to secure a public Chinese invitation for his boss, Nixon, to visit Beijing the following year.
But Kissinger was also seeking Chinese help in getting America out of Vietnam, as well as hoping to exploit the Sino-Soviet split in a way that would put pressure on the Soviet Union, America’s principal Cold War adversary, to slow down the nuclear arms race. In his opening remarks, Kissinger listed no fewer than six issues for discussion, including the raging conflict in South Asia that would culminate in the independence of Bangladesh.
Zhou’s response was that of a hedgehog. He had just one issue: Taiwan. “If this crucial question is not solved,” he told Kissinger at the outset, “then the whole question [of U.S.-China relations] will be difficult to resolve.”
To an extent that is striking to the modern-day reader of the transcripts of this and the subsequent meetings, Zhou’s principal goal was to persuade Kissinger to agree to “recognize the PRC as the sole legitimate government in China” and “Taiwan Province” as “an inalienable part of Chinese territory which must be restored to the motherland,” from which the U.S. must “withdraw all its armed forces and dismantle all its military installations.” (Since the Communists’ triumph in the Chinese civil war in 1949, the island of Taiwan had been the last outpost of the nationalist Kuomintang. And since the Korean War, the U.S. had defended its autonomy.)
With his eyes on so many prizes, Kissinger was prepared to make the key concessions the Chinese sought. “We are not advocating a ‘two China’ solution or a ‘one China, one Taiwan’ solution,” he told Zhou. “As a student of history,” he went on, “one’s prediction would have to be that the political evolution is likely to be in the direction which [the] Prime Minister … indicated to me.” Moreover, “We can settle the major part of the military question within this term of the president if the war in Southeast Asia [i.e. Vietnam] is ended.”
Asked by Zhou for his view of the Taiwanese independence movement, Kissinger dismissed it out of hand. No matter what other issues Kissinger raised — Vietnam, Korea, the Soviets — Zhou steered the conversation back to Taiwan, “the only question between us two.” Would the U.S. recognize the People’s Republic as the sole government of China and normalize diplomatic relations? Yes, after the 1972 election. Would Taiwan be expelled from the United Nations and its seat on the Security Council given to Beijing? Again, yes.
Fast forward half a century, and the same issue — Taiwan — remains Beijing’s No. 1 priority. History did not evolve in quite the way Kissinger had foreseen. True, Nixon went to China as planned, Taiwan was booted out of the U.N. and, under President Jimmy Carter, the U.S. abrogated its 1954 mutual defense treaty with Taiwan. But the pro-Taiwan lobby in Congress was able to throw Taipei a lifeline in 1979, the Taiwan Relations Act.
The act states that the U.S. will consider “any effort to determine the future of Taiwan by other than peaceful means, including by boycotts or embargoes, a threat to the peace and security of the Western Pacific area and of grave concern to the United States.” It also commits the U.S. government to “make available to Taiwan such defense articles and … services in such quantity as may be necessary to enable Taiwan to maintain a sufficient self-defense capacity,” as well as to “maintain the capacity of the United States to resist any resort to force or other forms of coercion that would jeopardize the security, or the social or economic system, of the people on Taiwan.”
For the Chinese hedgehog, this ambiguity — whereby the U.S. does not recognize Taiwan as an independent state but at the same time underwrites its security and de facto autonomy — remains an intolerable state of affairs.
Yet the balance of power has been transformed since 1971 — and much more profoundly than Kissinger could have foreseen. China 50 years ago was dirt poor: despite its huge population, its economy was a tiny fraction of U.S. gross domestic product. This year, the International Monetary Fund projects that, in current dollar terms, Chinese GDP will be three quarters of U.S. GDP. On a purchasing power parity basis, China overtook the U.S. in 2017.
[Chart: “Middle Kingdom, Top Performer” — shares of global GDP based on purchasing power parity. Source: International Monetary Fund, World Economic Outlook Database, October 2020; estimates start after 2019.]
In the same time frame, Taiwan, too, has prospered. Not only has it emerged as one of Asia’s most advanced economies, home to Taiwan Semiconductor Manufacturing Co., the world’s leading contract chipmaker. Taiwan has also become living proof that an ethnically Chinese people can thrive under democracy. The authoritarian regime that ran Taipei in the 1970s is a distant memory. Today, it is a shining example of how a free society can use technology to empower its citizens — which explains why its response to the Covid-19 pandemic was by any measure the most successful in the world (total deaths: 10).
As Harvard University’s Graham Allison argued in his hugely influential book, “Destined for War: Can America and China Escape Thucydides's Trap?”, China’s economic rise — which was at first welcomed by American policymakers — was bound eventually to look like a threat to the U.S. Conflicts between incumbent powers and rising powers have been a feature of world politics since 431 BC, when it was the “growth in power of Athens, and the alarm which this inspired in Sparta” that led to war. The only surprising thing was that it took President Donald Trump, of all people, to wake Americans up to the threat posed by the growth in the power of the People’s Republic.
Trump campaigned against China as a threat mainly to U.S. manufacturing jobs. Once in the White House, he took his time before acting, but in 2018 began imposing tariffs on Chinese imports. Yet he could not prevent his preferred trade war from escalating rapidly into something more like Cold War II — a contest that was at once technological, ideological and geopolitical. The foreign policy “blob” picked up the anti-China ball and ran with it. The public cheered them on, with anti-China sentiment surging among both Republicans and Democrats.
Trump himself may have been a hedgehog with a one-track mind: tariffs. But under Secretary of State Mike Pompeo, U.S. policy soon reverted to its foxy norm. Pompeo threw every imaginable issue at Beijing, from the reliance of Huawei Technologies Co. on imported semiconductors, to the suppression of the pro-democracy movement in Hong Kong, to the murky origins of Covid-19 in Wuhan.
Inevitably, Taiwan was added to the list, but the increased arms sales and diplomatic contacts were not given top billing. When Richard Haass, the grand panjandrum of the Council on Foreign Relations, argued last year for ending “strategic ambiguity” and wholeheartedly committing the U.S. to upholding Taiwan’s autonomy, no one in the Trump administration said, “Great idea!”
Yet when Pompeo met the director of the Communist Party office of foreign affairs, Yang Jiechi, in Hawaii last June, guess where the Chinese side began? “There is only one China in the world and Taiwan is an inalienable part of China. The one-China principle is the political foundation of China-U.S. relations.”
So successful was Trump in leading elite and popular opinion to a more anti-China stance that President Joe Biden had no alternative but to fall in line last year. The somewhat surprising outcome is that he is now leading an administration that is in many ways more hawkish than its predecessor.
Trump was no cold warrior. According to former National Security Adviser John Bolton’s memoir, the president liked to point to the tip of one of his Sharpies and say, “This is Taiwan,” then point to the Resolute desk in the Oval Office and say, “This is China.” “Taiwan is like two feet from China,” Trump told one Republican senator. “We are 8,000 miles away. If they invade, there isn’t a f***ing thing we can do about it.”
Unlike others in his national security team, Trump cared little about human rights issues. On Hong Kong, he said: “I don’t want to get involved,” and, “We have human-rights problems too.” When President Xi Jinping informed him about the labor camps for the Muslim Uighurs of Xinjiang in western China, Trump essentially told him “No problemo.” On the 30th anniversary of the 1989 Tiananmen Square massacre, Trump asked: “Who cares about it? I’m trying to make a deal.”
The Biden administration, by contrast, means what it says on such issues. In every statement since taking over as secretary of state, Antony Blinken has referred to China not only as a strategic rival but also as a violator of human rights. In January, he called China’s treatment of the Uighurs “an effort to commit genocide” and pledged to continue Pompeo’s policy of increasing U.S. engagement with Taiwan. In February, he gave Yang an earful on Hong Kong, Xinjiang, Tibet and even Myanmar, where China backs the recent military coup. Earlier this month, the administration imposed sanctions on Chinese officials it holds responsible for sweeping away Hong Kong’s autonomy.
In his last Foreign Affairs magazine article before joining the administration as its Asia “tsar,” Kurt Campbell argued for “a conscious effort to deter Chinese adventurism … This means investing in long-range conventional cruise and ballistic missiles, unmanned carrier-based strike aircraft and underwater vehicles, guided-missile submarines, and high-speed strike weapons.” He added that Washington needs to work with other states to disperse U.S. forces across Southeast Asia and the Indian Ocean and “to reshore sensitive industries and pursue a ‘managed decoupling’ from China.”
In many respects, the continuity with the Trump China strategy is startling. The trade war has not been ended, nor the tech war. Aside from actually meaning the human rights stuff, the only other big difference between Biden and Trump is the former’s far stronger emphasis on the importance of allies in this process of deterring China — in particular, the so-called Quad the U.S. has formed with Australia, India and Japan. As Blinken said in a keynote speech on March 3, for the U.S. “to engage China from a position of strength … requires working with allies and partners … because our combined weight is much harder for China to ignore.”
This argument took concrete form last week, when Campbell told the Sydney Morning Herald that the U.S. was “not going to leave Australia alone on the field” if Beijing continued its current economic squeeze on Canberra (retaliation for the Australian government’s call for an independent inquiry into the origins of the pandemic). National Security Adviser Jake Sullivan has been singing from much the same hymn-sheet. Biden himself hosted a virtual summit for the Quad’s leaders on March 12.
The Chinese approach remains that of the hedgehog. Several years ago, I was told by one of Xi’s economic advisers that bringing Taiwan back under the mainland’s control was his president’s most cherished objective — and the reason he had secured an end to the informal rule that had confined previous Chinese presidents to two terms. It is for this reason, above all others, that Xi has presided over a huge expansion of China’s land, sea and air forces, including the land-based DF‑21D missiles that could sink American aircraft carriers.
While America’s multitasking foxes have been adding to their laundry list of grievances, the Chinese hedgehog has steadily been building its capacity to take over Taiwan. In the words of Tanner Greer, a journalist who writes knowledgeably on Taiwanese security, the People’s Liberation Army “has parity on just about every system the Taiwanese can field (or buy from us in the future), and for some systems they simply outclass the Taiwanese altogether.” More importantly, China has created what’s known as an “anti-access/area-denial” bubble to keep U.S. forces away from Taiwan. As Lonnie Henley of George Washington University pointed out in congressional testimony last month, “if we can disable [China’s integrated air defense system], we can win militarily. If not, we probably cannot.”
As a student of history, to quote Kissinger, I see a very dangerous situation. The U.S. commitment to Taiwan has grown verbally stronger even as it has become militarily weaker. When a commitment is said to be “rock-solid” but in reality has the consistency of fine sand, there is a danger that both sides miscalculate.
I am not alone in worrying. Admiral Phil Davidson, the head of U.S. forces in the Indo-Pacific, warned in his February testimony before Congress that China could invade Taiwan by 2027. Earlier this month, my Bloomberg Opinion colleague Max Hastings noted that “Taiwan evokes the sort of sentiment among [the Chinese] people that Cuba did among Americans 60 years ago.”
Admiral James Stavridis, also a Bloomberg Opinion columnist, has just published “2034: A Novel of the Next World War,” in which a surprise Chinese naval encirclement of Taiwan is one of the opening ploys of World War III. (The U.S. sustains such heavy naval losses that it is driven to nuke Zhanjiang, which leads in turn to the obliteration of San Diego and Galveston.) Perhaps the most questionable part of this scenario is its date, 13 years hence. My Hoover Institution colleague Misha Auslin has imagined a U.S.-China naval war as soon as 2025.
In an important new study of the Taiwan question for the Council on Foreign Relations, Robert Blackwill and Philip Zelikow — veteran students and practitioners of U.S. foreign policy — lay out the four options they see for U.S. policy, of which their preferred is the last:
The United States should … rehearse — at least with Japan and Taiwan — a parallel plan to challenge any Chinese denial of international access to Taiwan and prepare, including with pre-positioned U.S. supplies, including war reserve stocks, shipments of vitally needed supplies to help Taiwan defend itself. … The United States and its allies would credibly and visibly plan to react to the attack on their forces by breaking all financial relations with China, freezing or seizing Chinese assets.
Blackwill and Zelikow are right that the status quo is unsustainable. But there are three core problems with all arguments to make deterrence more persuasive. The first is that any steps to strengthen Taiwan’s defenses will inevitably elicit an angry response from China, increasing the likelihood that the Cold War turns hot — especially if Japan is explicitly involved. The second problem is that such steps create a closing window of opportunity for China to act before the U.S. upgrade of deterrence is complete. The third is the reluctance of the Taiwanese themselves to treat their national security with the seriousness that Israelis bring to the survival of their state.
Thursday’s meeting in Alaska between Blinken, Sullivan, Yang and Chinese Foreign Minister Wang Yi — following hard on the heels of Blinken’s visits to Japan and South Korea — was never likely to restart the process of Sino-American strategic dialogue that characterized the era of “Chimerica” under George W. Bush and Barack Obama. The days of “win-win” diplomacy are long gone.
During the opening exchanges before the media, Yang illustrated that hedgehogs not only have one big idea – they are also very prickly. The U.S. was being “condescending,” he declared, in remarks that overshot the prescribed two minutes by a factor of eight; it would do better to address its own “deep-seated” human rights problems, such as racism (a “long history of killing blacks”), rather than to lecture China.
The question that remains is how quickly the Biden administration could find itself confronted with a Taiwan Crisis, whether a light “quarantine,” a full-scale blockade or a surprise amphibious invasion. If Hastings is right, this would be the Cuban Missile Crisis of Cold War II, but with the roles reversed, as the contested island is even farther from the U.S. than Cuba is from Russia. If Stavridis is right, Taiwan would be more like Belgium in 1914 or Poland in 1939.
But I have another analogy in mind. Perhaps Taiwan will turn out to be to the American empire what Suez was to the British Empire in 1956: the moment when the imperial lion is exposed as a paper tiger. When the Egyptian president Gamal Abdel Nasser nationalized the Suez Canal, Prime Minister Anthony Eden joined forces with France and Israel to try to take it back by force. American opposition precipitated a run on the pound and British humiliation.
I, for one, struggle to see the Biden administration responding to a Chinese attack on Taiwan with the combination of military force and financial sanctions envisaged by Blackwill and Zelikow. Sullivan has written eloquently of the need for a foreign policy that Middle America can get behind. Getting torched for Taipei does not seem to fit that bill.
As for Biden himself, would he really be willing to jeopardize the post-pandemic boom his economic policies are fueling for the sake of an island Kissinger was once prepared quietly to trade in pursuit of Cold War detente? Who would be hurt more by the financial crisis Blackwill and Zelikow imagine in the event of war for Taiwan – China, or the U.S. itself? One of the two superpowers has a current account deficit of 3.5% of GDP (Q2 2020) and a net international investment position of nearly minus-$14 trillion, and it’s not China. The surname of the secretary of state would certainly be an irresistible temptation to headline writers if the U.S. blinked in what would be the fourth and biggest Taiwan Crisis since 1954.
Yet think what that would mean. Losing in Vietnam five decades ago turned out not to matter much, other than to the unfortunate inhabitants of South Vietnam. There was barely any domino effect in Asia as a whole, aside from the human catastrophe of Cambodia. Yet losing — or not even fighting for — Taiwan would be seen all over Asia as the end of American predominance in the region we now call the “Indo-Pacific.” It would confirm the long-standing hypothesis of China’s return to primacy in Asia after two centuries of eclipse and “humiliation.” It would mean a breach of the “first island chain” that Chinese strategists believe encircles them, as well as handing Beijing control of the microchip Mecca that is TSMC (remember, semiconductors, not data, are the new oil). It would surely cause a run on the dollar and U.S. Treasuries. It would be the American Suez.
The fox has had a good run. But the danger of foxy foreign policy is that you care about so many issues you risk losing focus. The hedgehog, by contrast, knows one big thing. That big thing may be that he who rules Taiwan rules the world.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
High velocity. Photographer: Andrew Harrer/Bloomberg
“Inflation is always and everywhere a monetary phenomenon in the sense that it is and can be produced only by a more rapid increase in the quantity of money than in output.”
It was in a lecture delivered in London in 1970 that Milton Friedman uttered those famous words, the credo of monetarism.
Over the previous five years, inflation in most countries had been on the rise. In the first half of the 1960s, U.S. consumer prices had never gone up by more than 2% in any 12-month period. The average inflation rate from January 1960 until December 1965 had been just 1.3%. But thereafter it moved upward in two jumps, reaching 3.8% in October 1966 and 6.4% in February 1970.
For Friedman, this had been the more or less inevitable consequence of allowing the money supply to grow too rapidly. The monetary aggregate known as M2 (cash in public hands, plus checking and savings accounts, as well as money market funds) grew at an average annual rate of 7% throughout the 1960s. Moreover, as Friedman pointed out in his lecture, the velocity of circulation had not moved in the opposite direction.
What no one knew in 1970, though Friedman certainly suspected it, was that much worse lay ahead. By the end of 1974, U.S. consumer prices were rising at more than 12% a year. The “great inflation” of the 1970s (which was only really great by North American standards) peaked in early 1980 at 14%. Friedman’s London audience had an even rougher ride in store for them. U.K. annual inflation hit 23% in 1975. That year, as an 11-year-old schoolboy, I wrote a letter to the Glasgow Herald (my first ever publication) that bemoaned the price of shoes, because I could see my mother’s sticker shock each time I needed a new pair. Prices were rising significantly faster than my feet were growing — and that was saying something.
In recent weeks, investors have been acting in ways that suggest they fear a repeat of at least the first part of that history — the 1960s, if not the 1970s. On Thursday, Federal Reserve Chair Jerome Powell made the latest of multiple attempts by Fed officials to reassure markets that they have nothing to fear from a temporary bout of higher inflation as the U.S. economy emerges from the Covid-19 pandemic. In response, you could almost hear the chants of “always and everywhere a monetary phenomenon.” After all, the latest M2 growth rate (for January) is 25.8% — roughly double M2’s growth rate at the inflation peaks of the 1970s. (Yes, I know velocity is way down.)
The crucial indicator in this debate is inflation expectations. These can be measured in various ways, but one of the best is the so-called breakeven inflation rate, which is derived from the yields of five-year Treasury securities and five-year Treasury inflation-indexed securities, and tells us what market participants expect inflation to be on average in the next five years. Less than a year ago, that expected inflation rate was down to 0.14%. Last Wednesday it was at 2.45%. The last time it was that high was in April 2011.
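The arithmetic behind the breakeven rate is simple subtraction: the nominal five-year Treasury yield minus the five-year TIPS (real) yield. A minimal sketch, using illustrative yields rather than live market quotes:

```python
def breakeven_rate(nominal_yield_pct: float, tips_yield_pct: float) -> float:
    """Breakeven inflation = nominal Treasury yield - TIPS (real) yield.

    Both inputs in percent; the result is the average annual inflation
    rate at which nominal bonds and TIPS would return the same amount.
    """
    return nominal_yield_pct - tips_yield_pct

# Illustrative figures only: a 0.85% nominal five-year yield against a
# -1.60% real (TIPS) yield implies a 2.45% five-year breakeven rate.
print(round(breakeven_rate(0.85, -1.60), 2))  # 2.45
```

The negative real yield is not a typo: in early 2021, five-year TIPS yields were well below zero, which is precisely why a modest nominal yield could coexist with elevated inflation expectations.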
[Chart: Highest in a Decade (five-year breakeven inflation rate)]
Another indicator of market anxiety is the steepening of the yield curve (though that could well be capturing growth expectations as well as inflation fears). In the shock of the pandemic, the yield on 10-year Treasuries fell as low as 0.6%. Now it is up to 1.56%. Because the yields of government bonds with shorter maturities have not moved up so much, the widening spread can be seen as a further sign that markets expect inflation.
[Chart: Yields Bounce Back (10-year Treasury yield)]
To some observers, including Fed economists, all this seems like market overreaction. The Fed’s preferred measures of inflation, derived from the price indices of personal consumer expenditures, have consistently undershot the 2% inflation target for most of the period since the global financial crisis. In only 10 months out of the 149 since Lehman Brothers Inc. went bust has core PCE (excluding the volatile costs of energy and food) exceeded 2%. The latest reading is 1.5%. Indeed, average core inflation has been 1.9% for the past 30 years — since the presidency of George H. W. Bush. In any case, the economy is only just emerging from one of the biggest supply shocks in economic history — the lockdowns and other “non-pharmaceutical interventions” to which we resorted to limit the spread of SARS-CoV-2.
[Chart: The Decline of Inflation (core PCE inflation)]
Looking at the past three decades, you can see why the Fed subscribes to what might be called the Mad Magazine view of inflation: “What, me worry?” Last month, Powell said, “Frankly we welcome slightly higher … inflation. The kind of troubling inflation people like me grew up with seems unlikely in the domestic and global context we’ve been in for some time.”
Since last September, the Fed has pledged to keep its Fed funds rate at near zero and its bond purchases (quantitative easing) going until the labor market has made “significant progress” in recovering from the Covid shock. In very similar speeches last week, Fed Governor Lael Brainard and Mary Daly, president of the San Francisco Fed, reiterated this commitment. It’s not just that they don’t worry about inflation above 2%. They actively want inflation above 2% because they are now targeting an average rate of 2%.
In making this argument, the Fed folks are telling us that post-pandemic inflation will be so fleeting as to leave expectations essentially unchanged. “A burst of transitory inflation,” in Brainard’s words, “seems more probable than a durable shift above target in the inflation trend and an unmooring of inflation expectations to the upside.” Those who worry about such an unmooring, argued Daly, are succumbing to “the tug of fear … a memory of high and rising inflation, an inexorable link between unemployment, wages and prices, and a Federal Reserve that once fell behind the policy curve. But the world today is different, and we can’t let those memories, those scars, dictate current and future policy … That was more than three decades ago, and times have changed.”
Now, I plead guilty to having worried about inflation prematurely in the past, something for which I was vehemently (and unfairly) criticized by Paul Krugman and others. Eleven years later, I am not about to repeat that mistake. Yes, the administration of President Donald Trump ran the economy hot with big tax cuts and browbeat the Fed to abandon its planned normalization of monetary policy — and even at 3.5% unemployment, inflation barely moved. Yes, as Skanda Amarnath and Alex Williams very reasonably argue, the reopening of services such as bars and restaurants will likely push up PCE inflation, but not by much and only temporarily. Only if inflation is sustained and accompanied by equally sustained wage inflation would the Fed need to change its stance.
Yet this entire debate has been turned on its head by the intervention of former Treasury secretary and Harvard University President Lawrence Summers. Back in 2014, it was Summers who resurrected the idea of “secular stagnation,” predicting (correctly, as it proved) that the period after the global financial crisis would be characterized by sluggish economic performance and very low interest rates. There was therefore some consternation in the world of economics when Summers published a stinging critique of President Joe Biden’s proposed $1.9 trillion fiscal stimulus on Feb. 4.
“There is a chance,” warned Summers, “that macroeconomic stimulus on a scale closer to World War II levels than normal recession levels will set off inflationary pressures of a kind we have not seen in a generation, with consequences for the value of the dollar and financial stability.”
At this point, we need to make our first qualification of Friedman’s monetarist credo. Actually, inflation is often and in many places a fiscal phenomenon — or at least, you don’t get inflation without a combination of fiscal and monetary expansion. Summers’s point is that the proposed fiscal stimulus is far larger than the likely output gap, insofar as that can be estimated. Even before the additional stimulus, Summers wrote, “unemployment is falling, rather than skyrocketing as it was in 2009, and the economy is likely before too long to receive a major boost as Covid-19 comes under control. ... Monetary conditions are [also] far looser today than in 2009. … There is likely to be further strengthening of demand as consumers spend down the approximately $1.5 trillion they accumulated last year.”
Although economists working for the Biden administration and Democratic Party operatives shot back, Summers’s argument was endorsed by other big hitters, notably Olivier Blanchard, only recently a proponent of active fiscal policy. Martin Wolf, a rampant Keynesian in the period after the financial crisis, called the stimulus plan “a risky experiment.”
Even investors who don’t share my respect for these academic economists could hear a version of the same argument from two of the great financial market players. “Bonds are not the place to be these days,” wrote Warren Buffett in the latest Berkshire Hathaway report. “My overriding theme is inflation relative to what the policymakers think,” Stan Druckenmiller said in an interview. “Basically the play is inflation. I have a short Treasury position, primarily at the long end.”
Time for a further amendment to Friedman’s credo. Like the equation that encapsulates the quantity theory of money (MV=PQ), the assertion that inflation is always a monetary phenomenon verges on being a tautology. In truth, monetary expansions, like the fiscal deficits with which they are often associated, are the result of policy decisions, which are rooted in decision-makers’ mental models, which originate in some combination of experience and study of history. The Fed folks are telling us that inflation expectations will stay anchored, even if inflation jumps above 2% for a time. The big beasts of economics and investment may just have longer memories. Both Blanchard (72) and Wolf (74) are old enough to remember the 1960s, and both refer to what happened then with good reason.
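In growth-rate form, the quantity equation says inflation ≈ money growth + velocity growth − output growth, which is why a 25.8% M2 print need not translate into 25.8% inflation if velocity keeps collapsing. A toy calculation, in which the velocity and output figures are assumptions for illustration, not data:

```python
def implied_inflation(money_growth: float, velocity_growth: float,
                      output_growth: float) -> float:
    """Approximate inflation from MV = PQ expressed in growth rates:
    pi ~= gM + gV - gQ (all arguments in percent per year)."""
    return money_growth + velocity_growth - output_growth

# Hypothetical numbers: 25.8% money growth, a 20% collapse in velocity
# and 4% real output growth leave implied inflation near 1.8%, not 25.8%.
print(round(implied_inflation(25.8, -20.0, 4.0), 1))  # 1.8
```

The sketch also shows where the tautology bites: hold velocity and output steady, and every point of money growth becomes a point of inflation, which is the scenario the inflation worriers have in mind.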
Our own time has quite a lot in common with the 1960s, as I argued last June in the first column I wrote for Bloomberg Opinion. True, the Woodstock generation was into free speech, whereas the Wokestock generation wants to cancel it, but there’s the same sense of a generation war. There’s a crazy right, too, as we saw on Jan. 6. It’s just that today the arguments for separating black and white students are made on the crazy left.
The economic similarities are there, too. The economists who served in the John F. Kennedy and Lyndon B. Johnson administrations — such as Walter Heller of the University of Minnesota — had as much faith in the power of fiscal policy as those now serving under Biden.
“Our present choice,” declared Kennedy, “is not between a tax cut and a balanced budget. The choice, rather, is between chronic deficits arising out of a slow rate of economic growth, and temporary deficits stemming from a tax program designed to promote … more rapid economic growth.” The 1964 budget, which cut both individual and corporate tax rates, testified to the dominance of Keynesian ideas at that time. The only difference is that by today’s standards, the deficits of the 1960s were tiny, peaking at 2.8% of gross domestic product — a figure regarded as so excessive that in 1968 Congress passed the Revenue and Expenditure Control Act, effectively reversing the 1964 tax cuts.
As in our time, the Fed was confident that there was a stable tradeoff to be exploited between inflation and unemployment. As Allan Meltzer showed in his history of the Fed, the easing of monetary policy in 1967 was a grave error, one recognized by Fed Chair William McChesney Martin by the end of that year (“the horse of inflation not only was out of the barn but was already well down the road”). An important difference was the distorting effect of the Fed’s Regulation Q, which imposed interest rate ceilings on savings accounts in 1965, discouraging saving, boosting consumption, and limiting the effective transmission of monetary policy.
Then, as now, the global financial system revolved around the dollar, to the annoyance of European leaders such as President Charles de Gaulle of France, who complained of the American currency’s “exorbitant privilege.” The difference was that the dollar was still linked to gold under the Bretton Woods rules of (mostly) fixed exchange rates. Fears that the U.S. might break the link to gold and devalue the dollar — which were fulfilled by Richard Nixon in August 1971 — may have played a part in pushing up inflation expectations.
In a seminal paper first published in 1981, the economist Thomas Sargent argued that “big inflations” ended only when there was “an abrupt change in the continuing government policy, or strategy, for setting deficits now and in the future that is sufficiently binding as to be widely believed.” The corollary of this insight must be that inflations begin with a comparable regime change, but one that is imperceptible rather than abrupt.
Sargent elaborated on this point in his 2008 presidential address to the American Economic Association, in which he argued that policymakers might easily form “incorrect views about events that are rarely observed.”
The situation that we are always in [is] … that our probability models are misspecified. … The possibility [exists] that learning has propelled us to a self-confirming equilibrium in which the government chooses an optimal policy based on a wrong model … Misguided governments [fall into] lack-of-experimentation traps to which self-confirming equilibria confine them.
This nicely encapsulates the mistakes made at the Fed in the 1960s. It might well turn out to describe the mistakes being made at the Fed right now. Thirty years of very low inflation seems like the perfect basis for a wrong model.
There is one important caveat, nevertheless. The biggest difference between our own time and the 1960s is that we are coming out of a pandemic, whereas then the U.S. was sliding deeper into a disastrous war. Economic historians have long been aware that, for most of history, war has been the principal driver of moves in inflation expectations. Pandemics have generally not had this effect. The reason for this is clear. Over the long run, wars are the most common reason why governments run large budget deficits and are tempted to debase the currency. And wars that go wrong are especially likely to end in either debt default or inflation or both.
Thanks to the Bank of England, we can take a long, hard look at the history of British inflation expectations since the late 17th century. The striking point is that five out of the six biggest moves in expectations occurred in time of war — especially (as in 1917 and 1940) when the war was going badly.
[Chart: Britain’s Ups and Downs (U.K. inflation expectations). Source: Bank of England]
Though the economics literature has little to say on the subject, I find it hard to believe that television news coverage of the deteriorating situation in Vietnam — for example, the Tet Offensive of 1968, which the U.S. networks misrepresented as a triumph for the North Vietnamese army and the Vietcong — played no part in the upward shift in American inflation expectations.
I would become a lot more worried about the prospects for U.S. inflation if our current Cold War II with China escalated into a full-blown hot war or even a serious diplomatic crisis over, say, Taiwan — which is a good deal more likely than I suspect most investors appreciate, as Robert Blackwill and Philip Zelikow pointed out last week.
Still, the British experience in the mid-1970s is a reminder that war is not a sine qua non for inflationary liftoff — or at least that the war can be someone else’s, as was the case when the Arab states attacked Israel in 1973, triggering the oil shock that most people wrongly think of as the principal cause of the great inflation. Ultimately, inflation expectations can be untethered by a combination of excessive fiscal and monetary laxity without a shot being fired. If a pandemic has the financial consequences of a major war, that may suffice.
Lest anyone doubt the scale of the fiscal shock attributable to Covid-19, the latest projections from the Congressional Budget Office are now available. Even if the short-run outlook is less dire than last year’s exercise, the reality is inescapable: Not only is the federal debt in public hands now at its highest level relative to GDP since the year after World War II, but it is also forecast to soar to double that level by 2050. These are drastically worse projections than we were contemplating in 2009 and 2012.
[Chart: Washington's Skyrocketing IOUs (federal debt in public hands as a share of GDP). Source: Congressional Budget Office]
The conclusion is not that inflation is inevitable. The conclusion is that the current path of policy is unsustainable. The Fed may control short-term rates but it cannot really allow long-term interest rates to rise rapidly because of the problems this would create for highly leveraged entities, including the federal government itself. This is the “unpleasant fiscal arithmetic” that inevitably arises when the stock of debt rises to approximately the level of total economic output.
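That unpleasant arithmetic can be sketched with the standard debt-dynamics identity, in which the debt-to-GDP ratio evolves each year as b(1+r)/(1+g) + d, where r is the average interest rate on the debt, g the nominal growth rate of the economy and d the primary deficit as a share of GDP. The inputs below are assumptions chosen for illustration, not CBO projections:

```python
def project_debt_ratio(b0: float, r: float, g: float, d: float,
                       years: int) -> float:
    """Iterate the debt-dynamics identity b' = b*(1+r)/(1+g) + d.

    b0 and d are shares of GDP; r and g are decimal rates.
    Whenever r > g and the primary deficit d is positive, the
    ratio grows without bound instead of stabilizing.
    """
    b = b0
    for _ in range(years):
        b = b * (1 + r) / (1 + g) + d
    return b

# Assumed inputs: debt starts at 100% of GDP, interest rates one point
# above nominal growth (r=4%, g=3%), primary deficits of 3% of GDP.
# Over 30 years the ratio roughly doubles rather than stabilizing.
print(round(project_debt_ratio(1.00, 0.04, 0.03, 0.03, 30), 2))
```

This is why the level of long-term rates matters so much to the sustainability question: flip the assumption to g > r and the same identity allows the ratio to stabilize even with persistent primary deficits.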
On the other hand, the Fed cannot comfortably engage in full-spectrum yield-curve control without creating a situation of financial repression and fiscal dominance reminiscent of the late 1940s, another time of rapid inflation. To quote a recent paper from the St. Louis Fed, “if the Fed were to adopt such a policy and if the public perceives that the Fed is engaged in deficit financing, then it is possible that inflation expectations could rise.”
In the late 1940s and in the late 1960s, the economy was cooled by raising taxes. But no one in the new administration is talking about that, though the progressives in Congress are itching to tax the rich. On the contrary, the key members of Team Biden, notably Treasury Secretary Janet Yellen, all think the lesson of history is to “go large or go home” with deficit spending: the $1.9 trillion stimulus is the first of a number of big spending measures in prospect, with green infrastructure next up. But that’s only the lesson of very recent history — to be precise, the first term of the Barack Obama administration, in which so many of today’s key players served.
In Charles Dickens’s “Great Expectations,” the orphan Pip comes into a fortune from an anonymous benefactor and embarks on the life of a gentleman — hence his great expectations. Only later does it become clear that the money comes from a dubious source and it ends up being lost altogether: “My great expectations had all dissolved, like our own marsh mists before the sun.”
It may ultimately be that our great expectations of inflation will dissolve in a similar way, vindicating Powell and making fools of aged economists and bond vigilantes alike. But the resemblances between our situation and the one Milton Friedman described in 1970 are striking — even if it is not quite true that inflation is always and everywhere a monetary phenomenon.
Niall Ferguson is the Milbank Family Senior Fellow at the Hoover Institution at Stanford University and a Bloomberg Opinion columnist. He was previously a professor of history at Harvard, New York University and Oxford. He is the founder and managing director of Greenmantle LLC, a New York-based advisory firm.
Crumbling? Photographer: Carl Court/Getty Images
In Philip Pullman’s series of fantasy novels, “His Dark Materials,” we enter a universe containing an infinity of parallel worlds. In the most important of these worlds, which is similar to ours in many respects, evolution and history have had subtly different outcomes. Human beings have visible souls — small, semi-autonomous “daemons” that take the shapes of animals. And the Reformation has failed, leaving Europe still under the dominance of an obscurantist and oppressive “Magisterium.”
The home of the indomitably mendacious young heroine, Lyra Silvertongue, is an Oxford in which the nearest thing to physics is “experimental theology.” The Scientific Revolution has not been fully achieved and the Industrial Revolution looks equally incomplete. Lyra’s is a world that remains in many ways early modern. There are no planes, only balloons and airships. There is a primitive form of electricity but “anbaric” light is a luxury. The social order too lags behind our own. Servants rather than machines still perform most menial tasks. There are priories full of nuns. Politics remains an aristocratic preserve.
You are unlikely to have read “His Dark Materials” unless you have children, but perhaps you saw some of the recent HBO adaptation, notable for an unexpected and not wholly convincing appearance by Lin-Manuel Miranda as the Texan balloonist Lee Scoresby. If you’ve never even heard of Pullman, educate yourself. For he is not only as significant an author as that other great Oxonian C.S. Lewis (in many ways, the initial book in the series, “The Golden Compass,” is the atheist’s answer to “The Lion, the Witch and the Wardrobe”). Pullman also seems unwittingly to have written an intimation of the post-pandemic world.
There are a great many of us who still want to believe that at some point this year we shall all get back to normal — meaning life will resume more or less exactly as it was at the end of 2019. I am sure that by the summer it will feel in most developed countries that the worst of Covid-19 is over, thanks to a combination of mass vaccination and naturally acquired immunity. I am also confident that there will be a protracted global party to celebrate the reopening of bars and restaurants and the easing of at least some travel restrictions.
I fully expect a bout of rapid economic growth to ensue: Consumers have accumulated a vast sum of forced savings, a large proportion of which they are poised to spend — even before governments add fuel to the fire in the form of yet more fiscal stimulus.
Nevertheless, I am doubtful that our near-term future is going to revert entirely to the pre-pandemic normal. First, the SARS-CoV-2 virus is mutating in ways that few people foresaw a year ago, becoming more transmissible or more vaccine-resistant.
Second, the dominance of vaccine nationalism over global inoculation, combined with significant “anti-vaxxer” sentiment in countries that will soon have more vaccine shots than they require, will leave the virus with plenty of time and space to evolve further, especially in the Southern hemisphere. The elimination of Covid seems a distant prospect. A more likely scenario is that it will become an endemic, seasonal disease, requiring annual shots and causing recurrent waves of excess mortality.
Third, the hypermobility of the pre-pandemic era is highly unlikely to resume any time soon. Many countries that have managed to suppress the disease (e.g. Australia and New Zealand) will maintain travel restrictions. Few large businesses will return to their previous volume of corporate travel: Many meetings that would previously have necessitated long-haul flights will continue to happen over Zoom. A significant proportion of relatively high-skilled people will continue to work from home at least part of the time. And will you throw away all those masks? Will you resume hugs and handshakes with people outside your innermost circle? I know I won’t.
Fourth, Covid-19 has exposed how very poor our preparedness for disasters of all kinds has become, the central theme of my forthcoming book, “Doom: The Politics of Catastrophe.” There is a consensus that the next disaster we shall have to contend with will be related to climate change: That is Bill Gates’s story. Yet there are other disasters lurking out there to which we attach much lower probabilities. The eruption of Mount Etna last week is a reminder that the world has not seen a really large volcanic eruption since Mount Tambora in Indonesia in 1815. It has been two centuries since the annual amount of sulphate aerosol injected into the atmosphere by volcanic eruptions exceeded 50 million tons. We have forgotten how severe volcanic global cooling can be.
There’s a pervasive darkness to Philip Pullman’s worlds that I cannot help suspecting may characterize our world in the years ahead. Much of “The Golden Compass” is set in “the North,” including the frigid Norwegian archipelago of Svalbard. Texan readers are discovering that the weather of the north can now reach a lot further south than we are used to.
In “The Subtle Knife,” we encounter a beautiful Mediterranean country where specters hunt down adults and suck the vitality out of them — where only children are oblivious to and safe from the danger. It is remarkable how Pullman anticipated our ageist pandemic. In “The Golden Compass,” kids are cruelly separated from their daemons. In the Covid world, they are cruelly separated from their friends.
“The Book of Dust” depicts Lyra’s Oxford devastated by a disastrous flood. Those who live there can easily imagine such an inundation, having seen the Thames so often burst its banks and submerge Port Meadow in recent years.
Nowhere does the future look less like the recent past and more like Pullman’s parallel universe than in his own country, England. True, 2021 has got off to a much better start for the U.K. than 2020 did. Thanks to world-class research at Oxford and elsewhere, bold procurement decisions led by Kate Bingham, head of the government’s task force, and the experience of the National Health Service in mass vaccination, the U.K. has surged ahead of the European Union in the vaccination race.
The European Commission has handled the vaccine challenge so badly — simultaneously centralizing procurement and slowing it down, then lashing out at the U.K. with empty threats to close the border between Northern Ireland and the South — that even the most ardent proponents of Brexit can scarcely believe it. (“I understand Brexit better now,” a pro-EU source at the drug company AstraZeneca told the Spectator last month.) Close to a quarter of the U.K. population has now received at least one vaccine dose, compared with 12% in the U.S. and less than 4% in Germany.
However, this success story comes after an annus horribilis. Excluding tiny Gibraltar and San Marino, the U.K. has the third-worst Covid mortality rate of any country in the world, exceeded only by Slovenia and Belgium. The country saw two of the world’s worst waves of excess mortality, in April last year and again over the Christmas holidays.
The U.K.’s gross domestic product shrank by 9.9% last year, the worst performance of any major economy apart from Spain, according to the International Monetary Fund. The last annual contraction larger than that was in 1709, when economic activity was steeply reduced throughout Europe by the “Great Frost,” the coldest winter in five hundred years. This has been attributed by modern research to the exceptionally low sunspot activity known as the Maunder Minimum, as well as to volcanic eruptions in the two preceding years at Mount Fuji, in Japan, and Santorini and Vesuvius, in Europe.
The worst years in English economic history, according to the Bank of England, were 1629, when the economy contracted by 25%, and 1349, when it shrank by 23%. The 1340s were the decade of the Black Death. I still cannot work out what went wrong in 1629, a year best known to political historians as the beginning of Charles I’s 11-year “Personal Rule” without a parliament.
[Chart: Rise and Fall — annual change of real GDP of England at market prices. Source: Bank of England]
A contraction of nearly 10% turns the economic clock of a country back around six years. Also turning the clock back is the effect of Brexit, which formally came into effect at the beginning of this year, after four and a half years of divorce negotiations. As I warned after the June 2016 referendum, Brexit was always going to be one of those divorces that takes a lot longer and costs a lot more than the exiting spouse imagined at the outset. Sure enough, now that Britain has its decree nisi, the true costs of splitting up can no longer be glossed over. The U.K. has opted to phase in border checks on EU imports gradually until July 1, whereas U.K. exports to the EU have faced the full suite of new restrictions since Jan. 1. The Road Haulage Association has reported that U.K. exports to the EU have fallen by more than two thirds, though the government does not accept this claim.
Not only is trade in goods suddenly a lot more difficult than it was before — cue a hundred press stories about customs paperwork crushing small businesses whose owners voted for Brexit — there simply is no agreement on trade in services. Nor are the Europeans in any hurry to recognize London as having equivalent regulatory status with euro area financial centers.
As people anticipate the new world in which London is no longer both de facto and de jure the EU’s principal financial center, the City is losing out. A chunk of London’s swaps business has migrated. There is already talk of changing the rules that allow asset management funds based in the EU to be managed from the U.K. Most startling of all, Amsterdam overtook London as a stock trading center in December. That probably hasn’t been true since 1709, if not 1629.
It’s sometimes forgotten that the 17th-century Dutch Republic led England in terms of financial development: the former had a “golden age” while the latter was racked by a religious and political civil war. Only with the ouster of James II and the installation of William of Orange as king of England in 1688 — the so-called “Glorious Revolution” — were Dutch economic institutions imported to London from Amsterdam.
The English already had their own East India Company, but before 1688 it was commercially inferior to its Dutch counterpart. In 1694 the Bank of England was founded to manage the government’s borrowings as well as the national currency, similar (though not identical) to the successful Amsterdam Wisselbank founded 85 years before. London was also able to import the Dutch system of a national debt, funded through a stock exchange, where long-term bonds could easily be bought and sold.
Brexit has also turned the clock back demographically. According to the Migration Observatory at Oxford, the U.K.’s foreign-born population shrank by just over 1 million in the first three quarters of 2020. About 481,000 of those departing were born in the EU, reversing an influx that began in 2004, when the U.K. was one of only three established EU countries (the others were Ireland and Sweden) to allow immediate free movement by the citizens of the 10 Eastern European states that had just joined the union.
Of all the political mistakes that led to Brexit, this is the one that attracts the least attention, because it was made by a Labour government, based on civil servants’ disastrous underestimates of the likely westward migration flows. Those who voted for Brexit to reverse these flows are seeing their wish come true, especially in London, from which there has been a veritable exodus of migrants.
I used to worry, half-jokingly, that the net result of Brexit would be to turn the social and economic clock back to before 1973, the year Britain joined the European Economic Community. I am old enough to remember the shabbiness of the country in those days: the inefficiency of nationalized industries, the excessive power of trade unions, the pervasive mood of cynicism that was good for sitcom scripts but not much else. Economically, however, not even the combination of Brexit and Covid-19 could return living standards to the low levels of those days.
Yet culturally the country seems to be lurching even further backwards. It is not just those on the right who quietly craved a less cosmopolitan country. It is also those on the left who seek to repudiate almost all of British history since 1709. The “woke” elements on British campuses took this repudiation to new depths earlier this month with a conference on the “racial consequences” of Winston Churchill at the Cambridge college that bears his name.
Speaking at this event, Kehinde Andrews, author of “The Psychosis of Whiteness,” described Churchill as “the perfect embodiment of white supremacy” and the British Empire as having been “far worse than the Nazis and lasted far longer.” Madhusree Mukerjee argued that “militarism is the core of the British identity” and called for statues celebrating British militarism to be taken down. No Churchill defenders were among the panelists.
Education Secretary Gavin Williamson last week announced new measures to uphold free speech at British universities. That may provide protection to the conservative thinkers who have recently been subjected to various forms of “cancellation” at Cambridge and elsewhere, but it will do nothing to stem the tide of wokeism. The government may stand, as Williamson said, “unequivocally on the side of free speech and academic freedom, on the side of liberty, and of the values of the Enlightenment.” But the academic left repudiates the Enlightenment as a mere helpmeet of imperialism and likes nothing better than to claim that conservatives are “weaponizing” free speech. So dominant are such ideas on some U.K. campuses that the clock appears to have been turned all the way back to the mid-17th century, when it was routine to denounce one’s ideological enemies as heretics and to condemn their ideas as blasphemy.
Pullman’s England seemed a through-the-looking-glass place when I first read his novels to my older children. I now realize we are hurtling towards it.
Nothing would turn the clock back further than the breakup of Britain — an eventuality predicted many times over the years. The New Left writer Tom Nairn published a book of that title in 1977. I remember being briefly converted to the cause of Scottish independence by the arguments of that book, combined with renditions of the Corries’ faux national anthem, “Flower of Scotland,” at international rugby matches. (I was 13 and soon grew out of it.)
But Scottish independence, which I have opposed throughout my adult life, may now be inevitable. Elections to the Scottish Parliament at Holyrood are scheduled for May 2021. The Scottish National Party, which is campaigning for a second independence referendum, is on track to win a comfortable majority. Scotland’s first minister, Nicola Sturgeon, does not want to proceed to an “indyref 2” without the U.K. government’s consent. However, on Jan. 24, her party published a “Roadmap to a Referendum,” which stated that if the party wins a majority in May, it will hold an independence referendum regardless of the U.K. government’s consent — the same strategy that led to chaos in Catalonia in 2017.
My favorite cartoon of the year so far was by Graeme Keyes. It depicted an Englishman striding resolutely westward with a suitcase labeled “BREXIT,” while a kilt-wearing Scotsman sauntered in the opposite direction with a suitcase labeled “EXBRIT.”
Recent polling points to a wider disintegration of the country officially known as the United Kingdom of Great Britain and Northern Ireland. Not only do 49% of Scottish voters now favor independence over 44% who oppose it; 42% of voters in Northern Ireland also support a United Ireland. Moreover, the English themselves are becoming resigned to these outcomes. Some 49% of voters in England now think Scottish independence is likely; 45% would either be “pleased” or “not bothered” by it; and an amazing 57% would be “pleased” or “not bothered” by Irish reunification.
No one knows how exactly the Scottish economy could cope with independence, especially if it were to apply to join the EU as many nationalists would like (remember, Scotland voted emphatically against Brexit). But practicalities are no more in focus than they were in 2016, when the English voted to leave the EU — despite the fact that this would be a much more momentous divorce, ending the union of parliaments of 1707 and potentially even the union of crowns of 1603.
You see what I mean about turning the clock back? And yet it may be time for me to accept this process of historical shape-shifting instead of trying to fight it. Reading the novels of Walter Scott, the first of which appeared the year before Tambora erupted, I am reminded time and again how contingent the Anglo-Scottish union was at its outset, and how hotly contested, sparking major rebellions in 1715 and 1745. Just as Pullman’s novels conjure up an imagined England, in which modernity does not quite come together as it has in our world, so Scott’s remind us of what the pre-modernity was like — a world not just lacking in modern technology, but afflicted by religious zealotry and intolerance.
The thought that we might be on our way back to that world would make my daemon — if I had one — shudder.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.