“Bliss was it in that dawn to be alive,/But to be young was very heaven!” Wordsworth was talking about France in 1789, but the line applies better to the America of 1957. That summer, Elvis Presley topped the charts with “(Let Me Be Your) Teddy Bear.” But we tend to forget that 1957 also saw the outbreak of one of the biggest pandemics of the modern era. Not coincidentally, another hit of that year was “Rockin’ Pneumonia and the Boogie Woogie Flu” by Huey “Piano” Smith & the Clowns.
When seeking historical analogies for Covid-19, commentators have referred more often to the catastrophic 1918-19 “Spanish influenza” than to the flu pandemic of 1957-58. Yet the later episode deserves to be much better known, not just because the public health threat was a closer match to our own but because American society at the time was better prepared—culturally, institutionally and politically—to deal with it.
The “Asian flu”—as it was then uncontroversial to call a contagious disease that originated in Asia—was a novel strain (H2N2) of influenza A. It was first reported in Hong Kong in April 1957, having originated in mainland China two months before, and—like Covid-19—it swiftly went global.
Like Covid-19, the Asian flu led to significant excess mortality. The most recent research concludes that between 700,000 and 1.5 million people worldwide died in the pandemic. A pre-Covid study of the 1957-58 pandemic concluded that if “a virus of similar severity” were to strike in our time, around 2.7 million deaths might be anticipated worldwide. The current Covid-19 death toll is 3 million, about the same percentage of world population as were killed in 1957-58 (0.04%, compared with 1.7% in 1918-19).
True, excess mortality in the U.S.—now around 550,000—has been significantly higher in relative terms in 2020-21 than in 1957-58 (at most 116,000). Unlike Covid-19, however, the Asian flu killed appreciable numbers of young people. In terms of excess mortality relative to baseline expected mortality rates, the age groups that suffered the heaviest losses globally were 15- to 24-year-olds (34% above average mortality rates) followed by 5- to 14-year-olds (27% above average). In total years of life lost in the U.S., adjusted for population, Covid has been roughly 40% worse than the Asian flu.
The Asian flu and Covid-19 are very different diseases, in other words. The Asian flu’s basic reproduction number—the average number of people that one person was likely to infect in a population without any immunity—was around 1.65. For Covid-19, it is likely higher, perhaps 2.5 or 3.0. Superspreader events probably played a bigger role in 2020 than in 1957: Covid has a lower dispersion factor—that is, a minority of carriers do most of the transmission. On the other hand, people had more reason to be afraid of a new strain of influenza in 1957 than of a novel coronavirus in 2020. The disastrous pandemic of 1918 was still within living memory, whereas neither SARS nor MERS had produced pandemics.
High school students in Washington, D.C., September 1957.
The first cases of Asian flu in the U.S. occurred early in June 1957, among the crews of ships berthed at Newport, R.I. Cases also appeared among the 53,000 boys attending the Boy Scout Jamboree at Valley Forge, Pa. As Scout troops traveled around the country in July and August, they spread the flu. In July there was a massive outbreak in Tangipahoa Parish, La. By the end of the summer, cases had also appeared in California, Ohio, Kentucky and Utah.
It was the start of the school year that made the Asian flu an epidemic. The Communicable Disease Center, as the CDC was then called, estimated that approximately 45 million people—about 25% of the population—became infected with the new virus in October and November 1957. Younger people experienced the highest infection rates, from school-age children up to adults age 35-40. Adults over 65 accounted for 60% of influenza deaths, an abnormally low share.
Why were young Americans disproportionately vulnerable to the Asian flu? Part of the explanation is that they had not been as exposed as older Americans to earlier strains of influenza. But the scale and incidence of any contagion are functions of both the properties of the pathogen itself and the structure of the social network that it attacks. The year 1957 was in many ways the dawn of the American teenager. The first baby boomers born after the end of World War II turned 13 the following year. Summer camps, school buses and unprecedented social mingling after school ensured that between September 1957 and March 1958 the proportion of teenagers infected with the virus rose from 5% to 75%.
The policy response of President Dwight Eisenhower could hardly have been more different from the response of 2020. Eisenhower did not declare a state of emergency. There were no state lockdowns and, despite the first wave of teenage illness, no school closures. Sick students simply stayed at home, as they usually did. Work continued more or less uninterrupted.
With workplaces open, the Eisenhower administration saw no need to borrow to the hilt to fund transfers and loans to citizens and businesses. The president asked Congress for a mere $2.5 million ($23 million in today’s inflation-adjusted terms) to provide additional support to the Public Health Service. There was a recession that year, but it had little if anything to do with the pandemic. The Congressional Budget Office has described the Asian flu as an event that “might not be distinguishable from the normal variation in economic activity.”
President Eisenhower’s decision to keep the country open in 1957-58 was based on expert advice. When the Association of State and Territorial Health Officials (ASTHO) concluded in August 1957 that “there is no practical advantage in the closing of schools or the curtailment of public gatherings as it relates to the spread of this disease,” Eisenhower listened. As a CDC official later recalled: “Measures were generally not taken to close schools, restrict travel, close borders or recommend wearing masks….ASTHO encouraged home care for uncomplicated influenza cases to reduce the hospital burden and recommended limitations on hospital admissions to the sickest patients….Most were advised simply to stay home, rest and drink plenty of water and fruit juices.”
Dr. Maurice Hilleman, seen here in the lab in 1963, played a key role in the development of a vaccine for the Asian flu in 1957.
This decision meant that the onus shifted entirely to pharmaceutical interventions. As in 2020, there was a race to find a vaccine. Unlike in 2020, however, the U.S. had no real competition, thanks to the acumen of one exceptionally talented and prescient scientist. From 1948 to 1957, Maurice Hilleman—born in Miles City, Mont., in 1919—was chief of the Department of Respiratory Diseases at the Army Medical Center (now the Walter Reed Army Institute of Research).
Early in his career, Hilleman had discovered the genetic changes that occur when the influenza virus mutates, known as “shift and drift.” It was this work that enabled him to recognize, when reading reports in the press of “glassy-eyed children” in Hong Kong, that the outbreak had the potential to become a disastrous pandemic. He and a colleague worked nine 14-hour days to confirm that this was a new and potentially deadly strain of flu.
Speed was of the essence, as in 2020. Hilleman was able to work directly with vaccine manufacturers, bypassing “the bureaucratic red tape,” as he put it. The Public Health Service released the first cultures of the Asian influenza virus to manufacturers even before Hilleman had finished his analysis. By the late summer, six companies were producing his vaccine.
It has become commonplace to describe the speed with which vaccines were devised for Covid-19 as unprecedented. But it was not. The first New York Times report of the outbreak in Hong Kong—three paragraphs on page 3—was on April 17, 1957. By July 26, little more than three months later, doctors at Fort Ord, Calif., began to inoculate recruits to the military.
Surgeon General Leroy Burney announced on August 15 that the vaccine was to be allocated to states according to population size but distributed by the manufacturers through their customary commercial networks. Approximately 4 million one-milliliter doses were released in August, 9 million in September and 17 million in October.
This amounted to enough vaccine for just 17% of the population, and vaccine efficacy was found to range from 53% to 60%. But the net result of Hilleman’s rapid response to the Asian flu was to limit the excess mortality suffered in the U.S.
A striking contrast between 1957 and the present is that Americans today appear to have a much lower tolerance for risk than their grandparents and great-grandparents. As one contemporary recalled, “For those who grew up in the 1930s and 1940s, there was nothing unusual about finding yourself threatened by contagious disease. Mumps, measles, chicken pox and German measles swept through entire schools and towns; I had all four….We took the Asian flu in stride. We said our prayers and took our chances.”
D.A. Henderson, who as a young doctor was responsible for establishing the CDC Influenza Surveillance Unit, recalled a similar sangfroid in the medical profession: “From one watching the pandemic from very close range…it was a transiently disturbing event for the population, albeit stressful for schools and health clinics and disruptive to school football schedules.”
Compare these stoical attitudes with the strange political bifurcation of reactions we saw last year, with Democrats embracing drastic restrictions on social and economic activity, while many Republicans acted as if the virus were a hoax. Perhaps a society with a stronger fabric of family life, community life and church life was better equipped to withstand the anguish of untimely deaths than a society that has, in so many ways, come apart.
A further contrast between 1957 and 2020 is that the competence of government would appear to have diminished even as its size has expanded. Government employees in the U.S.—federal, state and local—numbered 7.8 million in November 1957 and around 22 million in 2020, a nearly threefold increase, compared with a doubling of the population. Federal net outlays were 16.2% of GDP in 1957 versus 20.8% in 2019.
The Department of Health, Education and Welfare was just four years old in 1957. The CDC had been established in 1946, with the eradication of malaria as its principal objective. These relatively young institutions appear to have done what little was required of them in 1957, namely to reassure the public that the disastrous pandemic of 1918-19 was not about to be repeated, while helping the private sector to test, manufacture and distribute the vaccine. The contrast with the events of 2020 is once again striking.
It was widely accepted last year that economic lockdowns—including shelter-in-place orders confining people to their homes—were warranted by the magnitude of the threat posed to healthcare systems. But the U.S. hospital system was not overwhelmed in 1957-58 for the simple reason that it had vastly more capacity than today. Hospital beds per thousand people were approaching their all-time high of 9.18 in 1960, compared with 2.77 in 2016.
In addition, the U.S. working population simply did not have the option to work from home in 1957. In the absence of a telecommunications infrastructure more sophisticated than the telephone (and a quarter of U.S. households still did not have a landline in 1957), the choice was between working at one’s workplace or not working at all.
Last year, the combination of insufficient hospital capacity and abundant communications capacity made something both necessary and possible that would have been unthinkable two generations ago: a temporary shutdown of a substantial proportion of economic activity, offset by massive debt-financed government transfers to compensate for the loss of household income. That this approach will have a great many unintended adverse consequences already seems clear. We are fortunate indeed that the spirit of the vaccine king Maurice Hilleman has lived on at Moderna and Pfizer, because much else of the spirit of 1957 would appear to have vanished.
Despite the pandemic, people thronged the beach and boardwalk at Coney Island in July 1957.
“To be young was very heaven” in 1957—even with a serious risk of infectious disease (and not just flu; there was also polio and much else). By contrast, to be young in 2020 was—for most American teenagers—rather hellish. Stuck indoors, struggling to concentrate on “distance learning” with irritable parents working from home in the next room, young people experienced at best frustration and at worst mental illness.
We have done a great deal over the past year (not all of it effective) to protect the groups most vulnerable to Covid-19, which has overwhelmingly meant the elderly: 80.4% of U.S. Covid deaths, according to the CDC, have been among people 65 and older, compared with 0.2% among those under 25. But the economic and social costs, in terms of lost education and employment, have been disproportionately shouldered by the young.
The novel that captured the ebullience of the Beat Generation was Jack Kerouac’s “On the Road,” another hit of 1957. It begins, “I had just gotten over a serious illness that I won’t bother to talk about.” Stand by for “Off the Road,” the novel that will sum up the despondency of the Beaten Generation. As we dare to hope that we have gotten over our own pandemic, someone out there must be writing it.
This essay is adapted from Mr. Ferguson’s new book, “Doom: The Politics of Catastrophe,” which will be published by Penguin Press on May 4. He is a senior fellow at the Hoover Institution at Stanford University.
Braveheartfelt.
Scottish nationalism was my gateway drug to politics. The year was 1979. I was 15 — an age when it is easy to confuse the mood of a crowd of football fans with the case for a constitutional change. The year before, Scotland had made it to the soccer World Cup finals in Argentina. They had not progressed beyond the first round, having lost to Peru and drawn with Iran, but they had somehow managed to beat the Netherlands 3-2, and the beautiful goal scored by Archie Gemmill was engraved in all our memories.
Even more importantly, just weeks before the first referendum on Scottish devolution — which was held on March 1, 1979 — Scotland’s rugby team had held England to a 7-7 draw at Twickenham Stadium. (In Scotland a tie with the English is a “moral victory.”) I still recall the rush of adrenaline, the tingling of the spine, the readiness to fix bayonets and charge that I felt in those days when my school friends and I sang “Flower of Scotland,” the unofficial national anthem. Marx famously wrote that religion is the opium of the people. But nationalism is the cocaine of the bourgeoisie.
What happened next was cold turkey: my painful introduction to the prosaic realities of politics. The 1979 referendum asked: “Do you want the provisions of the Scotland Act 1978 to be put into effect?” This was a reference to legislation passed by the moribund Labour government of Prime Minister James Callaghan, which envisaged the creation of a Scottish Assembly. Just over half of those who voted (51.6%) said “Yes.”
However, an amendment to the act (the work of George Cunningham, a Scot who was member of Parliament for a London constituency) had required at least 40% of the total electorate to vote “Yes” or the entire act would be repealed. As turnout was just 64%, the yes vote fell short, as it represented less than a third of those eligible to vote. I was in equal measure dejected and indignant. I dimly recall at least one fistfight on the train to school with a boy who expressed satisfaction with the result.
Those days are past now, in the mawkish words of “Flower of Scotland,” and in the past they must remain. Political sentiments such as these — the intoxicating cocktail of patriotism and pugnacity — are things to be grown out of, like binge drinking and attention-seeking clothes. I went to university in England, studied history, and realized what utter rubbish it all was.
The poor, oppressed Scots, ground down by the English? King James VI of Scotland could not have been more delighted when he inherited the English crown in 1603, becoming James I.
The Scottish elite embraced the union of the nations’ parliaments in 1707 because it gave them access to English wealth after the disastrous losses of the Darien scheme (to colonize what is now Panama). As the British Empire expanded in the 18th and 19th centuries, Scots were overrepresented in almost every role, staffing the East India Company, running the Caribbean plantations, and settling the Americas and the antipodes — a central theme of my book “Empire.”
In the world wars of the 20th century, the Scottish regiments did a disproportionate share of the fighting for king and country. “Not f***ing likely, you yellow bastard!” was the reaction of one member of the 51st (Highland) Division when ordered to lay down his arms by an English officer in June 1940. Such evidence lends credence to the story about the two Highlanders watching the evacuation of the beaches at Dunkirk. “Aye, Jock,” said one to the other, “If the English surrender, it’ll be a long war.”
In peacetime, too, the U.K. has given generations of Scots far greater opportunities for advancement than they would otherwise have enjoyed. In all, 11 prime ministers can be counted as Scots (Bute, Aberdeen, Gladstone, Rosebery, Balfour, Campbell-Bannerman, Law, MacDonald, Douglas-Home, Blair and Brown), while David Cameron’s father was Ian Donald Cameron, born in Aberdeenshire. One reason the current prime minister, Boris Johnson, polls very badly in Scotland is that, although he is the American-born great-grandson of an Ottoman pasha, he has risen to power by playing the part of a caricature English toff.
Unfortunately, and partly because of Boris-itis, the utter rubbish of Scottish nationalism is back—yet again. On May 6, voters in Scotland go to the polls. The latest opinion surveys suggest that the Scottish National Party could win an outright majority. If so, they will demand a referendum on independence, which would bring the total number of referendums on Scotland’s constitutional status to four in the space of five decades.
Scotland got “devolution” at the second attempt in 1997. Soon after his landslide victory in May of that year, Tony Blair (born in Edinburgh, educated at Fettes College there) fulfilled a manifesto commitment by allowing another referendum, this time posing two questions: whether or not there should be a Scottish Parliament; and whether or not such an institution should have “tax-varying powers.”
Both questions were answered decisively in the affirmative (with 74.3% and 63.5%, respectively), and this time turnout didn’t matter. In 1999, a Scottish election was held and, for the first time since 1707, a parliament met in Edinburgh.
The canny Scots who ran the Labour Party and thus the entire U.K. in those days — not only Blair but also Gordon Brown, Donald Dewar, Robin Cook and many others — fondly imagined that devolution would draw the sting of Scottish nationalism. They were catastrophically mistaken.
Having won 56 seats out of 129 in the first Scottish election in 1999, Labour saw its representation shrivel to just 24 in 2016. The Scottish National Party took power in 2007, when it won 47 seats, and gained a majority in 2011 (69 seats). It was in the wake of that victory that Cameron agreed to a third referendum in 2014, this time on the question, “Should Scotland be an independent country?”
Full disclosure: I campaigned for a “No” vote seven years ago and was relieved when we won, 55.3% to 44.7%, despite or perhaps because of a remarkably high turnout (84.6%). That should have settled the matter, especially after the Scottish National Party lost its majority in 2016, but no.
Earlier this year, Johnson seemed determined not to yield to SNP pressure for another referendum on independence. Now, however, the Scottish Conservatives have muddied the waters: “With just four more seats,” they tweeted on April 8, “the SNP will win a majority and hold another divisive independence referendum. YOU can stop it — but ONLY by giving your party list vote to the @ScotTories.” The obvious problem with this argument is that, if it doesn’t work — if people still vote SNP in sufficient numbers to give them a majority — then another independence referendum will be rather hard to refuse.
This helps to explain the recent warning of my Bloomberg Opinion colleague Max Hastings that the U.K. is “dangerously close to an existential crisis,” not only because of the rising risk of Scottish secession, but also because of the increasingly fraught atmosphere on the border between Northern Ireland and the Republic of Ireland. (Hastings foresees “Irish reunification … within a generation.”)
For those who have lost their copies of Tom Nairn’s “The Break-Up of Britain,” published in 1977 when I was a teenage “Scot Nat,” there is a new and updated work: the Scottish journalist Gavin Esler’s “How Britain Ends.” Recent polling bears out Esler’s analysis. Support for Scottish independence surged last year, though the separatists have seen their lead dwindle in recent months, from an average of 6.3% in January to just 0.6% in March.
More striking is the evidence that the British population as a whole would not mind very much if the U.K. fell apart. In a Sunday Times poll in January, nearly half (49%) of English and Welsh voters and 60% of Northern Irish voters said they thought Scottish independence was likely in the next 10 years, and 45% of English voters said they would be either “pleased” or “not bothered” if it happened. No fewer than 71% of Scottish voters and 57% of English voters would be either “pleased” or “not bothered” by Irish reunification.
These shifts in attitudes would seem to illustrate that the only law of history is the law of unintended consequences. When the votes were cast for and against Brexit back in 2016, the divergences were almost as startling as the overall result. England and Wales voted to leave the European Union by roughly the same margin (53% to 47%). But Scotland voted to remain in the EU by 62% to 38%, and Northern Ireland by 56% to 44%.
The conventional view is that this divergence presented the SNP with a perfect opportunity to resuscitate the dream of Scottish independence. It is not just that Brexit is posing real problems for parts of Scotland’s economy. It is the fact that staying in the U.K. no longer looks as economically safe an option as it seemed to be in 2014. And now independence can be represented as a way back into Europe.
And yet, I wonder just how convincing this argument really is. Unlike in 2014, Scotland is already out of the EU but, just as in 2014, it would have to apply to rejoin it after an independence vote. There is enough opposition to separatism per se among existing member states to make this hard or, at least, slow (step forward Spain, which has its own separatists to contend with in Catalonia). And Scotland would find the waiting room already crowded with “candidate countries”: Albania, the Republic of North Macedonia, Montenegro, Serbia and Turkey. (Some would say the SNP would feel right at home in such company, as the party’s culture is much more Balkan than Baltic, much less Scandinavian.)
Independence would also create a significant headache from the point of view of national security. Since 1998, when the U.K. decommissioned its tactical WE.177 bombs, the Trident program has been Britain’s only operational nuclear weapons system. It consists of four Vanguard-class submarines based at Faslane on Gare Loch, 40 miles northwest of Glasgow. It is also rather hard to imagine the British army without the Scottish regiments — even if, as Simon Akam shows in his unsentimental book “The Changing of the Guard,” the Caledonian martial tradition today is a shadow of its old self.
As for the monetary and fiscal difficulties of independence, they are all but insuperable. What share of the U.K. national debt would an independent Scotland inherit? What currency would it use? The country’s biggest banks already issue their own distinctive banknotes, but they are entirely backed by Bank of England notes. Scotland could not easily adopt the euro while outside the EU (though Kosovo has done this).
And let’s not forget that the pre-1707 “pound Scots” was worth one-twelfth of an English pound. How would Scotland fare without its current subsidy from the South? “It’s Scotland’s oil” is not much of a slogan when fossil fuels are supposedly being phased out, and North Sea oil is running out anyway (production peaked in 1999, and has declined steeply since then).
Then there is the small matter of the SNP’s far-from-impressive record in running those public services that have been its responsibility for the past 14 years. First Minister Nicola Sturgeon’s boasts last year that Scotland was handling Covid-19 better than England now look hollow, while credit for Britain’s successful vaccine rollout is justly going to the U.K. government. A new report by Oxford Economics for The Hunter Foundation sheds unflattering light on Scotland’s recent economic performance, pointing to low productivity, too few startups and even less success in scaling businesses.
As for education, where the country once led both England and Europe, today Scottish pupils do worse in mathematics than those in the Czech Republic, Estonia and Slovenia — and England. Funnily enough, an independent Organization for Economic Cooperation and Development report on the state of Scottish schools won’t be appearing until after next month’s election.
If the Scots were taught their own history better, they might also be less attracted by the idea of independence. For the country’s experience prior to the Union was hardly one of unalloyed happiness. To read the historical novels of Walter Scott — notably “Waverley” (1814), “Old Mortality” (1816), “Rob Roy” (1817), “A Legend of Montrose” (1819) and “The Abbot” (1820) — is to be reminded vividly that Scotland up until the defeat of the Jacobites at Culloden in 1746 was an exceptionally violent country, characterized by bitter internecine strife between Highlanders and Lowlanders, Catholics and Calvinists, MacDonalds and Campbells.
The miracle of Scottish history is that a country that for centuries so closely resembled Afghanistan in our own time — torn apart by compulsively warring mountain clans and religious fanatics, and subject to recurrent foreign interference — should have transformed itself in the space of a generation into a cradle of the Enlightenment. As Scott observed in the postscript to “Waverley”: “There is no European nation which, within the course of half a century or little more, has undergone so complete a change as this kingdom of Scotland.”
Even in 1884, Robert Louis Stevenson (in many ways, Scott’s spiritual heir) was still struck by the fissures within Scottish society. “Two languages, many dialects, innumerable forms of piety, and countless local patriotisms and prejudices, part us among ourselves more widely than the extreme east and west of that great continent of America,” he wrote, from the safety of California. “When I am at home, I feel a man from Glasgow to be something like a rival, a man from Barra to be more than half a foreigner.” Only when the Scots were abroad, Stevenson observed, were these divisions set aside.
Recent events illustrate that Scotland remains as much a land of brutal rifts and feuds as of bonnie lochs and glens. Scott and Stevenson would surely have relished the schism that has opened up within Scottish nationalism itself between Sturgeon and her predecessor as first minister and SNP leader, Alex Salmond. Once, Sturgeon and her husband were Salmond’s proteges. But when Salmond was accused of sexual misconduct in January 2018, the dirks came out.
A governmental probe into Salmond’s conduct was dismissed by a judge as “tainted with apparent bias.” Salmond was charged with 14 offenses, including attempted rape and sexual assault, only to be acquitted. Earlier this year, Salmond told a committee of lawmakers that Sturgeon’s inner circle ran a “deliberate, prolonged, malicious and concerted effort” to damage his reputation, “even to the extent of having me imprisoned.”
Although an inquiry concluded that Sturgeon had not breached the ministerial code, enabling her to win a confidence vote comfortably last month, there is a lingering whiff of unpleasantness in the air, faintly reminiscent of a Glasgow pub the morning after a stooshie. There is sawdust on the floor and carbolic soap in the air, but specks of blood and shards of glass give the game away.
The upshot is that Salmond has launched his own party, called Alba (the Gaelic name for Scotland), and — though he insists that his goal is to advance the cause of independence — we can safely assume that it is vengeance that he wakes up in the wee small hours thinking about. For the Scottish temperament thrives on retaliation in a way the Corleone family cannot match and the English cannot fathom.
For all these reasons, the nationalists can be beaten again. But we know now that merely beating them in another referendum will not suffice. And it is at this point that the experience of another country becomes highly relevant — a country that was once second only to New Zealand when it came to the share of Scottish immigrants in its population.
As John Lloyd of the Financial Times argues in his new book, “Should Auld Acquaintance Be Forgot,” the Canadian federalists finally got the better of the Parti Quebecois after two referendums in 1980 and 1995 — the first effectively on devolution, which the separatists lost, the second on independence (“sovereignty”), which was nail-bitingly close (50.6% No vs. 49.4% Yes).
The subsequent “tough love” argument of Prime Minister Jean Chretien’s government was that it could not solely be up to a slim majority of the voters of Quebec if Canada broke up. The key role was played by Chretien’s minister of intergovernmental affairs, Stephane Dion, who in 1996 posed three carefully crafted questions to the Supreme Court of Canada.
First, was it constitutionally possible for the National Assembly, legislature or government of Quebec to effect the secession of Quebec from Canada unilaterally? Second, did international law allow such a unilateral secession? Third, if there was a conflict between the Canadian constitution and international law, which would take precedence?
The Supreme Court’s answer was that Quebec did not have the right to secede unilaterally under either the constitution or international law. Only if Quebecers expressed a clear will to secede would the federal government be obliged to enter into negotiations with their government, but it was up to the Canadian Parliament to determine if a referendum question was sufficiently clear to trigger such negotiations. The subsequent Clarity Act enshrined this principle in legislation, along with the equally important point that a “clear majority” would need to vote in favor of secession, as opposed to “50% plus one.”
This is how to play the game against secessionists — and it is high time that the Johnson government adopted this approach, rather than unthinkingly accepting the SNP’s argument that it has a moral right to a referendum on secession every time it wins a parliamentary election. The most impressive result of Dion’s approach has been an enduring decline in support for secession in Quebec. One recent poll found that 54% of Quebecers would vote against independence, while just 36% would vote for it. Among younger voters, support is even lower. Only those aged between 55 and 64 still narrowly favor secession.
There is, in any case, something oxymoronic about the idea of Scottish nationalism. For centuries, the Scots have been defined as a people by their absence from Scotland. (Think of the Proclaimers’ “Letter from America.”) By one estimate, the number of people outside Scotland who identify as Scots is around 18 million in the New World alone, including 6 million in the U.S. (not counting an almost equal number of Scots-Irish, meaning descendants of Ulstermen), 5 million in Canada and nearly 2 million in Australia.
There are Scots everywhere, from Dunedin to Nova Scotia, from Patagonia to Hong Kong (a city that was of course founded by Scotsmen). There are more people called Ferguson in Kingston, Jamaica, than in Dundee and Aberdeen put together.
The great Scottish historian and philosopher David Hume was always contemptuous of what he called the “vulgar motive of national antipathy.” “I am a Citizen of the World,” he wrote in 1764. That is what I learned to be after I left Scotland at the age of 17. Do I still exult at Scottish sporting victories? Yes, I do — though God knows they are few and far between.
But I no longer confuse that exultation with politics. I long ago kicked the cocaine of the bourgeoisie. And I believe it would greatly benefit the current residents of my native land to do the same.
The Fort Knox of the future. Photographer: Andrey Rudakov/Bloomberg via Getty Images
What is the money of the future? My nine-year-old son thinks it will be Robux. For those of you trapped in the human museum known as adulthood, Robux is the currency used by players of Roblox computer games. If I offer Thomas grimy dollar bills for household chores, he shows an almost complete lack of interest and motivation. But if I offer him Robux, it’s a different story.
The current exchange rate is around 80 to the dollar. So, in order to incentivize my son to do the dishes, I need to go online and buy 2,000 Robux for $24.99. This I do by entering my credit card details on a website, an act of self-exposure that never fails to make me feel sick. However, the dishes get cleaned and, later, my son blows some of his Robux on a cool new outfit and a pair of wings for his avatar, earning the admiration of his friends.
Robux is just one of the new forms of money that exist in the parallel world of online gaming. If your kids play Fortnite, then you’ve probably had to buy them V-Bucks (short for VinderBucks). And gamer money is, in turn, just a subset of the myriad means of payment that now exist on the internet.
Writers of science fiction got many things right about the future, from pandemics to flying cars to artificial intelligence. None, so far as I know, got the future of money exactly right. In William Gibson’s seminal Neuromancer (1984), paper money (the “new yen” or N¥) has survived but is used only for illicit transactions. In Neal Stephenson’s Snow Crash (1992), hyperinflation has ravaged the value of the dollar so much that, in Compton, California, “Street people push … wheelbarrows piled high with dripping clots of million- and billion-dollar bills that they have raked up out of storm sewers.” A trillion-dollar bill is known colloquially as an “Ed Meese.” A quadrillion is a “Gipper.” (Only we Boomers now get the allusions to the former attorney general and the president he served in the 1980s.) In other dystopian futures, readily available commodities such as bullets or bottle caps serve as makeshift money, rather like cigarettes in occupied Germany in the immediate aftermath of World War II. My favorite imagined currencies are the “merits” in the British TV show Black Mirror, which have to be earned by pedaling on exercise bikes.
If some other author predicted the future of money accurately, I missed it. Unfortunately, this lack of foresight now seems also to afflict U.S. policymakers, leaving the world’s financial hegemon vulnerable to a potentially fatal challenge. Not only are the American monetary authorities underestimating the threat posed to dollar dominance by China’s pioneering combination of digital currency and electronic payments. They are also treating the blockchain-based financial innovations that offer the best alternative to China’s e-yuan like gatecrashers at their own exclusive party.
Let’s begin with the future of money that no one foresaw.
In 2008, in a wonkish paper that bore no relation to any sci-fi, the enigmatic Satoshi Nakamoto launched Bitcoin, “a purely peer-to-peer version of electronic cash” that allows “online payments to be sent directly from one party to another without going through a financial institution.” In essence, Bitcoin is a public ledger shared by an acephalous (leaderless) network of computers. To pay with bitcoins, you send a signed message transferring ownership to a receiver’s public key. Transactions are grouped together and added to the ledger in blocks, and every node in the network has an entire copy of this blockchain at all times. A node can add a block to the chain (and receive a bitcoin reward) only by solving a cryptographic puzzle chosen by the Bitcoin protocol, which consumes processing power.
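The puzzle-solving step can be illustrated with a toy sketch (an illustration only — the real protocol hashes an 80-byte block header twice against a dynamically adjusted difficulty target, not a string against a zero-prefix):

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Toy proof-of-work: find a nonce so that SHA-256(data + nonce)
    begins with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Finding the nonce takes many hash attempts on average;
# checking a claimed solution takes exactly one.
winning_nonce = mine("Alice pays Bob 1 BTC", difficulty=4)
```

Raising `difficulty` by one multiplies the expected work sixteenfold, which is why mining consumes processing power while verification by every other node stays cheap — the asymmetry the whole system rests on.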
Nodes that have solved the cryptographic puzzle — “miners” — are rewarded not only with transaction fees, but also with newly created bitcoins. This reward is cut in half every 210,000 blocks — roughly every four years — until the total number of bitcoins reaches 21 million, after which no new bitcoins will be created. As I argued here last November, there were good reasons why Bitcoin left gold for dead as the pandemic was wreaking havoc last year. Scarcely over a year ago, when just about every financial asset sold off as the full magnitude of the pandemic sank in, the dollar price of a Bitcoin fell to $3,858. As I write, the price is $58,746.
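The 21 million cap follows directly from the halving schedule: the reward starts at 50 bitcoins per block and halves each epoch until it falls below one satoshi (10⁻⁸ BTC), the smallest unit. A back-of-the-envelope sum of the geometric series (ignoring the protocol’s integer rounding, which shaves off a tiny fraction):

```python
# Bitcoin's issuance schedule: 210,000 blocks per epoch, with the
# per-block reward halving from 50 BTC until it drops below 1 satoshi.
BLOCKS_PER_EPOCH = 210_000
SATOSHI = 1e-8

reward = 50.0
supply = 0.0
while reward >= SATOSHI:
    supply += BLOCKS_PER_EPOCH * reward
    reward /= 2

print(f"{supply:,.0f}")  # ≈ 21,000,000
```

The series converges just shy of 21 million, which is why the supply is finite by construction rather than by anyone’s ongoing promise.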
The reasons for Bitcoin’s success are that it is sovereign (no one controls it, not the “whales” who own a lot, and not the miners who mine a lot), scarce (that 21 million number is final), and — above all — smart. With every day that the system works — not being hacked, not crashing — the predictions that it would prove to be a “shitcoin” look dumber, and the pressure on people to affirm their smartness by owning bitcoins grows stronger. Last year, a bunch of tech companies, including Square, PayPal and Tesla, bought a pile. Several legendary investors — Paul Tudor Jones, Stan Druckenmiller, Bill Miller — came out as long Bitcoin. Perhaps most importantly, Bitcoin began to be treated like a legitimate part of the financial system. BNY Mellon now handles Bitcoin. So does Mastercard. There are now well-functioning Bitcoin futures and options markets. This kind of adoption and integration is what has driven the price upward — a process that has much further to run. My $75,000 target price back in 2018 (assuming that every millionaire would one day want 1% of his or her portfolio in XBT) now looks a bit conservative.
Meanwhile, as Bitcoin has grown more respectable, the cool kids have moved on to decentralized finance (“DeFi”), “an open, permissionless, and highly interoperable protocol stack built on public smart contract platforms” such as the Ethereum blockchain, to quote a recent and excellent St. Louis Fed paper by Fabian Schär. Like Bitcoin, DeFi has no centralized third-party system of verification and regulation. But it is a much looser, more variegated system, with multiple coins, tokens, exchanges, debt markets, derivatives and asset management protocols. As Schär puts it:
This architecture can create an immutable and highly interoperable financial system with unprecedented transparency, equal access rights, and little need for custodians, central clearing houses, or escrow services, as most of these roles can be assumed by ‘smart contracts.’ … Atomic swaps, autonomous liquidity pools, decentralized stablecoins, and flash loans are just a few of many examples that show the great potential of this ecosystem. … DeFi may lead to a paradigm shift in the financial industry and potentially contribute toward a more robust, open, and transparent financial infrastructure.
(I told you it was cool.)
For the true believers, Bitcoin and DeFi are the first steps toward a libertarian Nirvana. In a widely quoted tweet, crypto guru Naval Ravikant added steps three to seven:
Bitcoin is an exit from the Fed.
DeFi is an exit from Wall Street.
Social media is an exit from mass media.
Homeschooling is an exit from industrial education.
Remote work is an exit from 9-5.
Creator economy is an exit from employment.
Individuals are leaving institutions.
We are on our way, according to Pier Kicks, to the “Metaverse” — a “self-sovereign financial system, an open creator economy, and a universal digital representation and ownership layer via NFTs (non-fungible tokens).” Yes, even art is now on the blockchain: Witness the sale by Christie’s last month of “Everydays: the First 5000 Days,” by Mike Winkelmann, aka Beeple, for $69.3 million.
What is the right historical analogy for all this? Allen Farrington argues that Bitcoin is to the system of fiat currencies centered around the dollar what medieval Venice once was to the remnants of the western Roman Empire, as superior an economic operating system as commercial capitalism was to feudalism. Another possibility is that the advent of blockchain-based finance is as revolutionary as that of fractional reserve banking, bond and stock markets in the great Anglo-Dutch financial revolution of the 18th century.
Like all such revolutions, however, this one, too, has produced its haters. Well-known economists such as Nouriel Roubini continue to predict Bitcoin’s demise. Bridgewater founder Ray Dalio has warned that, just as the U.S. government prohibited the private ownership of gold by executive order in April 1933, so the same fate could befall Bitcoin. Perhaps most ominously, the central bankers of the western world remain sniffy. A new line of attack (highly appealing to monetary officials eager to affirm their greenness) is that the electricity consumed by Bitcoin miners makes crypto dirty money.
Are we therefore heading for a collision between the old money and the new? Perhaps. As we approach the end of the first 100 days of Joe Biden’s presidency, I am tempted to paraphrase his former boss’s jab at Mitt Romney back in 2012: “The twentieth century is calling to ask for its economic policy back.” There is something very old-school about the Biden administration.
It believes in Keynesian demand management and stimulus. It is proposing a massive infrastructure investment plan. The result is that fiscal and monetary expansion triggered by a public health emergency seems set to continue beyond the duration of the emergency. The administration’s economists tell us there is nothing to fear from inflation. Meanwhile, in foreign policy, Team Biden seems committed to Cold War II against China. All of this hinges on the enduring credibility of the U.S. dollar as the preeminent international reserve currency and U.S. Treasury bonds as the safest of all financial assets — not to mention the enduring effectiveness of financial sanctions as the ultimate economic weapon. Yet precisely these things are threatened by the rise of an alternative financial system that essentially bypasses the Federal Reserve and potentially also the U.S. Treasury.
So you can see why Ray Dalio might expect the U.S. government at some point to outlaw Bitcoin and other cryptocurrency. The last administration occasionally muttered threats. “Cryptocurrency … provides bad actors and rogue nation states with the means to earn profits,” stated the report of Attorney General William Barr’s Cyber-Digital Task Force last year. Treasury Secretary Steven Mnuchin considered forcing U.S. exchanges to gather more information about individuals withdrawing their Bitcoin. Pro-Bitcoin politicians, such as Miami mayor Francis Suarez, are still in a minority.
Abroad, too, there are plenty of examples of governments moving to limit cryptocurrencies or ban them altogether. “We must do everything possible to make sure the currency monopoly remains in the hands of states,” declared German Finance Minister Olaf Scholz at a G-7 finance ministers meeting in December. The European Commission shows every sign of regulating the fledgling sector with its customary zeal. In particular, the European Central Bank has stablecoins (crypto tokens pegged to fiat currencies) in its sights. China is even more stringent. In 2017, the Chinese Communist Party restricted the ability of its citizens to buy Bitcoin, though Bitcoin mining continues to thrive close to sources of cheap hydroelectricity like the Three Gorges Dam.
But is it actually true that the state should have a monopoly on money? That is a distinctly German notion, stated most explicitly in Georg Friedrich Knapp’s State Theory of Money (1905). History begs to differ. Although states have sometimes sought to monopolize money creation, and although a state monopoly on the enforcement of debt contracts is preferable, a monopoly on money is far from natural or even necessary. For most of history, states have been satisfied with determining what is legal tender — that is, what can be used to discharge contractual obligations, including tax payments. This power to specify legal tender drove the great monetization of economy and society in Ming China and in Europe after the Black Death.
Money, it is conventional to argue, is a medium of exchange, which has the advantage of eliminating inefficiencies of barter; a unit of account, which facilitates valuation and calculation; and a store of value, which allows economic transactions to be conducted over long time periods as well as geographical distances. To perform all these functions optimally, the ideal form of money has to be available, affordable, durable, fungible, portable and reliable. Because they fulfill most of these criteria, metals such as gold, silver and bronze were for millennia regarded as the ideal monetary raw material. Rulers liked to stamp coins with images (often crowned heads) that advertised their authority. But in ancient Mesopotamia, beginning around five thousand years ago, people used clay tokens to record transactions involving agricultural produce like barley or wool, or metals such as silver. Such tablets performed much the same function as a banknote. Often, through the centuries, traders have devised such tokens or bills without government involvement, especially at times when coins have been in short supply or debased and devalued.
In the modern fiat monetary system, the central bank, itself supposedly independent of the state, can influence the money supply, but it does not monopolize money creation. In addition to state-created cash — the so-called high-powered money or monetary base — most money is digital credits from commercial banks to individuals and firms. As I argued in The Ascent of Money (2008), money is trust inscribed, and it does not seem to matter much whether it is inscribed on silver, on clay, on paper — or on a liquid crystal display. All kinds of things have served as money, from the cowrie shells of the Maldives to the stone discs used on the Pacific island of Yap.
Although Bitcoin currently looks to outsiders like a speculative asset, in practice it performs at least two of the three classic functions of money quite well, or soon will, as adoption continues. It can be (like gold) both a store of value and a unit of account. And, as my Hoover Institution colleague Manny Rincon-Cruz has suggested, it may be that the three classic functions of money are in fact something of a trilemma. Most forms of money can perform at least two of the three; it’s impossible or very hard to do all three. Bitcoin is not an ideal medium of exchange precisely because its ultimate supply is fixed and not adaptive, but that’s not a fatal limitation. In many ways, it is Bitcoin’s unique advantage.
In other words, Bitcoin and Ethereum, as well as a great many other digital coins and tokens, are stateless money. And the more they can perform at least two out of three monetary functions tolerably well, the less that banning them is going to work — unless every government agrees to do so simultaneously, which seems like a stretch. The U.S. isn’t going to ban Bitcoin, just tax it whenever you convert bitcoins into dollars.
The right question to ask is therefore whether or not the state can offer comparably appealing forms of digital money. And this is where the Chinese government has been thinking a lot more creatively than its American or European counterparts. As is well known, China has led the world in electronic payments, thanks to the vision of Alibaba and Tencent in building their Alipay and WeChat Pay platforms. In 2020, some 58% of Chinese used mobile payments, up from 33% in 2016, and mobile payments accounted for nearly two-thirds of all personal consumption payments. Banknotes and credit cards have largely yielded to QR codes on smartphones. The financial subsidiary of Alibaba, Ant Group, was poised last year to become one of the world’s biggest financial companies.
Yet the Communist Party became nervous about the scale of electronic payment platforms and sought to clip their wings by cancelling Ant’s planned IPO in November and tightening regulation. At the same time, the People’s Bank of China has accelerated the implementation of its plan for a central bank digital currency (CBDC). In a fascinating article in February, former PBOC governor Zhou Xiaochuan explained the fundamentally defensive character of this initiative. “Blockchain technology features decentralization, but decentralization is not a necessity for modernizing the payments system. It even has some drawbacks,” he wrote. “The possible application of blockchain … is still being researched, but is not ready at this time.”
Last year, the PBOC seized the opportunity presented by the pandemic to rush its CBDC into the hands of Chinese consumers, conducting trials in three cities — Shenzhen, Suzhou and Chengdu — as well as the Xiong’an New Area near Beijing. Crucially, its design is two-tier, with the PBOC dealing with the existing state-owned commercial banks and other entities (including telecom and tech companies), not directly with households and firms. The abbreviation “DC/EP” (with the slash) captures this dual structure. The central bank controls the digital currency, but the electronic payment platforms can participate in the system, alongside the banks, as intermediaries to consumers and businesses. However, the easiest option for consumers will clearly be to withdraw “e-CNY” from bank ATMs onto their smartphones’ e-wallets. The system even allows transactions to happen in the absence of an internet connection via “dual offline technology.” In 2018 I predicted there would soon be “bityuan.” I only got the name wrong.
This new Chinese system not only defends the CCP against the twin threats of crypto and big tech, while ensuring that all Chinese citizens’ transactions are under surveillance; it also includes an offensive capability to challenge the U.S. dollar’s dominance in cross-border payments. And this is where the story gets seriously interesting. Today, as is well known, the dollar dominates the renminbi in foreign exchange markets, central bank reserves, trade finance and bank-to-bank payments through the Belgium-based Society for Worldwide Interbank Financial Telecommunication (SWIFT). This financial superpower, fully appreciated and utilized only after 9/11, is what makes U.S. financial sanctions so effective and far-reaching.
The Chinese are creatively exploring ways to change that. Exhibit A is the Finance Gateway Information Service, a joint venture between SWIFT and the China National Clearing Center within the PBOC, which aims to direct all cross-border yuan payments through China’s own settlement system, Cross-Border Interbank Payment and Clearing. Exhibit B is the Multiple CBDC (mCBDC) Bridge project by the Hong Kong Monetary Authority and the Bank of Thailand to implement a cross-border payments system based on distributed ledgers, again using a two-tier system. Exhibit C is the scheme for cross-border transfers between Hong Kong and Shenzhen currently being piloted. According to Sahil Mahtani of the South African investment manager Ninety One, the ultimate goal of Chinese policy is “to create a parallel payments network — one beyond American oversight — thereby crippling U.S. sanctions policy.” In Mahtani’s words:
The expansion of a Chinese digital currency will ultimately pry open the U.S. grip over global payments, and therefore compromise U.S. sanctions policy and a significant measure of U.S. power in the world. … It is not that China’s digital currency is going to become the dominant standard of payments … But it could become one standard, creating a parallel system with which to avoid the long arm of U.S. regulation.
What does the United States have to offer in response? When Mark Carney, the former Governor of the Bank of England, argued for a “synthetic hegemonic currency” at Jackson Hole in 2019, he was politely ignored. When Mark Zuckerberg proposed a Facebook stablecoin, Libra, he was impolitely rebuffed. (Libra has been renamed “Diem,” but getting regulatory approval still looks like an uphill struggle.) According to Tyler Goodspeed, who recently left the Council of Economic Advisers to join us at Hoover, “If you’re issuing very short-term liquid liabilities that are redeemable on demand for, say, dollars or euros, and you’re backing that commitment by holding highly liquid dollar- or euro-denominated bills, then I’m sorry to say it but you are a bank.”
Other countries are exploring creating their own CBDCs — 60% of more than 60 central banks surveyed by the Bank for International Settlements last year. Cambodia and the Bahamas are already there. Even the European Central Bank has not said “non” or “nein,” though Bundesbank head Jens Weidmann is not alone in worrying that an e-euro might disintermediate Europe’s already ailing banks unless the Chinese two-tier model is adopted.
And the Fed? According to Chair Jay Powell, some of his officials are working with economists at the Massachusetts Institute of Technology to explore the feasibility of a U.S. CBDC. But, says Powell, “there is no need to rush.” Like his “What, me worry?” approach to inflation, this smacks of insouciance. China is seeking in plain sight to build an alternative international payments system to that of the U.S. dollar, and there’s no need to rush to meet this challenge? Nor any thought of actively integrating Bitcoin — a tried and tested decentralized form of “digital gold” — into the U.S. financial system, rather than treating it as a rather suspect parvenu?
If the future of money arrives as rapidly as I think it will, in the form of a widely adopted e-CNY, do not be surprised if all we can offer our kids are Robux.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
Diplomacy to a tee. Photographer: Tim Rue/Bloomberg
In a famous essay, the philosopher Isaiah Berlin borrowed a distinction from the ancient Greek poet Archilochus: “The fox knows many things, but the hedgehog knows one big thing.”
“There exists,” wrote Berlin, “a great chasm between those, on one side, who relate everything to … a single, universal, organizing principle in terms of which alone all that they are and say has significance” — the hedgehogs — “and, on the other side, those who pursue many ends, often unrelated and even contradictory” — the foxes.
Berlin was talking about writers. But the same distinction can be drawn in the realm of great-power politics. Today, there are two superpowers in the world, the U.S. and China. The former is a fox. American foreign policy is, to borrow Berlin’s terms, “scattered or diffused, moving on many levels.” China, by contrast, is a hedgehog: it relates everything to “one unchanging, all-embracing, sometimes self-contradictory and incomplete, at times fanatical, unitary inner vision.”
Fifty years ago this July, the arch-fox of American diplomacy, Henry Kissinger, flew to Beijing on a secret mission that would fundamentally alter the global balance of power. The strategic backdrop was the Nixon administration’s struggle to extricate the U.S. from the Vietnam War with its honor and credibility as far as possible intact.
The domestic context was dissension more profound and violent than anything we have seen in the past year. In March 1971, Lieutenant William Calley was found guilty of 22 murders in the My Lai massacre. In April, half a million people marched through Washington to protest against the war in Vietnam. In June, the New York Times began publishing the Pentagon Papers.
Kissinger’s meetings with Zhou Enlai, the Chinese premier, were perhaps the most momentous of his career. As a fox, the U.S. national security adviser had multiple objectives. The principal goal was to secure a public Chinese invitation for his boss, Nixon, to visit Beijing the following year.
But Kissinger was also seeking Chinese help in getting America out of Vietnam, as well as hoping to exploit the Sino-Soviet split in a way that would put pressure on the Soviet Union, America’s principal Cold War adversary, to slow down the nuclear arms race. In his opening remarks, Kissinger listed no fewer than six issues for discussion, including the raging conflict in South Asia that would culminate in the independence of Bangladesh.
Zhou’s response was that of a hedgehog. He had just one issue: Taiwan. “If this crucial question is not solved,” he told Kissinger at the outset, “then the whole question [of U.S.-China relations] will be difficult to resolve.”
To an extent that is striking to the modern-day reader of the transcripts of this and the subsequent meetings, Zhou’s principal goal was to persuade Kissinger to agree to “recognize the PRC as the sole legitimate government in China” and “Taiwan Province” as “an inalienable part of Chinese territory which must be restored to the motherland,” from which the U.S. must “withdraw all its armed forces and dismantle all its military installations.” (Since the Communists’ triumph in the Chinese civil war in 1949, the island of Taiwan had been the last outpost of the nationalist Kuomintang. And since the Korean War, the U.S. had defended its autonomy.)
With his eyes on so many prizes, Kissinger was prepared to make the key concessions the Chinese sought. “We are not advocating a ‘two China’ solution or a ‘one China, one Taiwan’ solution,” he told Zhou. “As a student of history,” he went on, “one’s prediction would have to be that the political evolution is likely to be in the direction which [the] Prime Minister … indicated to me.” Moreover, “We can settle the major part of the military question within this term of the president if the war in Southeast Asia [i.e. Vietnam] is ended.”
Asked by Zhou for his view of the Taiwanese independence movement, Kissinger dismissed it out of hand. No matter what other issues Kissinger raised — Vietnam, Korea, the Soviets — Zhou steered the conversation back to Taiwan, “the only question between us two.” Would the U.S. recognize the People’s Republic as the sole government of China and normalize diplomatic relations? Yes, after the 1972 election. Would Taiwan be expelled from the United Nations and its seat on the Security Council given to Beijing? Again, yes.
Fast forward half a century, and the same issue — Taiwan — remains Beijing’s No. 1 priority. History did not evolve in quite the way Kissinger had foreseen. True, Nixon went to China as planned, Taiwan was booted out of the U.N. and, under President Jimmy Carter, the U.S. abrogated its 1954 mutual defense treaty with Taiwan. But the pro-Taiwan lobby in Congress was able to throw Taipei a lifeline in 1979, the Taiwan Relations Act.
The act states that the U.S. will consider “any effort to determine the future of Taiwan by other than peaceful means, including by boycotts or embargoes, a threat to the peace and security of the Western Pacific area and of grave concern to the United States.” It also commits the U.S. government to “make available to Taiwan such defense articles and … services in such quantity as may be necessary to enable Taiwan to maintain a sufficient self-defense capacity,” as well as to “maintain the capacity of the United States to resist any resort to force or other forms of coercion that would jeopardize the security, or the social or economic system, of the people on Taiwan.”
For the Chinese hedgehog, this ambiguity — whereby the U.S. does not recognize Taiwan as an independent state but at the same time underwrites its security and de facto autonomy — remains an intolerable state of affairs.
Yet the balance of power has been transformed since 1971 — and much more profoundly than Kissinger could have foreseen. China 50 years ago was dirt poor: despite its huge population, its economy was a tiny fraction of U.S. gross domestic product. This year, the International Monetary Fund projects that, in current dollar terms, Chinese GDP will be three quarters of U.S. GDP. On a purchasing power parity basis, China overtook the U.S. in 2017.
[Chart: Middle Kingdom, Top Performer — shares of global GDP based on purchasing power parity. Source: International Monetary Fund, World Economic Outlook Database, October 2020; estimates start after 2019.]
In the same time frame, Taiwan, too, has prospered. Not only has it emerged as one of Asia’s most advanced economies, home to Taiwan Semiconductor Manufacturing Co., the world’s top chip manufacturer. Taiwan has also become living proof that an ethnically Chinese people can thrive under democracy. The authoritarian regime that ran Taipei in the 1970s is a distant memory. Today, it is a shining example of how a free society can use technology to empower its citizens — which explains why its response to the Covid-19 pandemic was by any measure the most successful in the world (total deaths: 10).
As Harvard University’s Graham Allison argued in his hugely influential book, “Destined for War: Can America and China Escape Thucydides's Trap?”, China’s economic rise — which was at first welcomed by American policymakers — was bound eventually to look like a threat to the U.S. Conflicts between incumbent powers and rising powers have been a feature of world politics since 431 BC, when it was the “growth in power of Athens, and the alarm which this inspired in Sparta” that led to war. The only surprising thing was that it took President Donald Trump, of all people, to wake Americans up to the threat posed by the growth in the power of the People’s Republic.
Trump campaigned against China as a threat mainly to U.S. manufacturing jobs. Once in the White House, he took his time before acting, but in 2018 began imposing tariffs on Chinese imports. Yet he could not prevent his preferred trade war from escalating rapidly into something more like Cold War II — a contest that was at once technological, ideological and geopolitical. The foreign policy “blob” picked up the anti-China ball and ran with it. The public cheered them on, with anti-China sentiment surging among both Republicans and Democrats.
Trump himself may have been a hedgehog with a one-track mind: tariffs. But under Secretary of State Mike Pompeo, U.S. policy soon reverted to its foxy norm. Pompeo threw every imaginable issue at Beijing, from the reliance of Huawei Technologies Co. on imported semiconductors, to the suppression of the pro-democracy movement in Hong Kong, to the murky origins of Covid-19 in Wuhan.
Inevitably, Taiwan was added to the list, but the increased arms sales and diplomatic contacts were not given top billing. When Richard Haass, the grand panjandrum of the Council on Foreign Relations, argued last year for ending “strategic ambiguity” and wholeheartedly committing the U.S. to upholding Taiwan’s autonomy, no one in the Trump administration said, “Great idea!”
Yet when Pompeo met the director of the Communist Party office of foreign affairs, Yang Jiechi, in Hawaii last June, guess where the Chinese side began? “There is only one China in the world and Taiwan is an inalienable part of China. The one-China principle is the political foundation of China-U.S. relations.”
So successful was Trump in leading elite and popular opinion to a more anti-China stance that President Joe Biden had no alternative but to fall in line last year. The somewhat surprising outcome is that he is now leading an administration that is in many ways more hawkish than its predecessor.
Trump was no cold warrior. According to former National Security Adviser John Bolton’s memoir, the president liked to point to the tip of one of his Sharpies and say, “This is Taiwan,” then point to the Resolute desk in the Oval Office and say, “This is China.” “Taiwan is like two feet from China,” Trump told one Republican senator. “We are 8,000 miles away. If they invade, there isn’t a f***ing thing we can do about it.”
Unlike others in his national security team, Trump cared little about human rights issues. On Hong Kong, he said: “I don’t want to get involved,” and, “We have human-rights problems too.” When President Xi Jinping informed him about the labor camps for the Muslim Uighurs of Xinjiang in western China, Trump essentially told him “No problemo.” On the 30th anniversary of the 1989 Tiananmen Square massacre, Trump asked: “Who cares about it? I’m trying to make a deal.”
The Biden administration, by contrast, means what it says on such issues. In every statement since taking over as secretary of state, Antony Blinken has referred to China not only as a strategic rival but also as a violator of human rights. In January, he called China’s treatment of the Uighurs “an effort to commit genocide” and pledged to continue Pompeo’s policy of increasing U.S. engagement with Taiwan. In February, he gave Yang an earful on Hong Kong, Xinjiang, Tibet and even Myanmar, where China backs the recent military coup. Earlier this month, the administration imposed sanctions on Chinese officials it holds responsible for sweeping away Hong Kong’s autonomy.
In his last Foreign Affairs magazine article before joining the administration as its Asia “tsar,” Kurt Campbell argued for “a conscious effort to deter Chinese adventurism … This means investing in long-range conventional cruise and ballistic missiles, unmanned carrier-based strike aircraft and underwater vehicles, guided-missile submarines, and high-speed strike weapons.” He added that Washington needs to work with other states to disperse U.S. forces across Southeast Asia and the Indian Ocean and “to reshore sensitive industries and pursue a ‘managed decoupling’ from China.”
In many respects, the continuity with the Trump China strategy is startling. Neither the trade war nor the tech war has been ended. Aside from actually meaning the human rights stuff, the only other big difference between Biden and Trump is the former’s far stronger emphasis on the importance of allies in this process of deterring China — in particular, the so-called Quad the U.S. has formed with Australia, India and Japan. As Blinken said in a keynote speech on March 3, for the U.S. “to engage China from a position of strength … requires working with allies and partners … because our combined weight is much harder for China to ignore.”
This argument took concrete form last week, when Campbell told the Sydney Morning Herald that the U.S. was “not going to leave Australia alone on the field” if Beijing continued its current economic squeeze on Canberra (retaliation for the Australian government’s call for an independent inquiry into the origins of the pandemic). National Security Adviser Jake Sullivan has been singing from much the same hymn-sheet. Biden himself hosted a virtual summit for the Quad’s leaders on March 12.
The Chinese approach remains that of the hedgehog. Several years ago, I was told by one of Xi’s economic advisers that bringing Taiwan back under the mainland’s control was his president’s most cherished objective — and the reason he had secured an end to the informal rule that had confined previous Chinese presidents to two terms. It is for this reason, above all others, that Xi has presided over a huge expansion of China’s land, sea and air forces, including the land-based DF‑21D missiles that could sink American aircraft carriers.
While America’s multitasking foxes have been adding to their laundry list of grievances, the Chinese hedgehog has steadily been building its capacity to take over Taiwan. In the words of Tanner Greer, a journalist who writes knowledgeably on Taiwanese security, the People’s Liberation Army “has parity on just about every system the Taiwanese can field (or buy from us in the future), and for some systems they simply outclass the Taiwanese altogether.” More importantly, China has created what’s known as an “anti-access/area-denial” (A2/AD) bubble to keep U.S. forces away from Taiwan. As Lonnie Henley of George Washington University pointed out in congressional testimony last month, “if we can disable [China’s integrated air defense system], we can win militarily. If not, we probably cannot.”
As a student of history, to quote Kissinger, I see a very dangerous situation. The U.S. commitment to Taiwan has grown verbally stronger even as it has become militarily weaker. When a commitment is said to be “rock-solid” but in reality has the consistency of fine sand, there is a danger that both sides miscalculate.
I am not alone in worrying. Admiral Phil Davidson, the head of U.S. forces in the Indo-Pacific, warned in his February testimony before Congress that China could invade Taiwan by 2027. Earlier this month, my Bloomberg Opinion colleague Max Hastings noted that “Taiwan evokes the sort of sentiment among [the Chinese] people that Cuba did among Americans 60 years ago.”
Admiral James Stavridis, also a Bloomberg Opinion columnist, has just published “2034: A Novel of the Next World War,” in which a surprise Chinese naval encirclement of Taiwan is one of the opening ploys of World War III. (The U.S. sustains such heavy naval losses that it is driven to nuke Zhanjiang, which leads in turn to the obliteration of San Diego and Galveston.) Perhaps the most questionable part of this scenario is its date, 13 years hence. My Hoover Institution colleague Misha Auslin has imagined a U.S.-China naval war as soon as 2025.
In an important new study of the Taiwan question for the Council on Foreign Relations, Robert Blackwill and Philip Zelikow — veteran students and practitioners of U.S. foreign policy — lay out the four options they see for U.S. policy, of which their preferred is the last:
The United States should … rehearse — at least with Japan and Taiwan — a parallel plan to challenge any Chinese denial of international access to Taiwan and prepare, including with pre-positioned U.S. supplies, including war reserve stocks, shipments of vitally needed supplies to help Taiwan defend itself. … The United States and its allies would credibly and visibly plan to react to the attack on their forces by breaking all financial relations with China, freezing or seizing Chinese assets.
Blackwill and Zelikow are right that the status quo is unsustainable. But there are three core problems with all arguments to make deterrence more persuasive. The first is that any steps to strengthen Taiwan’s defenses will inevitably elicit an angry response from China, increasing the likelihood that the Cold War turns hot — especially if Japan is explicitly involved. The second problem is that such steps create a closing window of opportunity for China to act before the U.S. upgrade of deterrence is complete. The third is the reluctance of the Taiwanese themselves to treat their national security with the same seriousness that Israelis bring to the survival of their state.
Thursday’s meeting in Alaska between Blinken, Sullivan, Yang and Chinese Foreign Minister Wang Yi — following hard on the heels of Blinken’s visits to Japan and South Korea — was never likely to restart the process of Sino-American strategic dialogue that characterized the era of “Chimerica” under George W. Bush and Barack Obama. The days of “win-win” diplomacy are long gone.
During the opening exchanges before the media, Yang illustrated that hedgehogs not only have one big idea; they are also very prickly. The U.S. was being “condescending,” he declared, in remarks that overshot the prescribed two minutes by a factor of eight; it would do better to address its own “deep-seated” human rights problems, such as racism (a “long history of killing blacks”), rather than to lecture China.
The question that remains is how quickly the Biden administration could find itself confronted with a Taiwan Crisis, whether a light “quarantine,” a full-scale blockade or a surprise amphibious invasion. If Hastings is right, this would be the Cuban Missile Crisis of Cold War II, but with the roles reversed, as the contested island is even further from the U.S. than Cuba is from Russia. If Stavridis is right, Taiwan would be more like Belgium in 1914 or Poland in 1939.
But I have another analogy in mind. Perhaps Taiwan will turn out to be to the American empire what Suez was to the British Empire in 1956: the moment when the imperial lion is exposed as a paper tiger. When the Egyptian president Gamal Abdel Nasser nationalized the Suez Canal, Prime Minister Anthony Eden joined forces with France and Israel to try to take it back by force. American opposition precipitated a run on the pound and British humiliation.
I, for one, struggle to see the Biden administration responding to a Chinese attack on Taiwan with the combination of military force and financial sanctions envisaged by Blackwill and Zelikow. Sullivan has written eloquently of the need for a foreign policy that Middle America can get behind. Getting torched for Taipei does not seem to fit that bill.
As for Biden himself, would he really be willing to jeopardize the post-pandemic boom his economic policies are fueling for the sake of an island Kissinger was once prepared quietly to trade in pursuit of Cold War detente? Who would be hurt more by the financial crisis Blackwill and Zelikow imagine in the event of war for Taiwan – China, or the U.S. itself? One of the two superpowers has a current account deficit of 3.5% of GDP (Q2 2020) and a net international investment position of nearly minus-$14 trillion, and it’s not China. The surname of the secretary of state would certainly be an irresistible temptation to headline writers if the U.S. blinked in what would be the fourth and biggest Taiwan Crisis since 1954.
Yet think what that would mean. Losing in Vietnam five decades ago turned out not to matter much, other than to the unfortunate inhabitants of South Vietnam. There was barely any domino effect in Asia as a whole, aside from the human catastrophe of Cambodia. Yet losing — or not even fighting for — Taiwan would be seen all over Asia as the end of American predominance in the region we now call the “Indo-Pacific.” It would confirm the long-standing hypothesis of China’s return to primacy in Asia after two centuries of eclipse and “humiliation.” It would mean a breach of the “first island chain” that Chinese strategists believe encircles them, as well as handing Beijing control of the microchip Mecca that is TSMC (remember, semiconductors, not data, are the new oil). It would surely cause a run on the dollar and U.S. Treasuries. It would be the American Suez.
The fox has had a good run. But the danger of foxy foreign policy is that you care about so many issues you risk losing focus. The hedgehog, by contrast, knows one big thing. That big thing may be that he who rules Taiwan rules the world.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.