America Is on the Road to Relapse, Not Recovery

 The U.S. isn’t following the example of countries that have shown what a “smart reopening” entails.

America is on the road. But is it on the road to economic recovery or a pandemic relapse?

Fans of “On the Road” — Jack Kerouac’s 1957 classic of beatnik literature — will recall that its giddy, low-punctuation style is sometimes a little hard to follow. The same might be said of the data Americans are currently generating, some of which undoubtedly points to a rapid (if not quite V-shaped) recovery, and some of which seems to indicate either a second wave of Covid-19 infections or simply the continuation of the first wave.

The two are not separate stories, but rather a single, intertwined narrative. The best title for this tale was devised by my Hoover Institution colleague, the economist John Cochrane. He called it “The Dumb Reopening.” A smart reopening is the sort that has been possible in countries such as Taiwan and South Korea, which were so quick to ramp up testing and contact tracing that they didn’t need to do lockdowns in the first place. Among European countries, Germany and Greece have also successfully adopted these methods, which ensure that any new outbreaks of Covid-19 can quickly be detected, so-called super-spreaders isolated, their recent contacts swiftly traced and tested, and the outbreaks snuffed out.

Other signs of smartness are the persistence of behavioral adaptations by ordinary people, such as social distancing and wearing masks. We know that these practices, which can be adopted by citizens without any government decree, are effective in restricting the spread of the virus SARS-CoV-2.

Less widely appreciated is that social distancing is more effective as policy than lockdowns, as a forthcoming paper in the journal Nature shows. This is also the implication of work by researchers at Oxford’s Blavatnik School, who show that there is no correlation between the stringency of government measures and containment of Covid-19. Measures designed to protect groups that are especially vulnerable to Covid-19 (notably the elderly, especially those with pre-existing conditions) are also smart.

A dumb reopening eschews all such precautionary measures. So is that really what the U.S. is doing? The answer is pretty much yes. Testing has improved, but contact tracing is primitive. And social distancing and mask-wearing are least prevalent where reopening is happening fastest.

The economists I like best prefer data to fancy models. These days they are in clover because the age of the Internet and the smartphone is already a golden era of high-frequency data about economic behavior. When I say, “America is on the road,” I can say it with conviction because mobility data generated by Google, Apple and less well-known tech players such as SafeGraph show it.

Recent official statistics on unemployment and retail sales surprised economists, but they shouldn’t have: The mobility data were already pointing to rapid recovery some weeks ago. In the trough of pandemic panic, between mid-March and mid-April, Apple’s Mobility Trends (which track changes in routing requests to Apple Maps since Jan. 13) pointed to declines in driving and walking of around 60%. (For public transport the decline was 89%.) But since late April, the trend for foot and road traffic has been steadily upward. Requests are now up 12% and 33%, respectively, relative to January. (Transit requests are still down 54%.)

SafeGraph offers a more granular view of foot traffic, based on aggregated and anonymized smartphone location data. Relative to Jan. 2-3, Americans were walking between 60% and 70% less by the beginning of April. But in Dallas and Houston, foot traffic is now just a quarter below the start of the year. General merchandise stores, counter-service restaurants and supermarkets are almost back to where they were.

But perhaps the most useful mobility data for economists come from Google’s Community Mobility reports, which show how visits and length of stay at different places have changed relative to a Jan. 3-Feb. 6 baseline. By subdividing destinations into six categories — retail and recreation, grocery and pharmacy, parks, transit stations, workplaces and residential — the Google data help us zero in on what matters economically.

No recovery was ever driven by visits to parks (up 53% since January, not surprising given the improved weather in most places). Grocery and pharmacy visits weren’t much affected by the pandemic, as they were essential. The big story is retail and recreation: down 49% nationwide at the trough (April 5), but now down just 16%.

[Chart: America Hitting the Stores, month-over-month change]
Mobility data predicted the recent positive statistics on retail sales. In May, the monthly jump in sales reported by the Commerce Department was 17.7%; the monthly jump in Google’s data for retail and recreation visits was 24.4%.

New and old data alike are voraciously devoured by Wall Street analysts. Combined with the Federal Reserve’s multiple liquidity and credit facilities, which are designed to shore up the prices of pretty much all financial assets, they explain why U.S. stocks are back where they were in early March, before the pandemic panic. Mr. Market is acting as if Covid-19 is over. The trend you can infer from the Google data points to July 10 as the date when consumption will be back to normal.

The problem is that Covid-19 isn’t over. As some of us have been warning for some time, the failure to contain the spread of the virus in the U.S. has made a second wave inevitable in many of those places where case numbers had fallen significantly, and a continuation of the first wave inevitable in those places where they had not. The national data for new cases and deaths don’t show this, as they are dominated by improvements in the Northeast (New York and its neighbors).

Eyeballing the latest data on confirmed cases, I see second waves in Arizona, Florida, Idaho, Nevada, Oklahoma, Oregon, Washington and Wyoming, as well as second ripples in Hawaii, Kansas and Montana. First waves continue in California, Mississippi, South Carolina, Tennessee, Texas and Utah. Trends in case numbers, positive tests and hospitalizations look especially worrisome in Arizona, Florida and Texas.

As Americans hit the road in increasing numbers, including longer-range trips to vacation destinations, we can also expect rising numbers of cases in states with hitherto low numbers of Covid-19 infections, such as Montana.

So what’s going to happen next? One possibility is that Americans will recoil from reopening when they see worse data on cases, hospitalizations and mortality in their states — or, more likely, if they see worse cable news reports or Internet clickbait about those things. (Watch for breaking news on Arizona’s intensive care unit capacity.) Any actions by state or municipal authorities to slow down the rush back to normality (mandatory masks in Raleigh, North Carolina, for example) may add to public anxiety.

Polling by Civiqs shows that many Americans — Democrats much more than Republicans — are still “extremely concerned” or “moderately concerned” about Covid-19. Eminent economists — notably Michael Spence, who has sought to match mobility and infection data — look at the dumb reopening and conclude that it will end badly. Spence and his co-author Chen Long warn that “the U.S. is heading for a situation comparable to the Great Depression.” They are in good company: Hardly any leading academic economist believes in the V-shaped recovery story, where output snaps back as far and as fast as it has fallen.

The alternative, and I suspect more likely, scenario is that Americans carry on getting back to normal and tacitly accept further excess mortality as just a cost of doing business until a vaccine is available. That would be bad news for the significant number of Americans who, because of their age and/or pre-existing health problems such as obesity, hypertension or kidney disease, are potentially at serious risk from Covid-19. But it would not be without precedent.

Although many commentators and scholars have looked back to the 1918-19 influenza pandemic for insights into our current predicament, it seems clear by now that SARS-CoV-2 is not as deadly a virus as H1N1 was just over a century ago. Estimates of the infection fatality rate of Covid-19 still range widely, from 0.02% to 0.4%, according to one recent survey (though some recent European serological studies imply higher rates), but the fatality rate of the so-called “Spanish Flu” was probably between 1.8% and 2.2%. Put differently, 675,000 deaths in the U.S. were attributed to influenza and pneumonic complications in 1918-19, of which around 550,000 were “excess deaths.” An equivalent excess death toll in 2020 would be greater than 1.7 million, compared with a figure to date of around 100,000.

Closer in terms of likely mortality is the less well-known “Asian Flu” pandemic of 1957-58. That caused up to 116,000 deaths in the U.S. (the estimates for excess mortality vary widely), which would translate into 215,000 deaths in 2020, roughly what I expect the final U.S. Covid-19 death toll to be.
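The equivalences above are simple population-proportional arithmetic. A minimal sketch, assuming approximate U.S. population figures (roughly 103 million in 1918, 172 million in 1957 and 330 million in 2020; these are my assumptions, not figures from the column):

```python
def scale_to_2020(historical_deaths, historical_population, population_2020=330e6):
    """Scale a historical U.S. death toll to its 2020 population equivalent."""
    return historical_deaths * population_2020 / historical_population

# 1918-19 "Spanish Flu": ~550,000 excess deaths, U.S. population ~103 million
spanish_flu_equiv = scale_to_2020(550_000, 103e6)   # roughly 1.76 million

# 1957-58 "Asian Flu": ~116,000 deaths, U.S. population ~172 million
asian_flu_equiv = scale_to_2020(116_000, 172e6)     # roughly 220,000
```

With these assumed population figures, the results line up with the column’s “greater than 1.7 million” and “215,000 deaths” estimates.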

It is quite probable you have never heard of that pandemic, even though its worldwide death toll was between 700,000 and 1.5 million. This is all the more surprising as, unlike SARS-CoV-2, the H2N2 virus of 1957-58 killed young people. As in most influenza pandemics, significant numbers not only of the very old (over 65) but also of the very young (under 5) died. In terms of excess mortality relative to baseline expected mortality rates, however, it was teenagers who suffered the heaviest losses.

The biggest difference between 1957 and 2020, however, lies in the government and public response to the new pathogen. President Dwight D. Eisenhower did not declare a state of emergency in the fall of 1957. There were no state lockdowns and no school closures. Sick students simply stayed at home, as usual. Work continued more or less uninterrupted; AT&T reported peak absenteeism of 8%. Nor did the Eisenhower administration borrow to the hilt to fund transfers and loans to citizens and businesses. The president asked Congress for a mere $2.5 million (around 0.0005% of 1957 GDP) to support the Public Health Service in case of an epidemic.

True, there was a recession that year, but it had little if anything to do with the pandemic. Eisenhower’s job approval rating deteriorated, declining from about 80% to 50% between January 1957 and March 1958, and his Republican Party sustained severe losses in the 1958 midterms, but no serious historian of the period would attribute these setbacks to the pandemic.

The national mood of insouciance in the face of a new and contagious disease might be summed up in the phrase coined the year before by Mad magazine’s second editor, Al Feldstein: “What, Me Worry?” Huey “Piano” Smith and His Clowns even had a minor hit with “Rockin’ Pneumonia and the Boogie Woogie Flu.”

Whereas public health officials reached a consensus in March of this year that only full “lockdowns” could avert disaster, the Association of State and Territorial Health Officers declared on Aug. 27, 1957, that there would be “no practical advantage in the closing of schools or the curtailment of public gatherings as it relates to the spread of this disease.” As a Centers for Disease Control official later recalled, “ASTHO encouraged home care for uncomplicated influenza cases to reduce the hospital burden and recommended limitations on hospital admissions to the sickest patients … most were advised simply to stay home, rest, and drink plenty of water and fruit juices.”

As today, there was a race to find a vaccine. Unlike today, however, the U.S. had a head start, thanks to the acumen of one exceptionally talented and prescient scientist, Maurice Hilleman, who was chief of the Department of Respiratory Diseases at the Army Medical Center (now the Walter Reed Army Institute of Research) from 1948 to 1957. The first New York Times report of the outbreak in Hong Kong — three paragraphs on page 3 — was on April 17. The Army Medical Center received its first influenza specimens from Hong Kong on May 13. Nine days later, Hilleman had identified the new strain. As early as July 26, doctors at Fort Ord in California began to inoculate military recruits. Approximately 4 million one-milliliter doses were released in August, 9 million in September, and 17 million in October.

It was a different America, no question. For one thing, many Americans today would appear to have a much lower tolerance of risk than their grandparents and great-grandparents six decades ago. As Clark Whelton has recalled:

For those who grew up in the 1930s and 1940s, there was nothing unusual about finding yourself threatened by contagious disease. Mumps, measles, chicken pox, and German measles swept through entire schools and towns; I had all four. Polio took a heavy annual toll, leaving thousands of people (mostly children) paralyzed or dead. There were no vaccines. Growing up meant running an unavoidable gauntlet of infectious disease. For college students in 1957, the Asian flu was a familiar hurdle on the road to adulthood … We took the Asian flu in stride. We said our prayers and took our chances.

But the really striking contrast is how much more competently the Eisenhower administration responded to the pandemic of 1957 than its counterpart today, which lurched from insouciance in January and February to panic in mid-March. This is not to suggest that the threats posed by the two pandemics were identical, nor to suggest that we should simply have dusted down the “What, Me Worry?” playbook. The smart playbook for Covid-19, as we have seen, used early, widespread testing and contact tracing to stamp out super-spreader events, while at the same time protecting the vulnerable.

Rather, having inflicted an immense economic shock on ourselves with lockdowns, we have now tacitly decided to act as if the pandemic is over by returning to normal behavior, largely eschewing the social distancing and mask-wearing that could limit further contagion at a minimal cost. We are, in short, hitting the road as if it were 1957 again, implicitly heading for herd immunity — and substantially more excess deaths — until a vaccine turns up.

As it happens, Kerouac’s “On the Road” was first published in 1957. The book’s second line is: “I had just gotten over a serious illness that I won’t bother to talk about.” By the end of this year, the way things are going, several million Americans will be able to say those same words about Covid-19. Sadly, roughly 200,000 won’t be able to.

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

2020 Is Not 1968. It May Be Worse.

 Social unrest helped doom Lyndon Johnson's presidency. It may end up saving Trump's.

The American death toll is rising. An unpopular president fears for his re-election chances. The U.S. sends men into space. Down on Earth, the economy is in trouble. Racial tensions boil over into rallies, looting and violent confrontations with police in cities across the nation, intensifying political polarization and widening the generational divide. The president considers invoking the 1807 Insurrection Act, which empowers a president to deploy the armed forces and National Guard in any state.

Yes, as writers across the political spectrum such as David Frum, James Fallows, Max Boot, Julian Zelizer and Zachary Karabell have pointed out, 2020 is looking a lot like 1968. For Vietnam, read Covid-19. For Lyndon B. Johnson, read Donald J. Trump. For Apollo 8’s successful orbit of the moon, read the docking of SpaceX’s Crew Dragon with the Space Station. And for Washington, Chicago and many other cities in 1968, read Minneapolis, Atlanta and many other cities in the last few weeks.

Ah yes, interjected Boston Globe columnist Michael Cohen, but today we are dealing with a pandemic. Actually, they had one in 1968 as well: the Hong Kong flu, caused by the influenza virus A/H3N2, which was ultimately responsible for more than 100,000 excess deaths in the U.S. and a million around the world. It’s easy to forget that Woodstock, the following year, was a super-spreader event.

True, since Derek Chauvin killed George Floyd in the street outside Cup Foods in Minneapolis on the night of Monday May 25, there have been protests and riots in dozens of American cities. More curfews have been imposed than in any year since, you guessed it, 1968. But is this the correct analogy? Or is the baby-boomers’ obsession with their own exciting teenage years leading us, not for the first time, to think too much about the late 20th century and not enough about other, more relevant periods?

Like the over-used Weimar analogy, allusions to 1968 are a kind of shorthand — just a superior way of saying, “This is really bad.” I’m betting that most of the people bandying these analogies about haven’t ever pored over documents from 1968 or 1933.

For millennia, historians have noted that pandemics can destabilize the societies they strike. Of the Athenian plague of 430 BC, Thucydides wrote: “The catastrophe was so overwhelming that men, not knowing what would happen next to them, became indifferent to every rule of religion or law.” Defeat at the hands of Sparta in the Peloponnesian War was followed by a period of political instability, culminating in a temporary breakdown of Athenian democracy in 411 BC.

The two great plagues that struck the Roman Empire — the Antonine Plague (165-180 AD), probably a smallpox pandemic, and the Plague of Justinian (542 AD), which was a bubonic plague — also weakened the structures of Roman rule, allowing barbarian invaders to make significant inroads.

Recent scholarship on England after the Black Death of the 1340s shows that efforts by the landowning class to offset the effects of chronic labor shortages led to escalating tensions that ultimately erupted in the Peasants’ Revolt of 1381.

Across Europe, the Black Death prompted a wave of millenarian movements, notably the flagellant orders, groups of men who roamed from town to town whipping themselves in the belief that acts of penance might ward off the Last Judgment. These religious cults often had a revolutionary undertone and came into conflict with local temporal and spiritual hierarchies.

The devastation caused by waves of bubonic and pneumonic plagues — which killed more than a third of the population in many parts of Europe — also led to widespread violence, particularly outbreaks of anti-Semitism. In 1349, for example, the Jewish communities in Cologne, Frankfurt and Mainz were wiped out. Conspiracy theories circulated widely that the Jews had caused the Black Death by poisoning the water supply. The Jews of Strasbourg were offered a choice between conversion and death. Those who refused to convert were burned alive in the Jewish cemetery.

The recurrence of bubonic plague in the 1890s led to conflicts between British rulers and their subjects from South Africa to India. In Honolulu and San Francisco, it led to measures that discriminated against the local Asian population. Such ethnic scapegoating often occurred in situations where a disease seemed to take an outsized toll on a specific community. The 1907 and 1916 polio epidemics hit wealthy, white New York especially hard. (In poorer populations, infants were routinely exposed because of bad sanitation, and therefore were more likely to have antibodies.) Southern European immigrants, particularly Italians, were blamed for the outbreak.

In short, history shows that pandemics all too often exacerbate existing social tensions between classes and ethnic groups. It also provides numerous examples of quarantines and public social restrictions intensifying citizens’ mistrust of the state. In 19th-century Europe, cholera riots were frequent, from St. Petersburg in 1831 to Donetsk in 1892. In North America, smallpox quarantines led mobs to burn down hospitals and police stations. The residents of Marblehead, near Boston, twice rioted against smallpox inoculation, in 1730 and 1773.

The spread of Covid-19 from China to the rest of the world, and the generally inept responses of the U.S. authorities to the pandemic, have combined to create perfect conditions for urban unrest. The disease has disproportionately hurt minority communities, especially African-Americans. In the U.S., as in the U.K., people of color are more likely than whites to work in contagion-exposed, low-skilled, “essential” occupations; to live in crowded conditions; and to have co-morbidities such as obesity and diabetes. The economic consequences of lockdowns have also hit African-Americans harder than white Americans. You really don’t need 1968 to explain 2020.

As a white, middle-aged, upper-middle-class immigrant, I’m hardly the person to speak to the politics of race in America. So I turned to an African-American friend, the economist Roland Fryer, whom I’ve known since we were colleagues at Harvard.

In 2016, he published a brilliant but controversial paper which argued that the police did not disproportionately use lethal violence against black people, though they were more likely to use non-lethal force against them. (Shootings made up more than 90% of fatal incidents.) A paper published last year in the Proceedings of the National Academy of Sciences lent strong support to Fryer’s thesis.

He has a new, unpublished paper that looks at a perverse effect of investigations into police shootings. I asked Fryer to walk me through the argument.

“If you have a police shooting that goes viral online but isn’t investigated,” he explained, “then nothing changes — levels of police activity and crime are about the same. But if you have a viral shooting that is investigated, then police activity plummets, and crime goes up dramatically.” In just five cities (Baltimore; Chicago; Cincinnati; Ferguson, Missouri; and Riverside, California), this led to excess homicides of almost 900 people in the subsequent 24 months, 80% of them black, with an average age of 28.

It's a dangerous Catch-22: You're damned if you don't investigate “viral” incidents, and in even worse shape if you do.

How does Fryer interpret the current protests? “People are fed up,” he told me. “They are frustrated by the disparities they see in educational outcomes. Frustrated by the disparities they see in criminal justice. Frustrated by racial disparities in life expectancy. We are all to blame — this happened on our watch.” And when you add to that the fact that Covid-19 disproportionately affected the black community: “Folks have had enough. People are very much on edge.”

Such conversations, as much as any article or book, change the way I look at an issue. For years, I have confidently said that 1968 was much worse than the present. But could it be the other way around — not in terms of standards of living or rates of violence, but in terms of politics and the perceptions that shape it? That was a question put to me by Coleman Hughes, another African-American friend, whose recent essays on race in America have been essential reading.

In calling himself the “president of law and order” in the White House Rose Garden last Monday, Trump (or more likely his speechwriter) was echoing a mantra of Richard Nixon’s successful 1968 campaign. But Trump is the incumbent, unlike Nixon in 1968. The pandemic and the recession have hit Americans on his watch, just as surely as the Vietnam War escalated on Lyndon Johnson’s. A pandemic at home is very different from a distant war which, in mid-1968, more than a third of Americans still supported. The devastating economic consequences of the lockdowns make the early signs of inflation in 1968 seem trivial. The electorate is radically different from that of 1968: older, but also more ethnically diverse because of immigration and variations in birth rates. Traditional news media did not cover violent protest sympathetically in 1968. In all these respects, Trump’s chances of re-election should look worse than Johnson’s.

Yet on March 31, 1968, Johnson announced that he would not seek a second term because, as he put it on prime-time television, “There is division in the American house now.” Do not expect any such capitulation from Donald Trump. Division in the American house is precisely what gives him a shot at four more years.

Unlike in 1968, in other words, urban unrest with a racial dimension might actually save a beleaguered incumbent. The current wave of protests is in many ways a repeat of more recent events — Ferguson 2014, Charlottesville 2017 — and its main significance may be to shift the American political conversation away from the Trump administration’s incompetent handling of the Covid-19 pandemic, back to the terrain of the culture war, where Trump is an experienced combatant.

Even in 1968, merely using the phrase “law and order” was to invite accusations of racism. In the case of a president who last week fantasized about a MAGA mob joining the fray outside the White House, the charge of insincerity seems well-founded. Trump is indifferent to the law by nature; he thrives on disorder. And he understands much better than his opponent how to spread his message through the complex networks of online “influencers” — many of whom promote conspiracy theories — through which more and more Americans receive their news.

Finally, and perhaps crucially, unlike 52 years ago, November’s election seems very likely to be a two-horse race. There is no George Wallace, the segregationist candidate who took millions of former Democratic votes in 1968, ensuring Nixon’s triumph. Any last-minute third-party candidate, whether Green, Libertarian or otherwise, is unlikely to garner anything like Wallace’s 13.5% of the popular vote.

The result of 2020’s election will look very different from 1968’s. The nightmare is a result like that of 2000: too close to call and decided in the courts.

History strongly suggests that pandemics tend to widen class and ethnic divisions. Covid-19 is no exception. Small wonder a hideous incident of police brutality ignited a wave of outrage: harder hit by the disease, harder hit by the lockdown and the recession, African-American communities were ready to boil over. Scenes of mayhem in nearly every major city in America ought not to bode well for an incumbent on whose watch excess mortality has already surged by 26%, and who is now presiding over an unemployment rate three and a half times higher than in 1968.

Yet the return of the culture war might just prove to be the deus ex machina that extricates Trump from the quagmire of Covid-19. If so, Trump’s many detractors in the commentariat, who have long hoped that he is Richard Nixon without the second term, may come to rue the day they drew the wrong historical analogy.

My crystal ball missed Brexit but got Donald Trump

 Those who make predictions must keep a tally. So how did I do?

It has been nearly 4½ years since I began writing this column, which works out at roughly 240,000 words altogether. As these will be my last words in these pages, it’s time to look back and take stock. If part of your job is to be a pundit then, as the University of Pennsylvania political scientist Philip Tetlock argues in Superforecasting: The Art and Science of Prediction, you need to keep score.

As Tetlock had a dig at me in that book — which was published in 2015, before I began writing for The Sunday Times — this is also a good opportunity to settle a score.

Tetlock was, of course, quite right about most public intellectuals (a term that always makes me think of public conveniences). They seldom hold themselves to account. Nor do they get fired if their predictions are consistently wrong, as long as they are entertaining enough to engage readers.

Since I set up an advisory firm nine years ago, though, my approach has been different — out of necessity, as fund managers differ from newspaper editors in their attitude to predictions. Not only do they notice when you’re wrong, because one or more financial indicators make that clear; they also let you know about it (with grim relish, usually). If you’re wrong too often, it’s goodbye.

So at the beginning of each year we at Greenmantle make predictions about the year ahead, and at the end of the year we see — and tell our clients — how we did. Each December we also rate every predictive statement we have made in the previous 12 months, either “true”, “false” or “not proven”. In recent years, we have also forced ourselves to attach probabilities to our predictions — not easy when so much lies in the realm of uncertainty rather than calculable risk. We have, in short, tried to be superforecasters. And with some success.

Now it’s time to apply the same retrospective scoring to this column. So as to meet my deadline, I’ve picked my first full year at The Sunday Times, which was the annus mirabilis — or horribilis, depending on your politics — beginning on November 1, 2015, the date of my first column.

Three minor themes are worth mentioning. I argued repeatedly that the twin problems of Islamic extremist networks and mass migration from the Muslim world were not likely to go away: “Think of Isis as the Facebook of Islamic extremism” (March 27, 2016). I also began warning, as early as May of that year, that the rise of Silicon Valley’s big tech companies was not an unmitigated boon: “What the state knows is just a fraction of what Facebook knows about you” (May 15). I also noted the dire implications for Labour of the antisemitism of Jeremy Corbyn and his circle (May 1).

But by far the biggest issues of my first year on this page — and subsequent years too — were Britain’s vote to leave the EU and the election of Donald Trump. How did I do?

On Brexit, I was wrong. From the outset, I was a remainer. “The idea that we can . . . separate ourselves from Europe is an illusion,” I wrote on February 21. “For the future of Europe without us would be one of escalating instability.” Impolitely, I called Brexiteers “Angloonies” and “happy morons”. When the remain side lost, I predicted a “stairway to hell” — or at least a recession (June 26). Wrong.

At the end of the year, on December 11, 2016, I made a confession. I had been motivated to back remain more because of “my personal friendship with [David] Cameron and George Osborne” than out of any deep allegiance to the EU. I regretted — and still regret — not urging Cameron to reject “the risible terms that the European leaders offered him back in February on EU migrants’ eligibility for benefits”. That was the moment he should have called their bluff by backing Brexit.

Yet the humiliation of Brexit gave me an advantage over American commentators on the 2016 presidential race. I had moments of doubt, admittedly. I compared Trump to unsuccessful Republican candidates Wendell Willkie (December 13, 2015) and Barry Goldwater (January 31, 2016). On April 3, 2016, I predicted the bursting of the Trump bubble in the Wisconsin primary. Ted Cruz won that, but it didn’t burst the bubble. Far more often, I went against the conventional wisdom that Trump was doomed to lose.

“Trump has the face that fits the ugly mood in America,” was my headline on November 1, 2015. “Trump has both the resources and the incentives to press on. In the current national mood of disaffection with professional politicians, he could seem an attractive alternative to Hillary Clinton . . . The point about Trump is that his appeal is overwhelmingly a matter of style over substance. It is not what he says that a great many white Americans like — it is the way that he says it.”

I was against Trump. I was a signatory of a “never Trump” letter. I repeatedly condemned his “open expressions of racial prejudice and xenophobia”, his isolationism (December 13, 2015) and his fishy bromance with Vladimir Putin (May 8 and October 16, 2016). I regretted that Mike Bloomberg chose not to run (October 23).

But I also saw clearly the strength of his appeal. “Trump is winning,” I wrote on February 28, 2016, “because no other candidate has a more convincing explanation of why so many Republican voters genuinely are worse off today than in 2000 . . . But no one can rule out Democratic defections to Trump when it comes to the crunch on November 8.” On March 6, I imagined Trump winning and running for an unconstitutional third term in 2024. “Trump can beat Hillary Clinton,” I wrote on May 8.

“Can Trump succeed where [Mitt] Romney failed?” I asked on July 21. “Yes . . . many young voters will fail to show up for Clinton. Meanwhile, the white lower class, especially the older cohorts, will turn out for Trump in droves, just as their English counterparts turned out for Brexit.”

The choice between Clinton and Trump was a choice between “snafu” and “fubar”, I wrote on September 18, “but wouldn’t you risk being fubar . . . if it was your only shot at avoiding four more years of snafu?”

“This rage against the global,” I wrote a week later, “is why Trump could win this election. It is why Brexit happened. It is why populists are gaining ground wherever free elections are held.”

I marked my first anniversary at this paper with a column that compared Trump to the Chicago Cubs, the outsiders who had just won the baseball World Series. “He can win,” I wrote, “if there is a differential in turnout between his supporters and [Clinton’s] in the battleground states comparable to the age and ethnicity-based differentials in the UK referendum” (November 6).

Now, dear reader, you are burning to know what I think will happen this November. Bad luck. You will have to seek my superforecast in another publication.

Niall Ferguson is the Milbank Family senior fellow at the Hoover Institution, Stanford, and managing director of Greenmantle

Coronavirus: Aids changed us. Will Covid-19 do the same?

 This virus is to social life what HIV was to sexual life

Social mores change more than you think. If a time machine could take you back to 1981, you would be shocked by how much people smoked, for example, and how much time they spent standing outside telephone boxes that stank of urine, jingling coins in the pockets of their nasty flares or drainpipes, grinding their nasty yellow teeth. If you are a woman, you would be appalled by the overt sexism of male conversation. If you’re not white, you’d be revolted by the casual racism. And if you’re gay . . . well, more about that later. Not that these prejudices have been eradicated, but they were worse then.

So let’s ask ourselves how much social mores are going to change as a result of the Covid-19 pandemic. In the past week, I’ve had several conversations on the topic of “the world after Covid-19”. My immediate response has been: “Why do you use the word ‘after’? Why not ‘with’?”

Yes, there is undoubtedly a benign scenario in which one of the more than 70 teams working on a vaccine against Sars-CoV-2 collects the prize. If all goes well, that vaccine could jump through all the scientific and regulatory hoops, go into mass production and be available some time in the second half of 2021.

In this same happy-ever-after scenario, there are also breakthroughs in Covid-19 therapies. New research confirms that the disease doesn’t do anything much to endanger the lives and health of younger people, and if they do get infected, they get lasting immunity. Summer comes to the northern hemisphere and the contagion recedes. As lockdowns are lifted and people return to their normal gregarious habits, there is no second wave of the pandemic. Far from devastating the southern hemisphere, the disease proves a minor event in Africa.

And — still looking on the bright side — stock markets rally and economies surge to a high-speed V-shaped recovery that makes me, and others who worry about a protracted depression, look silly.

All this is possible, and devoutly to be hoped for. But it is by no means a 100% certainty. Just consider the odds against a successful vaccination. Do we have one for malaria? No. Tuberculosis? No. HIV-Aids? No. (After 40 years of toil, there have been just a handful of phase 3 clinical trials, one of which made the disease worse. The best had a success rate of just 30%.) How about rotavirus, the most common cause of diarrhoea among infants? Yes, they did find a vaccine for that — after 15 years.

Even if a vaccine is found, there will be multiple risks associated with the current rush to devise and deploy one. And even if there are no setbacks, it might turn out to be like influenza: you can get your flu shot each year, but there’s no guarantee you won’t get some other strain than the ones you were vaccinated against.

That’s why we need to give at least some thought to the not-so-nice scenario of living with Covid-19 — at best, the way we live with flu, which delivers its regular seasonal bump in the mortality rate; at worst, the way we have slowly and painfully learnt to live with HIV-Aids.

Which takes us back to being gay in 1981, the year the New York Native newspaper published the first article about gay men being treated in intensive care units for a strange new illness. (The headline was: “Disease rumours largely unfounded.”) It was more than a year later that the term Aids (acquired immune deficiency syndrome) was proposed for the all too real disease.

Here’s a thought experiment. Imagine a world in which Covid-19 — which still has a long way to go before it catches up with Aids as a killer — has the same effect on social life as Aids had on sexual life. That would be quite a different world, and more visibly so (as changes in sexual behaviour largely take place behind closed doors).

Imagine a world in which we routinely wear masks on public transport and in offices; a world in which we greet one another with a wave, not a hug or a handshake; a world in which grandparents see their grandchildren only on FaceTime; a world in which to cough or sneeze in public is as shameful as to fart; a world in which we rarely eat in restaurants or fly; a world without theatres and cinemas (other than a few retro drive-ins); and a world in which football is played in silent, empty stadiums. (Will there be canned cheering, just as there used to be canned laughter in sitcoms?)

I’m not the first person to notice that there are some lessons to be learnt from the last really lethal pandemic caused by a virus, despite the important differences between HIV and Sars-CoV-2, and between Aids and Covid-19. One nurse has recalled the similar ways the authorities responded — at first with complacency and then by stigmatising victims (for “the Chinese virus”, read “the gay plague”). Last month, The New York Times published an article asking “Are facemasks the new condoms?” — destined to become “ubiquitous, sometimes fashionable [and] promoted with public service announcements”.

Yet the lesson of HIV-Aids is not quite that it “changed everything”. The really striking feature of the history of the Aids pandemic is that behaviour only partly changed after the recognition of a new and deadly disease spread by sex, needle-sharing and blood transfusions. An early American report noted “rapid, profound but . . . incomplete alterations in the behaviour of both homosexual/bisexual males and intravenous drug users”, as well as “considerable instability or recidivism”. By 1998, just 19% of American adults reported some change in their sexual conduct in response to the threat of Aids.

The advent of antiretroviral drugs that stop HIV carriers succumbing to Aids has somewhat diminished the fear factor. Even so, one might have expected a bit more fear to persist. A 2017 paper showed that fewer than half of at-risk men had used a condom last time they had sex. According to a recent British study, sustained campaigns of public and individual education are necessary to discourage gay men from having sex without condoms. In Africa, the “ABC” — abstain, be faithful and “condomise” — approach has had limited success.

Yes, there have been changes in sexual behaviour. According to the psychologist Jean Twenge, millennials have fewer sexual partners on average than earlier generations. Another American study concluded: “Promiscuity hit its modern peak for men born in the 1950s.” And let’s not forget the invaluable UK National Survey of Sexual Attitudes and Lifestyles, the most recent version of which revealed a marked decline in the frequency of sex in Britain.

Yet few if any of these changes can be attributed to HIV-Aids. The return of “No sex, please, we’re British” mainly affects married and cohabiting couples, and, according to the definitive analysis in the BMJ, is most likely due to “the introduction of the iPhone in 2007 and the global recession of 2008”.

Social mores change more than you think. In the face of a deadly disease, however, they also change less than you might expect.

Niall Ferguson is the Milbank Family senior fellow at the Hoover Institution, Stanford, and managing director of Greenmantle
