A 12-Step Guide to Staying Sane During the Plague Year

 How Bruckner, Scott, “Doctor Who” — and tea — helped this columnist survive the pandemic.

Give Proust a chance — as an audiobook. Photographer: Thomas Samson/AFP via Getty Images

Niall Ferguson is the Milbank Family Senior Fellow at the Hoover Institution at Stanford University and a Bloomberg Opinion columnist. He was previously a professor of history at Harvard, New York University and Oxford. He is the founder and managing director of Greenmantle LLC, a New York-based advisory firm.

Maybe you stayed safe in the plague year. Most of us have. But did you stay sane? A growing body of research shows that the damage to our health caused by COVID-19 went far beyond the disease itself. In addition to the deaths from conditions that went untreated because people eschewed medical care they would normally have sought, a great many of us have suffered psychologically — some from fear of infection, some from protracted incarceration with their nearest and dearest, many from the enforced isolation that does not come naturally to our species. Survey data from the United States, China and other countries point to a pandemic of depression, anxiety and stress.

I’ve had the good fortune to avoid both physical and mental illness in 2020. As a repressed misanthrope — who for many years was forced by circumstances to be much more gregarious than I really am — I have positively relished nine months in one place with a social circle confined to my wife, my two youngest children, and a handful of local friends. (I cannot speak for the other inhabitants of my bubble.)

As the year nears its end — and with the plot twist of a new and more contagious U.K. variant of the SARS-CoV-2 virus, as if to reconcile the Europeans to Brexit — I feel duty bound to share some tips for maintaining mental health. In honor of the process formulated in 1935 by Bill Wilson and Robert Holbrook Smith, the founders of Alcoholics Anonymous, here are my twelve steps to staying sane (or at least getting no more insane) in a pandemic:   

Step One: Drink tea, not booze. I began 2020 with my first ever trip to Taiwan, where I was cured of making tea like a Brit, i.e., chucking a teabag, boiling water and some milk in a mug. Sitting cross-legged in the Shi Yang Shan Fang tea house, which perches on the side of Yangming Mountain to the north of Taipei, on a night of torrential rain, I experienced my first gong fu tea ceremony. A young man conducted the ceremony, which involves multiple pots and cups, all made of delicate, unglazed clay. “Are you a tea master?” I asked him, somewhat crassly. “No,” he replied serenely. “I am the servant of the tea.”

Ever since that evening, I have served tea this way three times a day, beginning with Taiwanese gaoshan (high mountain) tea in the morning, followed by Wazuka Yuki Oolong Cha at lunchtime, and concluding with Japanese sencha (green tea) in the afternoon—all ordered from the wonderful Sazen Tea. More than anything else I have done this year, the tea ceremony has kept me sane in the solitude of my study.

Step Two: Read Walter Scott (ideally with your mother). I had been thoroughly put off the novels of Scott as a schoolboy by adults who dismissed him as boring and stuffy. They lied. By some strange telepathic process, my mother and I — separated by nearly five thousand miles — decided to set aside prejudice and simultaneously begin reading “Waverley” (1814), the glorious, gripping tale of an ingenuous young Englishman who gets mixed up in the Jacobite Rising of 1745. As we progressed, at the rate of roughly one novel every three weeks, we found Scott as gifted a writer as Dickens, but funnier and shrewder. There are unexpected anticipations of Wilkie Collins and R.L. Stevenson in his darker characters — for example, the magnificent madwoman Meg Merrilies in “Guy Mannering” (1815), who recurs as Madge Wildfire in “The Heart of Midlothian” (1818), or the diabolical, dastardly Rashleigh Osbaldistone in “Rob Roy” (1817).

Reading Scott in tandem provided my mother and me with a desperately needed topic of conversation other than the pandemic. Our weekly calls became literary seminars rather than lamentation sessions. By this route of printed pages, each of us was able to revisit our native Scotland in our imaginations and to understand, for the first time, how much that country used to be Scottland — for it was Scott, more than anyone, who made its emergence from Afghan-like misery into Enlightenment dynamism both intelligible and irresistible to the Victorians.

Step Three: Have Proust read to you. On at least four previous occasions, I have tried and failed to get through the first volume of À la recherche du temps perdu. The solution was to listen to “Swann’s Way,” in the C. K. Scott Moncrieff translation, read exquisitely for Audible by John Rowe. If you have ever struggled with the ineffably sensitive Marcel, as I once did, then this is the way. For me, the breakthrough came with Swann’s all-consuming infatuation with the unsuitable but enthralling Odette and his descent into green-eyed jealousy.

Step Four: Listen to Bruckner. This was also the perfect year to immerse yourself in the work of a composer you had previously failed to appreciate. I chose the self-effacing Austrian genius Anton Bruckner, whose Symphony No. 4 in E Flat Major, “Romantic,” provided exhilaration and exaltation — both in short supply in the world at large. Other plague-year discoveries have included Mendelssohn’s “Lieder ohne Worte,” Schubert’s exquisite Piano Sonata No. 18 in G Major, D. 894, and, as I wanted to hear music from the time of the Black Death, the plangent Messe de Nostre Dame of Guillaume de Machaut.

Step Five: Practice a musical instrument. Since I took up playing the double bass at the age of 18, I have learned two important life lessons. First, ensemble playing is very good for the mind and the soul, though not necessarily for the liver. Second, being mediocre is fine — you really don’t need to strive for perfection in everything you do (just in one thing). The jazz band of which I have been the mediocre member since we played at Oxford back in the 1980s, A Night in Tunisia, has a tradition of performing together twice a year. The plague put a stop to that this year, and our experiments with online collaboration failed risibly. (You cannot jam on Zoom.) The solution was to try to practice in new ways — not easy to sustain through the long days of internal exile, but the payoff will come when the band strikes up again next year. I may rise above mediocrity.

Step Six: Watch “Doctor Who” with your children or grandchildren. I more or less gave up watching television at around the same time I took up bass-playing. There is one exception to this rule: “Doctor Who,” without a doubt the greatest television series of them all, which predates me by a year, having begun in 1963. The revival of “The Doctor” in 2005 was the single best thing the BBC has ever done. With my son Thomas, who turns nine this week, I’ve been catching up with 15 years of the series’ exceptional science fiction — which magically combines time travel, terrifying aliens and British irony — though we still cannot decide who was the best Doctor: David Tennant or Matt Smith? Or was it actually Tom Baker?

Step Seven: Step. Do not fail to go for a walk every day, regardless of the weather. I write these words after an hour in a fully fledged blizzard. A walk is infinitely preferable to any gym. If no one will come with you, take Proust.

Step Eight: Improve your curry making. If you haven’t been cooking this year, shame on you. I recommend applying some turmeric, cumin, red chile and coriander seeds to some of that leftover turkey.

Step Nine: Dress like an Oxford don, every weekday. Back in the spring, the beard, T-shirt and sweatpants combo was not conducive to the production of great thoughts. And yet I found it hard to take seriously the people who donned suits and ties to broadcast from their bedrooms. After months of slovenliness, I hit on the solution. I purchased a Fair Isle sleeveless sweater and dug out some maroon corduroy trousers, once part of the costume of an Oxford professor. This restored self-discipline and enabled me to finish writing a book. (I couldn’t quite bring myself to go full Tolkien by buying a pipe, but I was sorely tempted.)

Step Ten: Disable notifications on Twitter. It occurred to me with a flash of insight that I don’t in the least care what the people I don’t follow on Twitter think, otherwise I would follow them. “Would you let all these other people into your garden?” I asked my wife one day. “If not, why would you let them inside your head?” Goodbye, snark!

Step Eleven: Do not watch sports. Just don’t. To me, soccer and rugby without fans is about as exciting a spectacle as two dozen men playing blind man’s buff. When we watch sport on television, we are imagining ourselves in the crowd, which is the real source of the adrenaline surge — not the flight of the ball from foot to goal. Without the ebb and flow of singing, cheering and booing, there’s just no thrill.

Step Twelve: OK, drink booze, too. But only after 6 p.m., otherwise you’ll end up like Agnes in Douglas Stuart’s “Shuggie Bain” (without a doubt the best book published this year). Tea’s all very well during the day, but I couldn’t have retained my sanity after dark without the following liquids: Bent Nail IPA, a delicious beer brewed by Red Lodge Ales; the Veneto winemaker Inama’s smooth yet peppery Carmenere Più; and Laphroaig, my favorite peat-infused Scotch, which they began making the same year Scott published “Guy Mannering.”

As I pointed out eight months ago, “all the great pandemics have come in waves.” This one has managed three in the United States and two in Europe, and we’re still at least four or five months away from herd immunity. So, while you await your vaccination this holiday season, don’t go nuts. My twelfth step would have appalled the founders of Alcoholics Anonymous. But just as there are no atheists in a foxhole, there are precious few teetotalers in a pandemic.

Happy New Year!

My crystal ball missed Brexit but got Donald Trump

 Those who make predictions must keep a tally. So how did I do?

It has been nearly 4½ years since I began writing this column, which works out at roughly 240,000 words altogether. As these will be my last words in these pages, it’s time to look back and take stock. If part of your job is to be a pundit then, as the University of Pennsylvania political scientist Philip Tetlock argues in Superforecasting: The Art and Science of Prediction, you need to keep score.

As Tetlock had a dig at me in that book — which was published in 2015, before I began writing for The Sunday Times — this is also a good opportunity to settle a score.

Tetlock was, of course, quite right about most public intellectuals (a term that always makes me think of public conveniences). They seldom hold themselves to account. Nor do they get fired if their predictions are consistently wrong, as long as they are entertaining enough to engage readers.

Since I set up an advisory firm nine years ago, though, my approach has been different — out of necessity, as fund managers differ from newspaper editors in their attitude to predictions. Not only do they notice when you’re wrong, because one or more financial indicators make that clear; they also let you know about it (with grim relish, usually). If you’re wrong too often, it’s goodbye.

So at the beginning of each year we at Greenmantle make predictions about the year ahead, and at the end of the year we see — and tell our clients — how we did. Each December we also rate every predictive statement we have made in the previous 12 months as “true”, “false” or “not proven”. In recent years, we have also forced ourselves to attach probabilities to our predictions — not easy when so much lies in the realm of uncertainty rather than calculable risk. We have, in short, tried to be superforecasters. And with some success.

Now it’s time to apply the same retrospective scoring to this column. So as to meet my deadline, I’ve picked my first full year at The Sunday Times, which was the annus mirabilis — or horribilis, depending on your politics — beginning on November 1, 2015, the date of my first column.

Three minor themes are worth mentioning. I argued repeatedly that the twin problems of Islamic extremist networks and mass migration from the Muslim world were not likely to go away: “Think of Isis as the Facebook of Islamic extremism” (March 27, 2016). I also began warning, as early as May of that year, that the rise of Silicon Valley’s big tech companies was not an unmitigated boon: “What the state knows is just a fraction of what Facebook knows about you” (May 15). And I noted the dire implications for Labour of the antisemitism of Jeremy Corbyn and his circle (May 1).

But by far the biggest issues of my first year on this page — and subsequent years too — were Britain’s vote to leave the EU and the election of Donald Trump. How did I do?

On Brexit, I was wrong. From the outset, I was a remainer. “The idea that we can . . . separate ourselves from Europe is an illusion,” I wrote on February 21. “For the future of Europe without us would be one of escalating instability.” Impolitely, I called Brexiteers “Angloonies” and “happy morons”. When the remain side lost, I predicted a “stairway to hell”— or at least a recession (June 26). Wrong.

At the end of the year, on December 11, 2016, I made a confession. I had been motivated to back remain more because of “my personal friendship with [David] Cameron and George Osborne” than out of any deep allegiance to the EU. I regretted — and still regret — not urging Cameron to reject “the risible terms that the European leaders offered him back in February on EU migrants’ eligibility for benefits”. That was the moment he should have called their bluff by backing Brexit.

Yet the humiliation of Brexit gave me an advantage over American commentators on the 2016 presidential race. I had moments of doubt, admittedly. I compared Trump to unsuccessful Republican candidates Wendell Willkie (December 13, 2015) and Barry Goldwater (January 31, 2016). On April 3, 2016, I predicted the bursting of the Trump bubble in the Wisconsin primary. Ted Cruz won that, but it didn’t burst the bubble. Far more often, I went against the conventional wisdom that Trump was doomed to lose.

“Trump has the face that fits the ugly mood in America,” was my headline on November 1, 2015. “Trump has both the resources and the incentives to press on. In the current national mood of disaffection with professional politicians, he could seem an attractive alternative to Hillary Clinton . . . The point about Trump is that his appeal is overwhelmingly a matter of style over substance. It is not what he says that a great many white Americans like — it is the way that he says it.”

I was against Trump. I was a signatory of a “never Trump” letter. I repeatedly condemned his “open expressions of racial prejudice and xenophobia”, his isolationism (December 13, 2015) and his fishy bromance with Vladimir Putin (May 8 and October 16, 2016). I regretted that Mike Bloomberg chose not to run (October 23).

But I also saw clearly the strength of his appeal. “Trump is winning,” I wrote on February 28, 2016, “because no other candidate has a more convincing explanation of why so many Republican voters genuinely are worse off today than in 2000 . . . But no one can rule out Democratic defections to Trump when it comes to the crunch on November 8.” On March 6, I imagined Trump winning and running for an unconstitutional third term in 2024. “Trump can beat Hillary Clinton,” I wrote on May 8.

“Can Trump succeed where [Mitt] Romney failed?” I asked on July 21. “Yes . . . many young voters will fail to show up for Clinton. Meanwhile, the white lower class, especially the older cohorts, will turn out for Trump in droves, just as their English counterparts turned out for Brexit.”

The choice between Clinton and Trump was a choice between “snafu” and “fubar”, I wrote on September 18, “but wouldn’t you risk being fubar . . . if it was your only shot at avoiding four more years of snafu?”

“This rage against the global,” I wrote a week later, “is why Trump could win this election. It is why Brexit happened. It is why populists are gaining ground wherever free elections are held.”

I marked my first anniversary at this paper with a column that compared Trump to the Chicago Cubs, the outsiders who had just won the baseball World Series. “He can win,” I wrote, “if there is a differential in turnout between his supporters and [Clinton’s] in the battleground states comparable to the age and ethnicity-based differentials in the UK referendum” (November 6).

Now, dear reader, you are burning to know what I think will happen this November. Bad luck. You will have to seek my superforecast in another publication.

Niall Ferguson is the Milbank Family senior fellow at the Hoover Institution, Stanford, and managing director of Greenmantle

Coronavirus: we should have learnt from Sars, not swine flu

 If H1N1 had been worse, the elderly might not be in such danger today

The word “genocide” — meaning the murder of a tribe or people — was coined in 1944 by Raphael Lemkin, a Polish-Jewish refugee from Nazism, whose family was all but obliterated in the Holocaust. The word “senicide” — meaning the deliberate murder of the elderly — is less well known, though of older provenance. According to the Oxford English Dictionary, it was first used by the Victorian explorer Sir Henry Hamilton Johnston. “The ancient Sardi of Sardinia,” he wrote in 1889, “regarded it as a sacred . . . duty for the young to kill their old relations.”

Lemkin’s word caught on. Not only did the United Nations general assembly unanimously pass a resolution in 1946 condemning genocide; by 1948 it had also approved — again, nem. con. — a convention on the prevention and punishment of the crime of genocide.

Although America did not ratify that convention until 1985, use of the word grew exponentially from its first publication. (I hesitate to say that it went viral.) Enter “genocide” into Amazon’s search field and you will have more than 10,000 results to trawl through.

Not so “senicide”. There are just two books on that subject on Amazon’s site: The Customary Practice of Senicide. With Special Reference to India by Pyali Chatterjee, and Death Clock Ticking: Senicide, Ageism and Dementia Discrimination in Geriatric Medicine by Itu Taito. The latter has not yet been published. Oh, and there’s a perfectly ghastly song called Senicide by a Californian heavy metal band called Huntress.

There are a few older books that use the word, nearly all in connection with the alleged practices of ancient or obscure tribes (the Padaeans of India, the Votyaks of Russia, the early American Hopi, the Netsilik Inuit of Canada, South Africa’s San people and the Amazonian Bororos). But senicide is so rare a word that Microsoft Word’s spellcheck underlines it in red, itching to auto-correct it to “suicide”.

All that is about to change. If, as seems increasingly likely, a significant number of western countries are going to continue mismanaging the pandemic caused by the virus Sars-CoV-2 — the novel coronavirus that originated in Wuhan, China, in December — then a very large number of old people are going to die before their time.

The statistics are unequivocal. In China, where the epidemic seems for the moment to be under control, the case fatality rate for those under 50 was 0.2%. For those over 60 it was 3.6%, for the over-70s 8% and for the over-80s 14.8%. In Italy — now the country worst affected by Covid-19, the disease the virus causes — the fatality rate for the over-70s thus far has been 11.8%, for the over-80s 18.8% and for the over-90s 21.6%.

It is, in one respect, a blessing that Covid-19 seems to be “ageist”. Most pandemics are not so merciful towards children. In America, for example, the 1957-8 influenza pandemic killed the under-5s at an even higher rate than it killed the over-64s.

It is also true that there have never, in all of history, been so many old folk. Today more than a quarter of Japan’s population are aged 65 or older. In 1960, the share was just 5.6%. In the European Union, the share has doubled from 10% to 20%. The world as a whole has gone from 5% elderly to 9%.

And it is true, too, that doctors in an overwhelmed hospital with insufficient intensive care units are correct, from a utilitarian perspective, to give priority to the young over those nearing the end of their natural lives. I do not blame the Italian doctors who have been practising this form of triage.

Yet when this pandemic has run its course — when we have achieved “herd immunity” as a species and when vaccines and therapies have been devised — there will have been a lot more funerals for elderly Italians and, very probably, Americans and Britons than for Taiwanese or South Koreans.

And the reason for this discrepancy will not be bad luck. The reason will be that east Asian countries drew the right conclusions from the searing experiences of Sars in 2003, while most western countries drew the wrong conclusions from their relatively mild encounter with H1N1, commonly known as swine flu, in 2009.

That Covid-19 was both highly contagious (because it is easy to carry and transmit by asymptomatic individuals) and much more deadly than seasonal flu was already obvious as early as January 26, when I first wrote about the coming pandemic in this column. And yet numerous governments — including the American and the British ones — dithered for the better part of two months.

It was not only Donald Trump’s irresponsible nonchalance that did the damage. There were also failures by the very organisations that were supposed to prepare our countries for a threat such as this. In America there has been a scandalous insufficiency of testing kits, so that, as recently as last week, the country was still lagging behind Belarus and Russia in terms of tests per capita.

In the UK, policy was initially based on the notion that the country would be better off aiming for early herd immunity than trying to suppress the spread of the new disease — until epidemiologists such as my near namesake Neil Ferguson (whom we must all wish a swift recovery, as he developed Covid-19-like symptoms last week) pointed out the likely disastrous consequences.

Because of these blunders, America and the UK have moved far too slowly to adopt the combination of mass testing, enforced social distancing and contact tracing that has successfully contained the virus’s spread in east Asian countries. There is a reason the death toll in South Korea is just over 100, while in Italy it is almost 5,000.

How many people will die in the end? We do not know. In America, if Italian conditions are replicated in New York and California, we could see between half a million and a million deaths by the end of this year. I have seen estimates as high as 1.7 million, even 2.2 million. The other Ferguson’s worst-case scenario for Britain was 510,000 deaths. But the key point is that most of the victims will be old. And most of the deaths could have been avoided with better preparation and earlier action.

The 19th-century Russian historian Nikolai Karamzin defined senicide as “the right of children to murder parents overburdened by senium [old age] and illnesses, onerous to the family and useless to fellow citizens”. The explorers Knud Rasmussen and Gontran de Poncins reported that senicide was still practised by the Netsilik of King William Island as recently as the 1930s.

But senicide will never be tolerated in the 2020s, least of all in modern, developed democracies. Those whose sins of omission and commission lead to nationwide senicides will, like the perpetrators of genocides in the 20th century, be judged harshly, not only by history, but also by voters — and quite possibly by judges too.

Niall Ferguson is the Milbank Family senior fellow at the Hoover Institution, Stanford

The deadliest virus we face is complacency

 Our vulnerability to pandemics puts climate change fears in the shade

When I was 11 years old, I was scarred for life by the BBC. It was 1975 and the show was called Survivors. The title sequence begins with a masked Chinese scientist dropping a glass flask. It smashes. We then see him boarding a plane to Moscow, where he starts to feel unwell. Suddenly, a naked arm falls lifeless across the screen. We see passport stamps for Berlin, Singapore, New York . . . and finally London. And then a ghastly red stain spreads across the screen.

The genius of the series was that it was set in middle-class England — a serene Herefordshire of tennis courts, boarding schools and stay-at-home wives. Within 10 minutes of episode one, however, that England was spiralling back to the 14th century. For the Chinese scientist’s flask contained a bacterium even more deadly than Yersinia pestis, which is now generally recognised to have caused the Black Death.

The Black Death — mainly bubonic plague but also the even more lethal pneumonic variant — killed between 75 million and 200 million people as it spread eastwards across Eurasia in the 1340s. The disease was transmitted by flea bites; the fleas travelled by rodent. Up to 60% of the population of Europe perished. Survivors imagined an even worse plague, originating, like the Black Death, in China. The BBC scriptwriters did their homework: the dying had all the symptoms of plague — swelling under the armpits, fever, vomiting of blood. Victims, as in the 14th century, died within a week of being infected. Rats had a starring role in the London scenes.

I have long believed that, even with all the subsequent advances of medicine, we are far more vulnerable to a similar pandemic than to, say, climate change. Bubonic plague was a recurrent killer in Europe until the 18th century and devastated China and India once again in the 1850s and 1890s. In 1918-19, the Spanish influenza pandemic killed between 20 million and 50 million people worldwide, roughly 1%-3% of the world’s population. Even in a normal year, respiratory diseases from influenza kill as many as 650,000 people globally.

So you won’t be surprised to hear that I have been obsessively tracking the progress of the Wuhan coronavirus ever since the Chinese authorities belatedly admitted that it can be passed from human to human.

The coronavirus is much scarier than Ebola, which has produced outbreaks and epidemics in some African countries but has not produced an international pandemic because transmission via bodily fluid is difficult, its symptoms are too debilitating and it quickly kills most hosts. Viruses such as the one originating in Wuhan are highly infectious because they are airborne. This variant has the especially dangerous quality that symptoms do not manifest themselves until up to two weeks after an individual becomes infected — and contagious.

I have seen a few rash commentators downplaying the danger. But it is much too early to conclude, as Marc Siegel in the Los Angeles Times does, that the coronavirus “does not currently pose a threat [outside China] and may well never do so”. It is surely a mistake to worry, as did Farhad Manjoo in The New York Times, less about the virus than about “the amped-up, ill-considered way our frightened world might respond to it”. As for the complaint of CNN’s Brandon Tensley that the Trump administration’s coronavirus taskforce was insufficiently “diverse” — namely, it has too many white men — heaven preserve us from woke public health policy.

We don’t know enough yet to say how bad this will be. Among the things we don’t know for sure is the virus’s reproduction number (R0) — the number of infections produced by each host — and its mortality rate, or the number of deaths per 100 cases. Early estimates by the World Health Organisation suggest an R0 of between 1.4 and 2.5 — lower than the measles (12-18), but higher than Sars (0.5). According to Johns Hopkins University in Maryland, by Saturday there were 12,024 confirmed cases and 259 deaths, for a mortality rate of 2.2%. But these numbers are likely to be underestimates.

In the initial outbreak, which began in late December, 27 of 41 infected individuals had direct exposure to the Wuhan food market where (incredibly, given the known risks) live bats were being sold for their meat. Since then, in the space of roughly a month, the disease has reached every province of the People’s Republic. This is far more rapid than the spread of Sars in 2002-3.

One explanation is that the volume of air travel in China has ballooned since Sars. Its 100 busiest airports last year handled 1.2 billion passengers, up from 170 million. Wuhan’s Tianhe airport was almost as busy last year as Hong Kong’s was in 2002. Disastrously, the outbreak came not long before the Chinese lunar new year holiday — the peak travel season — and the regional and/or national authorities were slow to acknowledge how contagious the virus was.

At the time of writing, a total of 164 cases have been confirmed in 24 countries other than China, including seven in America, four in Canada and two in the UK. In other words, we are now dealing with an epidemic in the world’s most populous country, which has a significant chance of becoming a global pandemic.

But how big a chance? How big a pandemic? And how lethal? The bad news, as Joseph Norman, Yaneer Bar-Yam and Nassim Nicholas Taleb argue in a new paper for the New England Complex Systems Institute, is that the answers lie in the realm of “asymmetric uncertainty”, because pandemics have so-called “fat-tailed” (as opposed to normal or “bell-curve”) distributions, especially with global connectivity at an all-time high.

Researchers define the severity of pandemics using “standardised mortality units” (SMUs), where one SMU equals a mortality rate of 0.01% or 770,000 deaths worldwide. A “moderate” global pandemic is defined as causing less than 10 SMU; a “severe” pandemic is above 10 SMU. Yet the average excess mortality of a moderate pandemic is 2.5 SMU, compared with 58 SMU for a severe pandemic. In other words, the mortality rate in a severe pandemic would be about 20 times larger, for a death toll of 44.7 million.

“Standard individual-scale policy approaches such as isolation, contact tracing and monitoring are rapidly . . . overwhelmed in the face of mass infection,” Norman, Bar-Yam and Taleb conclude, “and thus . . . cannot be relied upon to stop a pandemic.” Decision-makers must act drastically and swiftly, avoiding “the fallacy that to have an appropriate respect for uncertainty in the face of possible irreversible catastrophe amounts to ‘paranoia’.”

Thanks to the BBC, I have been paranoid about pandemics for more than 40 years. Nevertheless, the challenge is still to resist that strange fatalism that leads most of us not to cancel our travel plans and not to wear uncomfortable masks, even when a dangerous virus is spreading exponentially. Time to watch Survivors again. At home.

Niall Ferguson is the Milbank Family senior fellow at the Hoover Institution, Stanford
