The deadliest virus we face is complacency

 Our vulnerability to pandemics puts climate change fears in the shade

When I was 11 years old, I was scarred for life by the BBC. It was 1975 and the show was called Survivors. The title sequence begins with a masked Chinese scientist dropping a glass flask. It smashes. We then see him boarding a plane to Moscow, where he starts to feel unwell. Suddenly, a naked arm falls lifeless across the screen. We see passport stamps for Berlin, Singapore, New York . . . and finally London. And then a ghastly red stain spreads across the screen.

The genius of the series was that it was set in middle-class England — a serene Herefordshire of tennis courts, boarding schools and stay-at-home wives. Within 10 minutes of episode one, however, that England was spiralling back to the 14th century. For the Chinese scientist’s flask contained a bacterium even more deadly than Yersinia pestis, which is now generally recognised to have caused the Black Death.

The Black Death — mainly bubonic plague but also the even more lethal pneumonic variant — killed between 75 million and 200 million people as it spread westwards across Eurasia in the 1340s. The disease was transmitted by flea bites; the fleas travelled by rodent. Up to 60% of the population of Europe perished. Survivors imagined an even worse plague, originating, like the Black Death, in China. The BBC scriptwriters did their homework: the dying had all the symptoms of plague — swelling under the armpits, fever, vomiting of blood. Victims, as in the 14th century, died within a week of being infected. Rats had a starring role in the London scenes.

I have long believed that, even with all the subsequent advances of medicine, we are far more vulnerable to a similar pandemic than to, say, climate change. Bubonic plague was a recurrent killer in Europe until the 18th century and devastated China and India once again in the 1850s and 1890s. In 1918-19, the Spanish influenza pandemic killed between 20 million and 50 million people worldwide, roughly 1%-3% of the world’s population. Even in a normal year, respiratory diseases from influenza kill as many as 650,000 people globally.

So you won’t be surprised to hear that I have been obsessively tracking the progress of the Wuhan coronavirus ever since the Chinese authorities belatedly admitted that it can be passed from human to human.

The coronavirus is much scarier than ebola, which has produced outbreaks and epidemics in some African countries but has not produced an international pandemic because transmission via bodily fluid is difficult, its symptoms are too debilitating and it quickly kills most hosts. Viruses such as the one originating in Wuhan are highly infectious because they are airborne. This variant has the especially dangerous quality that symptoms do not manifest themselves until up to two weeks after an individual becomes infected — and contagious.

I have seen a few rash commentators downplaying the danger. But it is much too early to conclude, as Marc Siegel does in the Los Angeles Times, that the coronavirus “does not currently pose a threat [outside China] and may well never do so”. It is surely a mistake to worry, as did Farhad Manjoo in The New York Times, less about the virus than about “the amped-up, ill-considered way our frightened world might respond to it”. As for the complaint of CNN’s Brandon Tensley that the Trump administration’s coronavirus taskforce was insufficiently “diverse” — namely, that it has too many white men — heaven preserve us from woke public health policy.

We don’t know enough yet to say how bad this will be. Among the things we don’t know for sure are the virus’s reproduction number (R0) — the number of infections produced by each host — and its mortality rate, the number of deaths per 100 cases. Early estimates by the World Health Organisation suggest an R0 of between 1.4 and 2.5 — lower than measles (12-18), but higher than Sars (0.5). According to Johns Hopkins University in Maryland, by Saturday there were 12,024 confirmed cases and 259 deaths, for a mortality rate of 2.2%. But these numbers are likely to be underestimates.
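The 2.2% figure is simple arithmetic on the Johns Hopkins tallies. A minimal sketch (purely illustrative; the variable names are mine, not from any official dataset) of that naive calculation, and of why it should be treated cautiously:

```python
# Naive case-fatality ratio (CFR) from the figures quoted above.
# Caveat: dividing cumulative deaths by cumulative confirmed cases
# ignores the lag between infection and death, and confirmed cases
# undercount true infections, so this is a rough gauge only.
confirmed_cases = 12_024  # Johns Hopkins tally, Saturday
deaths = 259

cfr = deaths / confirmed_cases * 100
print(f"Naive CFR: {cfr:.1f}%")  # prints: Naive CFR: 2.2%
```

The two caveats in the comments pull in opposite directions, which is one reason early mortality estimates for a new pathogen swing so widely.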

In the initial outbreak, which began in late December, 27 of 41 infected individuals had direct exposure to the Wuhan food market where (incredibly, given the known risks) live bats were being sold for their meat. Since then, in the space of roughly a month, the disease has reached every province of the People’s Republic. This is far more rapid than the spread of Sars in 2002-3.

One explanation is that the volume of air travel in China has ballooned since Sars. Its 100 busiest airports last year handled 1.2 billion passengers, up from 170 million. Wuhan’s Tianhe airport was almost as busy last year as Hong Kong’s was in 2002. Disastrously, the outbreak came not long before the Chinese lunar new year holiday — the peak travel season — and the regional and/or national authorities were slow to acknowledge how contagious the virus was.

At the time of writing, a total of 164 cases have been confirmed in 24 countries other than China, including seven in America, four in Canada and two in the UK. In other words, we are now dealing with an epidemic in the world’s most populous country, which has a significant chance of becoming a global pandemic.

But how big a chance? How big a pandemic? And how lethal? The bad news, as Joseph Norman, Yaneer Bar-Yam and Nassim Nicholas Taleb argue in a new paper for the New England Complex Systems Institute, is that the answers lie in the realm of “asymmetric uncertainty”, because pandemics have so-called “fat-tailed” (as opposed to normal or “bell-curve”) distributions, especially with global connectivity at an all-time high.

Researchers define the severity of pandemics using “standardised mortality units” (SMUs), where one SMU equals a mortality rate of 0.01%, or 770,000 deaths worldwide. A “moderate” global pandemic is defined as causing less than 10 SMU; a “severe” pandemic is above 10 SMU. Yet the average excess mortality of a moderate pandemic is 2.5 SMU, compared with 58 SMU for a severe pandemic. In other words, the death toll in a severe pandemic would be roughly 23 times larger, at about 44.7 million.
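Those figures follow directly from the SMU definition. A short illustrative check (the constant and variable names are my own labelling of the quantities above):

```python
# Severity in "standardised mortality units" (SMU), as defined above:
# 1 SMU = a mortality rate of 0.01% of world population,
# i.e. roughly 770,000 deaths.
SMU_DEATHS = 770_000

moderate_smu = 2.5  # average excess mortality, moderate pandemic
severe_smu = 58     # average excess mortality, severe pandemic

ratio = severe_smu / moderate_smu        # how much worse "severe" is
severe_deaths = severe_smu * SMU_DEATHS  # implied global death toll

print(round(ratio, 1), severe_deaths)  # prints: 23.2 44660000
```

That 44.66 million is the "44.7 million" quoted: the leap from a moderate to a severe pandemic is not incremental but more than an order of magnitude, which is what a fat-tailed distribution implies.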

“Standard individual-scale policy approaches such as isolation, contact tracing and monitoring are rapidly . . . overwhelmed in the face of mass infection,” Norman, Bar-Yam and Taleb conclude, “and thus . . . cannot be relied upon to stop a pandemic.” Decision-makers must act drastically and swiftly, avoiding “the fallacy that to have an appropriate respect for uncertainty in the face of possible irreversible catastrophe amounts to ‘paranoia’.”

Thanks to the BBC, I have been paranoid about pandemics for more than 40 years. Nevertheless, the challenge is still to resist that strange fatalism that leads most of us not to cancel our travel plans and not to wear uncomfortable masks, even when a dangerous virus is spreading exponentially. Time to watch Survivors again. At home.

Niall Ferguson is the Milbank Family senior fellow at the Hoover Institution, Stanford

Davos Man is cooling on Stockholm Girl Greta Thunberg

 Delegates publicly praise the gloomy activist, but privately prefer dirty Don

Do you smoke cigarettes, despite knowing the risk that the habit will give you cancer? Do you drive after drinking alcohol, despite being aware that it is both dangerous and unlawful? Do you give speeches about climate change at international conferences, having flown there by private jet? Do you ever sit in a big black car in a traffic jam, when you could quite easily have walked, despite knowing that this, too, is adding yet more carbon dioxide to the atmosphere?

The term “cognitive dissonance” was coined by the American social psychologist Leon Festinger. In his seminal 1957 book on the subject, however, Festinger argued that “in the presence of an inconsistency there is psychological discomfort” and that therefore “the existence of [cognitive] dissonance . . . will motivate the [affected] person to try to reduce the dissonance and achieve consonance”. Moreover, “when dissonance is present, in addition to trying to reduce it, the person will actively avoid situations and information which would likely increase the dissonance”.

My own observations of the human species strongly suggest otherwise. On the contrary, I see all around me — as well as throughout history — countless people not merely comfortable with cognitive dissonance but positively flocking towards situations that increase it.

Take the World Economic Forum (WEF), the gathering of billionaires, millionaires, world leaders, do-gooders, busybodies and journalists that takes place each January in the Swiss resort of Davos. The overwhelming majority of people attending this year’s conference would, I have no doubt, affirm their commitment to reducing carbon dioxide emissions to avert catastrophic climate change, even while on board their Gulfstreams and in their Range Rovers.

I doubt if a single chief executive present at the WEF last week would dare publicly to challenge the view that a modern corporation should rigorously measure and regulate its behaviour in terms of its environmental and social impact, as well as its quality of governance (ESG, for short). As the US Business Roundtable declared last August, firms must now be run not only for the benefit of their shareholders but also for all their “stakeholders”: customers, employees, suppliers and communities. Milton Friedman is dead. Long live Klaus Schwab — founder of the WEF — who pioneered this notion of stakeholder capitalism.

“ESG-omania” (or “ESG-apism”) meant Davos 2020 was an orgy of virtue-signalling on climate change and diversity. To walk down Davos Promenade, the main drag, was to run a gauntlet of uplifting corporate slogans: “Sustainable solutions for Earth, for life”; “A cohesive and sustainable world starts with data”; “Let’s bring sea level and C-level together”.

Each year the WEF’s global risks report tells us what the business elite is most worried about. Ten years ago, the top five risks were “Asset price collapse”, “China economic slowdown”, “Chronic disease”, “Fiscal crises” and “Global governance gaps”. This year? “Extreme weather”, “Climate action failure”, “Natural disasters”, “Biodiversity loss” and “Human-made environmental disasters”.

In this green new world, Davos Man must now prostrate himself before Stockholm Girl: 17-year-old Greta Thunberg, who delivered her latest tirade on Tuesday morning. “We don’t need a ‘low-carbon economy,’ ” she declared. “We don’t need to ‘lower emissions’. Our emissions have to stop. Any plan or policy of yours that doesn’t include radical emission cuts at the source, starting today, is completely insufficient.” She demanded that all participants “immediately halt all investments in fossil fuel exploration and extraction”.

The only public objection came from the US Treasury secretary, Steve Mnuchin. “Who is she — the chief economist?” he asked. “After she goes and studies economics in college she can come back and explain that to us.” Such blasphemy!

Mnuchin’s boss was also present. Four years ago the very notion of Donald Trump was inconceivable at Davos. Three years ago people were stunned by his election. Two years ago they sniggered at him. Now Trump is treated with more respect — after all, he’s still president, impeachment will not lead to his removal and the Davos consensus is that he’ll get a second term. But the applause for the president’s speech was no more than polite and every European participant complained that it was aimed too much at the American electorate. (That made me laugh. Have they no clue what Trump’s campaign speeches are like?)

This is where you might think — if you had read Professor Festinger — that the cognitive dissonance would become unbearable. For privately, after a glass or two of wine, nine out of 10 business people I spoke to admitted that they thought Greta’s speech impossible and Trump’s not altogether bad.

The fact is that the American economy has been doing rather well under Trump — better, certainly, than its European counterpart. Everyone in business knows this. The latest US growth forecasts may point to a slowdown (from 2.3% last year to 2% this year, according to the International Monetary Fund), but that still beats the eurozone (from 1.2% to 1.3%) and Germany (from 0.5% to 1.1%).

As Trump said, in an uncharacteristically restrained speech, American consumer confidence is buoyant, the unemployment rate is the lowest in nearly half a century, taxes on business are down, as is regulation, and the stock market is at record highs. Since his election, the US economy has added nearly seven million jobs. Housebuilding has just hit a 13-year high. Most strikingly, earnings growth has been especially strong for less skilled, lower-paid and African-American workers.

For middle America, Trump’s populist policy mix — immigration restriction, tariffs, easy money and deficit finance — is delivering. In quiet corners of the Davos congress centre you could hear Europeans wishing they could have at least a piece of this American action — and complaining that Greta’s demand for “zero carbon now” was a recipe for zero growth.

Cognitive dissonance is often like this: you say one thing in public and another in private. It was once the basis of life in communist systems all over the world. It turns out to be something capitalists can do just as easily, with very little of the discomfort predicted by social psychology.

But be warned. It is not always the case that private thoughts are right and public ones wrong. If Davos Man has come around to Trump — enough to expect, if not quite to hope, for his re-election — it is no guarantee that he will win on November 3. If January 2016 is anything to go by, you should probably bet against the Davos consensus and have a flutter on Bernie Sanders.

In the same way, if it’s climate change the WEF-ers are most worried about, you should probably brace yourself for a coronavirus pandemic. Talking of cognitive dissonance, what the hell were we all doing at a massive global conference last week? Fact: at least three of the WEF attendees were from — you guessed it — Wuhan.


Science fiction has become dystopian fact

 Orwell and Huxley were not the first to fear an age of mass surveillance

So which dystopia are we living in? Most educated people have read George Orwell’s Nineteen Eighty-Four and Aldous Huxley’s Brave New World. So influential have these books been that we are inclined to view all disconcerting new phenomena as either “Orwellian” or “Huxleyan”. If you suspect we shall lose our freedom to a brutally repressive state, grinding its boot into our faces, you think of George. If you think we shall lose it to a hedonistic consumer culture, complete with test-tube designer babies, you quote Aldous.

However, a superior work of science fiction to both is the earlier masterpiece We, by the Russian satirist Yevgeny Zamyatin. Written in 1920-21, in the early, turbulent years of Bolshevik rule, We is astoundingly prescient. In the “One State”, individual humans are mere “ciphers” clad in standardised “unifs”, with numbers instead of names. All apartments are made entirely of glass and curtains can be drawn only when one is having state-licensed sex.

The secret police, the Bureau of Guardians, are ubiquitous. Unlike in Orwell’s Soviet Britain, where there are ways of dodging the telescreens, surveillance in the One State is incessant and inescapable. Unlike in Huxley’s eugenics-based utopia, pleasure is mandatory and joyless.

The central character of We, D-503, is a mathematician and engineer working on the construction of a spaceship, the Integral, but tortured by the suspicion that not all human life can be reduced to mathematical formulae. D-503’s life begins to unravel when he is seduced by a femme fatale, I-330, who introduces him to the forbidden pleasures of alcohol, tobacco and unscheduled sex.

Confronted by a rebellion led by I-330 — which threatens to break down the Green Wall between the One State and a hitherto hidden natural world — the all-powerful Benefactor orders mass lobotomisation of all ciphers. The only way to preserve universal happiness, he argues, is to abolish the imagination.

“What have people — from the very cradle — prayed for, dreamt about, and agonised over?” the Benefactor asks D-503. “They have wanted someone, anyone, to tell them once and for all what happiness is — and then to attach them to this happiness with a chain.”

Orwell frankly acknowledged his debt to Zamyatin; Huxley implausibly denied having read him. At the very least, Zamyatin deserves equal billing with them as one of the masters of dystopian science fiction, not least because he anticipated the nightmare panopticon Stalin would build in the ruins of the Russian empire. (By the time Orwell was writing, the nature of the totalitarian beast was all too apparent.) Jailed twice for his dissident views, Zamyatin was permitted to go into exile in 1931. He was lucky.

I have spent much of my career trying to imagine possible futures by applying history to the present. This year, however, I have been experimenting with an alternative approach, which is to apply science fiction. Sci-fi was a genre I loved as a boy but more or less gave up when I went to university, in the mistaken belief that it was insufficiently serious. In truth, there are few more illuminating literary forms.

From HG Wells to Margaret Atwood, hundreds of great minds have looked into their crystal balls, imagining the possible consequences of vast catastrophes and new technology. Studying the past helps us see ways the world may repeat itself, but we need science fiction to envision what will be novel about the future.

Zamyatin, Huxley and Orwell all essentially agreed that the power of the state would inexorably grow. The only question, as Huxley said to Orwell in a letter he wrote after reading Nineteen Eighty-Four in 1949, was how brutally coercive the state of the future would be.

“The philosophy of the ruling minority in Nineteen Eighty-Four is a sadism which has been carried to its logical conclusion,” wrote Huxley (who, by the way, had taught Orwell French at Eton many years earlier). “Whether in actual fact the policy of the boot-on-the-face can go on indefinitely seems doubtful. My own belief is that the ruling oligarchy will find less arduous and wasteful ways of governing and of satisfying its lust for power . . .

“Within the next generation I believe that the world’s rulers will discover that infant conditioning and narco-hypnosis are more efficient, as instruments of government, than clubs and prisons, and that the lust for power can be just as completely satisfied by suggesting people into loving their servitude as by flogging and kicking them into obedience.”

As I reflect on the world in 2019, I am struck by the wisdom of those words. In Xi Jinping’s China, we see Totalitarianism 2.0. The boot on the face remains a possibility, of course, but it is needed less and less as the system of social credit expands, aggregating and analysing all the digital data that Chinese citizens generate.

“The political and legal system of the future is inseparable from the internet, inseparable from big data,” Alibaba’s Jack Ma told a Communist Party commission overseeing law enforcement in 2017. In future, he said, “bad guys won’t even be able to walk into the square”. Example: some classrooms in China are now equipped with artificial-intelligence cameras and brain-wave trackers to monitor pupils’ concentration levels.

The sole consolation, if it’s human freedom you love, is that democratic states seem less capable of this kind of thing — though I suspect it’s more a result of incompetence than the separation of powers, the rule of law or the spirit of liberty. True, we need to be worried about the private-sector panopticons under construction at Google and Facebook. (If you doubt that the Silicon Valley giants have totalitarian tendencies, just take a look at Google’s leaked presentation The Good Censor.)

But technology in the service of making people money seems ultimately less dangerous than technology in the service of making citizens “happy”. The gaiety of the planet has been much enhanced in recent weeks by the travails of WeWork, a wildly overhyped tech company that rents out shared office space. Supposedly worth $47bn (£38bn) just a few weeks ago, WeWork has postponed its initial public offering. Last week, Larry Ellison, a founder of the tech giant Oracle, called it “almost worthless”.

The long-haired Israeli co-founder of the company, Adam Neumann, once declared that WeWork’s “mission” was “to elevate the world’s consciousness”. Another of his sayings is that “the energy of we [is] greater than any one of us, but inside each of us”.

Ah, yes, the energy of we. While I can just about imagine Zamyatin’s Benefactor saying this — or Huxley’s Mustapha Mond, for that matter — Neumann is ultimately more of a Douglas Adams character. We may well be destined for dystopia, but as long as we’re not all lobotomised, there’s a fighting chance that the future will be more Hitchhiker’s Guide to the Galaxy than hell on earth.


Adapt and we’ll defy Greta Thunberg’s expectations

 The teenager’s call for panic could do more harm than good

In 15th-century Peru, we learnt last week, children were sacrificed to propitiate the Chimu gods in an attempt to end natural disasters caused by the climatic phenomenon we now call, appropriately enough, El Niño. In our time the roles have been reversed. Now children warning of an impending climate catastrophe are the ones who have to be propitiated. Now it is they who demand sacrifices.

The arrival of Greta Thunberg in New York on Wednesday was one of many recent events that illustrate how rapidly modern environmentalism is degenerating into a millenarian cult.

Greta, 16, is in New York at the invitation of the United Nations, having already established herself as a public figure in Europe by leading mass truancies to protest against climate change (“Fridays for Future”). Rather than flying, she sailed across the Atlantic in an “emissions-free yacht” to spare the Earth’s atmosphere the exhaust from a plane that was flying to New York anyway, with or without her.

“Just before 3pm,” reported The New York Times, “a shout went up from those waiting in the intermittent light rain to greet her . . . most of them young activists. The boat’s black sails had come into sight just blocks from Wall Street, the heart of the global financial system whose investments in fossil fuels are one of the main targets of climate protesters.”

Amid the drizzle, the bankers cowered before the wrath of Greta. From the headquarters of the once-mighty Goldman Sachs came the feeble tweet: “We’re committed to helping win the race and proud to welcome @GretaThunberg to New York.” They’ll be sacrificing the oil company accounts on Tuesday.

“Sea levels are rising and so are we!” the young activists chanted, according to the priceless Times report. Once safely ashore in Manhattan, Greta lost no time in urging Donald Trump “just to listen to the science, [as] he obviously doesn’t do that”.

Science. Or perhaps science fiction. There is at first something unnervingly reminiscent of John Wyndham’s Midwich Cuckoos about Greta. The pigtails. The unsmiling stare. But then you learn that she has struggled with mental health conditions, including high-functioning autism and obsessive-compulsive disorder. This makes it hard to criticise her.

Yet what does it tell us about our world that Greta is about to add the UN general assembly to the list of august bodies she has addressed in the past year, after the Pope, the World Economic Forum and the European parliament? “I want you to panic,” she said at Davos in January. “I want you to feel the fear I feel every day.” That is not the voice of science. It is the voice of a millenarian cult leader.

The end of the world is not nigh, however.

Now, I am not about to deny that climate change is happening or that global warming is going to have adverse effects in the foreseeable future. Not even Bjorn Lomborg, the sceptical Danish economist, says that. The point, as he argued in a recent, brilliant presentation at the Hoover Institution, is that — as in the past — we humans are capable of adapting to climate change in ways that can significantly mitigate its adverse effects.

It would be foolish to do nothing to prepare for a warmer planet. But it would be more foolish to pretend that we are doing something that will significantly reduce carbon dioxide emissions when we are not.

Greta’s carbon-neutral Atlantic crossing is a case in point. As yachts require crews, it is almost certain that more people will end up flying across the Atlantic as a result of her stunt than if she had caught a scheduled flight. The Paris climate accord is a scaled-up version of this. Even if adhered to, it will scarcely increase the share of global energy that comes from renewable sources. The effect on average temperature will be negligible: just 1% of what would be needed to limit the rise in global temperature by 2100 to 1.5C.

It would be even more foolish to take, on the basis of apocalyptic visions, extreme precautions that end up costing more than inaction would. Subsidies to renewable energy have a cost. Cutting CO2 emissions has a cost. Those costs in terms of forgone growth could exceed the costs of climate damage if we overreach in the way that, for example, Alexandria Ocasio-Cortez’s Green New Deal would. The key point, as Lomborg says, is that vastly more people die as a consequence of poverty each year than die as a consequence of global warming. A CO2 emissions target is not the optimal target if meeting it would trap millions in poverty, not to mention ignorance and ill-health.

Back in the 1400s, people in Peru believed that sacrificing their children would reduce rainfall. Not only did that not work; despite their grim, murderous rites, they were soon to be hit by a far worse natural disaster than rain, namely the various lethal pandemics that swept the Americas after the arrival of Europeans. We know climate change can happen, because it followed hard on the heels of this “great dying”: the collapsing population in the New World reduced carbon dioxide levels as large areas of land returned to the wild, contributing to the so-called Little Ice Age.

I have said more than once in recent years that our era has more in common with the 16th and 17th centuries than with any intervening period — and not just because of the splendidly Stuart-style constitutional crisis currently gripping the British Isles. It is the early-modern world all over again, not least because the effects of the internet on popular belief so closely resemble the effects of the printing press.

The challenge of millenarianism — as Alan Bennett, Peter Cook, Jonathan Miller and Dudley Moore pointed out in my favourite sketch from Beyond the Fringe — is what to do when the end of the world fails to happen.

Greta is right about one thing. The chances are virtually nil that the governments of the world will do as she asks. While the West virtue-signals, China, India, Brazil and others will continue to attach more importance to growth than to curbing emissions. The planet will grow warmer, just as it grew colder in the 1600s. And we shall adapt, taking advantage of the technological innovations that will gradually improve how we generate and store electrical power and ward off flood waters.

It is 2059. To the embarrassment (but, I hope, relief) of Greta Thunberg, now 56, her great expectations of the end of the world have not been fulfilled. Jair Bolsonaro didn’t torch the Amazon. Trump didn’t incinerate the planet. You should come back to New York to celebrate our survival, Greta.

But, this time, fly.

