It was predictable. Cho Seung-Hui was a taciturn, moody loner. Four of his professors expressed concerns about the content of his work or classroom conduct. After complaints by two female students, the campus police and a college counsellor tried to have him committed to a mental institution. But a doctor didn't agree with the judge that he presented a danger to others. And guns are easy to buy in America (though banned on Virginia campuses). As a result 33 people are dead.
Journalists' efforts to explain the Virginia Tech massacre perfectly illustrate one of the central points of an idiosyncratically brilliant new book by Nassim Nicholas Taleb, The Black Swan: The Impact of the Highly Improbable (Penguin/Allen Lane). Having been completely caught out by some random event, we human beings are wonderfully good at retrospectively predicting it. In reality, however, Cho was what Taleb calls a "Black Swan".
Why a black swan? Taleb's starting point is what philosophers call the problem of induction. Suppose you have spent all your life in the northern hemisphere and have only ever seen white swans. You might very well conclude (inductively) that all swans are white. But take a trip to Australia, where swans are black, and your theory will collapse. A "Black Swan" is therefore anything that seems to us, on the basis of our limited experience, to be impossible.
Over 20 years of university teaching, I have seen my fair share of taciturn, moody young men. Many have had difficulties with girls. Some have needed counselling. A few have required psychiatric treatment. The risk that one of my depressive students might commit suicide is one I have often contemplated. But the risk that one might run amok and kill 32 people? Never.
Why, Taleb asks, do we tend to confuse improbability with impossibility? Partly, he suggests, it's because evolution did not favour complex probabilistic thinking. Honed by centuries of hunter-gathering, we are disposed to make snap decisions on the basis of minimal evidence and facile theories - presumably because those who glimpsed a lion and started running, on the crude assumption that all wild animals always eat humans, were more likely to survive than those who preferred to test this hypothesis experimentally. There are friendly lions, just as there are black swans, but better safe than sorry.
Our flawed way of thinking also reflects the development of Western philosophy, social science and history. The Platonic school of philosophy encouraged us to prefer simple theory to messy reality; it also inclines us to select only the data that fit our theories. Taleb especially abhors the tendency of economists and others to assume that everything conforms to what is sometimes called the normal distribution or "bell curve", associated with the German mathematician Carl Friedrich Gauss.
Sure, says Taleb, a chart of the heights of all college students would look like a bell, with most clustered around the average height and only a negligible minority taller than seven feet or shorter than four feet. But it's a fatal mistake to look for bell curves everywhere. The statistical distributions of earthquakes, financial crises, wars and book sales - to name just four examples - obey a quite different set of rules (sometimes known as fractal distribution or "power laws").
In each case, when you plot a chart, there is much less clustering around the average, and there are many more data points at the extremes. Compared with the standard bell curve, these curves have "fat tails" at each end: there are many more really big quakes, economic crashes, wars and bestsellers than the normal distribution would lead you to expect. Put differently, there are many giants (and also many midgets). I suspect a similar pattern would be observed if it were possible to plot all the violent incidents that have taken place at US universities in the past half-century. Clearly, a massacre of 32 is less likely than a single murder, but not by as wide a margin as a 60ft man is less likely than a 6ft one.
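The contrast can be made concrete in a few lines of Python. This is only an illustrative sketch with made-up parameters (not figures from Taleb's book): it compares the chance of an event ten times the typical size under a bell curve and under a simple power law.

```python
import math

def normal_tail(x, mean, sd):
    """Probability of exceeding x under a normal (bell-curve) distribution."""
    z = (x - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))

def pareto_tail(x, x_min, alpha):
    """Probability of exceeding x under a Pareto (power-law) distribution."""
    return (x_min / x) ** alpha if x >= x_min else 1.0

# Hypothetical units where the typical event has size 1.
# Under the bell curve, an event of size 10 sits nine standard
# deviations out and is astronomically rare; under the power law
# it is merely uncommon.
bell = normal_tail(10.0, mean=1.0, sd=1.0)
fat = pareto_tail(10.0, x_min=1.0, alpha=2.0)  # = (1/10)**2 = 0.01
```

Here `bell` comes out around 10^-19 while `fat` is one in a hundred: the "fat tail" makes the giant event billions of billions of times more likely.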
Yet it is Taleb's assault on traditional historiography that is most relevant here. Since Thucydides, it is true, historians have encouraged us to explain low-probability calamities (like wars) after the fact. Such story-telling helps us to make sense of a random disaster. It also enables us to apportion blame. Generations of historians have toiled in this way to explain the origins of great calamities like, say, the First World War, constructing elegant narrative chains of causes and effects, heaping opprobrium on this or that statesman.
There is something deeply suspect about this procedure, however. It results in what Taleb calls the "retrospective distortion". For these causal chains were quite invisible to contemporaries, to whom the outbreak of war came as a bolt from the blue. The point is that there were umpteen Balkans crises before 1914 that didn't lead to Armageddon. Like Cho Seung-Hui, the Sarajevo assassin Gavrilo Princip was a Black Swan - only vastly bigger.
The same flaw is obvious in the stories currently being told about the Virginia Tech massacre. If we can see the causes of Cho's rampage now, why was it not anticipated at the time? Negligence is not the only possibility. The reality is that for every Cho who runs amok, there are hundreds of thousands of depressive, misanthropic students who don't.
Taleb's central point, then, is that we are too much influenced by instinct, history, Plato and Gauss. We assume the entire world is "Mediocristan", whereas in reality large swathes of it are "Extremistan".
The trouble is that it is much harder to live with this insight than to live without it. As Taleb's critics in the financial world will tell you (and he himself admits), merely insuring yourself against fat-tail events does not constitute a profitable trading strategy. Knowing that world wars can happen roughly twice a century is like knowing that a student can run amok roughly once a decade: it doesn't allow you to predict which diplomatic/personality crisis will be the lethal one.
For practical purposes, it turns out, we humans prefer to work with predictions and forecasts, even when they are nearly always wrong. We prefer to regard financial markets as casinos (what Taleb calls "the ludic fallacy": the assumption that odds are always calculable), even when they clearly aren't. And we resist paying excessive insurance premiums to cover ourselves against very remote contingencies. Forcibly committing every disturbed student to a mental hospital might avert another Virginia Tech massacre. But the hospitals would be overflowing.
In any case, as President Bush has learned, you don't get rewarded for trying to stop bad things from happening, precisely because if you're successful they don't happen. On his watch, after all, there hasn't been another 9/11 (a classic Black Swan event). And Saddam Hussein will never invade Kuwait again. But is anybody out there grateful? Not even Bush himself can be certain that his strategy of pre-emption deserves the credit for non-events.
Perhaps the most provocative of all Taleb's many provocations is his hypothesis that, as a result of globalisation and the speed of electronic communications, the world is becoming more like Extremistan and less like Mediocristan.
Yes, the integration of international markets seems to reduce economic volatility. But by magnifying the effects of herd-like behaviour (another of our evolved traits), it also increases the tendency for winners to take all - the Harry Potter phenomenon - and for disasters, when they strike, to be comparably huge.
Just as there will be fewer but bigger bestsellers, Taleb argues, so there may also be "fewer but bigger crises" in the realms of finance and geopolitics. I have not quite made up my mind if the Virginia Tech massacre supports his hypothesis.
But it is surely significant that Cho was consciously mimicking the behaviour of the Columbine killers, while at the same time exceeding their toll of victims. Now, that suggests a really chilling possibility: of more and bigger Black Swans.
Not to mention metal detectors in the lecture halls.