Michael Gove's new national curriculum is out, and already the big guns of Oxbridge are blasting the changes it proposes to the way English kids are taught history.
From Cambridge no less a personage than Richard Evans, the Regius Professor of History, condemned Gove's attempt to restore "rote learning of the patriotic stocking-fillers so beloved of traditionalists". According to Evans, the new curriculum was "a Little England version of our national past, linked to an isolationist view of our national future". It constituted "a mindless regression to the patriotic myths of the Edwardian era".
From Oxford came the echo. David Priestland said it was a "depressingly narrow … resolutely insular … politicised and philistine" document. "We are … firmly back in the land of the Edwardian bestseller Our Island Story."
The pomposity of these attacks is in inverse proportion to their accuracy. Indeed, if you want a perfect illustration of how depressingly narrow, resolutely insular and politicised Oxbridge historians can be, read these two. You have to wonder when, if ever, these learned professors last set foot inside a school classroom, or last had a conversation with a history teacher or a pupil about the current key stages 2 and 3.
Evans's enthusiastic allusion to "the existing breadth and ambition of coverage, critical method and historical debate" suggests an almost wilful ignorance of – or indifference to – the parlous state of historical knowledge among young Britons.
If you want to understand what's really wrong with history in English schools, read schoolteacher Matthew Hunter's excellent essay in the latest issue of Standpoint. As Hunter rightly says, it's not just the defective content of the old national curriculum that is the problem. It's the way history has been taught in British schools ever since the advent of the Schools History Project in the 1970s and the rejection of historical knowledge in favour of "source analysis" and "child-centred" learning ("Imagine you are a Roman centurion …").
Only someone living in a dreaming Oxonian spire could be unaware of how badly this has turned out, despite the best efforts of thousands of hard-working teachers. I know because I have watched three of my children go through the English system, because I have regularly visited schools and talked to history teachers, and because (unlike Evans and Priestland, authors of rather dry works on, respectively, Nazi Germany and Soviet Russia) I have written and presented popular history.
The new national curriculum is not flawless, to be sure. It runs counter to the advice I gave Gove by being much too prescriptive. The 34 topics to be covered by pupils between the ages of seven and 14 already read a bit like chapter titles and, if there is one thing I hope we avoid, it is an official history textbook (even if it's written by Simon Schama).
But to caricature it as an unfunny version of 1066 and All That is the kind of disingenuous misrepresentation of a document that Richard Evans would denounce as professional misconduct if he were not the historian doing it.
Among other things, the national curriculum explicitly aims to ensure that all pupils "know and understand the broad outlines of European and world history: the growth and decline of ancient civilisations; the expansion and dissolution of empires"; that they "understand historical concepts such as continuity and change, cause and consequence, similarity, difference and significance"; and that they "understand how evidence is used rigorously to make historical claims".
At key stage 1, children will be introduced to "basic concepts" such as nation, civilisation, monarchy, parliament, democracy, war and peace. At key stage 2, they will study the ancient civilisations of Greece and Rome. As for "the essential chronology of Britain's history", to which Evans and Priestland object so strongly, it is a model of political correctness: not only Mary Seacole but also Olaudah Equiano makes the cut – hardly escapees from Our Island Story.
Quite why the professors feel obliged to defend a status quo that so many teachers, parents and pupils agree is indefensible I cannot work out. Is it sheer ignorance? Or partisan prejudice?
Surely they can't sincerely think it's acceptable for children to leave school (as mine have all done) knowing nothing whatever about the Norman conquest, the English civil war or the Glorious Revolution, but plenty (well, a bit) about the Third Reich, the New Deal and the civil rights movement?
This national curriculum isn't perfect, but it's a major improvement. It's supposed to be a "framework document for consultation". At least we now know two people Gove need not consult.
The British—slightly less than a thousand of them—used to govern India. Without air-conditioning.
Conan O’Brien was not the only one who watched the London Olympic opening ceremonies with amazement. “Hard to believe my ancestors were conquered by theirs,” he tweeted. Every Indian watching must have been thinking the very same.
Until their TVs went dark.
The recent power outage in India interested me more than the Olympics. (I had a very British reaction to the opening ceremonies: I found them excruciatingly embarrassing.) The Indian blackout was surely the biggest electricity failure in history, affecting a staggering 640 million people. If you have ever visited Delhi in the summer, you will have some idea what it must have felt like.
“Every door and window was shut,” Rudyard Kipling recalled of summer in the scorched Indian plains, “for the outside air was that of an oven. The atmosphere within was only 104 degrees, as the thermometer bore witness, and heavy with the foul smell of badly-trimmed kerosene lamps; and this stench, combined with that of native tobacco, baked brick, and dried earth, sends the heart of many a strong man down to his boots, for it is the smell of the Great Indian Empire when she turns herself for six months into a house of torment.”
There was a reason the British moved their capital to the cool Himalayan hill station of Simla every summer. Maybe today’s Indian government should consider following their example. Because power failures like this are not about to get less frequent. On the contrary, the outage has exposed the single greatest vulnerability of the Asian economic miracle: it is fundamentally underpowered.
In the past 10 years, according to the energy giant BP, India’s coal consumption has more than doubled, its oil consumption has increased by 52 percent, and its natural-gas consumption has jumped by 131 percent. For China the figures are, respectively, 155 percent, 101 percent, and 376 percent. Asia as a whole is insatiably guzzling fossil fuels. And this is not about to stop. The McKinsey Global Institute expects India’s economy to grow at an average rate of between 7 percent and 8 percent from now until 2030.
The good news is that all this growth will do something (though not enough) to compensate for the depressed state of the indebted developed economies of the United States and Europe. The bad news—apart, of course, from the soaring CO2 emissions—is that Asia’s creaking institutions may not be able to cope with the staggering social consequences.
According to McKinsey, India’s urban population will increase from 340 million in 2008 to around 590 million in 2030. By then, India will have 68 cities with populations of more than 1 million, including six megacities with populations of 10 million or more, of which two—Mumbai and Delhi—will be among the five biggest cities in the world.
To cope with this breakneck urbanization, India needs to invest $1.2 trillion over the next 20 years to upgrade the infrastructure of its cities. Mumbai alone needs $220 billion. Will it happen? In India, there is a sideways movement of the head that means neither “Yes” nor “No,” but “Please don’t ask that.”
India’s electricity grid has missed every capacity addition target since 1951. The system is so dilapidated that 27 percent of the power it carries is lost as a result of leakage and theft. Even today, 300 million people—a quarter of the population—don’t have access to the grid. That’s one reason the blackout didn’t spark more public ire.
The root of the problem is one of many leftovers of India’s post-independence experiment with socialism. Half of India’s power stations are coal-fired. Indian coal is produced by a state monopoly (Coal India). The price is controlled by the state, as is the price of electricity itself. The private firms running power stations are trapped between a lump of coal and a hard place. They cannot even trust the regional distributors to order the right amount of power.
In effect, Indians have a National Power Service similar in many ways to the National Health Service their former rulers in Britain are so proud of. Which brings me back to the Olympics. Surely the most embarrassing thing about Danny Boyle’s opening extravaganza was the surreal dance routine involving 1950s-era hospital beds and nurses. Considering just how bad the NHS is in any meaningful international comparison, you have to wonder what the Indian equivalent would be. How about a stadium full of coal-fired power stations, all dancing in the dark?
Something to look forward to at the 2028 Mumbai Olympics.
Are you a technoptimist or a depressimist? This is the question I have been pondering after a weekend hanging with some of the superstars of Silicon Valley.
I had never previously appreciated the immense gap that now exists between technological optimism, on the one hand, and economic pessimism, on the other. Silicon Valley sees a bright and beautiful future ahead. Wall Street and Washington see only storm clouds. The geeks think we’re on the verge of The Singularity. The wonks retort that we’re in the middle of a Depression.
Let’s start with the technoptimists. Last Saturday I listened with fascination as a panel of tech titans debated the question: “Will science and technology produce more dramatic changes in solving the world’s major problems over the next 25 years than have been produced over the last 25 years?”
They all thought so. We heard a description of what Google’s Project Glass, the Internet-enabled spectacles, can already do. (For example, the spectacles can be used to check if another speaker is lying.) Next up: a search engine inside the brain itself. We heard that within the next 25 years, it will be possible to take 1,000-mile journeys by being fired through tubes. We also heard that biotechnology will deliver genetic “photocopies” of human organs that need replacing. And we were promised genetically engineered bugs, capable of excreting clean fuel. The only note of pessimism came from an eminent neuroscientist, who conceded that a major breakthrough in the prevention of brain degeneration was unlikely in the next quarter century.
For a historian, all this technoptimism is hard to swallow. The harsh reality, as far as I can see, is that the next 25 years (2013-2038) are highly unlikely to see more dramatic changes than science and technology produced in the last 25 (1987-2012).
For a start, the end of the Cold War and the Asian economic miracle provided one-off, nonrepeatable stimuli to the process of innovation in the form of a massive reduction in labor costs and therefore the price of hardware, not to mention all those ex-Soviet Ph.D.s who could finally do something useful. The IT revolution that began in the 1980s was important in terms of its productivity impact inside the U.S.—though this shouldn’t be exaggerated—but we are surely now in the realm of diminishing returns (the symptoms of which are deflation plus underemployment due partly to automation of unskilled work).
The breakthroughs in medical science we can expect as a result of the successful mapping of the human genome probably will result in further extensions of the average lifespan. But if we make no commensurate advances in neuroscience—if we succeed only in protracting the life of the body, but not the mind—we will simply increase the number of dependent elderly.
My pessimism is supported by a simple historical observation. The achievements of the last 25 years were actually not that big a deal compared with what we did in the preceding 25 years, 1961-1986 (e.g. landing men on the moon). And the 25 years before that, 1935-1960, were even more impressive (e.g. splitting the atom). In the words of Peter Thiel, perhaps the lone skeptic within a hundred miles of Palo Alto: In our youth we were promised flying cars. What did we get? 140 characters.
Moreover, technoptimists have to explain why the rapid scientific and technological progress of those earlier periods coincided with massive conflict between armed ideologies. (Which was the most scientifically advanced society in 1932? Germany.)
So let me offer some simple lessons of history: More and faster information is not good in itself. Knowledge is not always the cure. And network effects are not always positive.
In many ways, the discussion I’ve just described followed logically from the previous week’s widely reported spat between Peter Thiel and Eric Schmidt at the “Brainstorm Tech” conference in Aspen, where Schmidt took the technoptimistic line and Thiel responded with a classic depressimistic question: Why, if information technology is so great, have median wages stagnated in the nearly 40 years since 1973, whereas in the previous 40 years, between 1932 and 1972, they went up by a factor of six?
By the same token, there was great technological progress during the 1930s. But it did not end the Depression. That took a world war. So could something comparably grim happen in our own time? Don’t rule it out. Let’s remind ourselves of the sequence of events: economic depression, crisis of democracy, road to war.
Talk to anyone who manages money these days and you will hear a doleful litany: the global economic slowdown, the persistence of unemployment, widening inequality, the problem of excessive debt, the declining effectiveness of monetary policy, and the looming fiscal cliff. Only last week, Ray Dalio—founder of the mega hedge fund Bridgewater—spoke of a “dangerous dynamic ... making a self-reinforcing global decline more likely.” With good reason, Dalio frets about the dangers of a “debt implosion” or currency breakup in Europe.
Disasters happen. Two hundred and fifty years ago, on November 1, 1755, the Portuguese capital, Lisbon, was flattened by an earthquake that killed thousands of its inhabitants. Like the hurricane that inundated New Orleans last week, the calamity inspired not only awe at the power of nature and sympathy for the helpless victims, but also all kinds of moral commentary. None was more profound than that of the French philosopher Voltaire.
To Voltaire, the destruction of Lisbon was proof that we do not live "in the best of all possible worlds" - a philosophical position associated with Gottfried Leibniz, but most pithily expressed in Alexander Pope's Essay on Man: Whatever is, is right. According to Leibniz, evil and suffering were integral parts of the order God had ordained. Though they might seem inexplicable to us, they were a vital part of the divine plan; the world would, paradoxically, be less perfect without them.
I wonder how many Southern preachers will venture that argument today, at a time when untold numbers of bodies are lying unburied in the streets of what used to be "the Big Easy", or floating in the toxic flood unleashed by Hurricane Katrina? Most, I suspect, will prefer to echo the prayer published by the United Church of Christ not long after the deluge: "Be present, O God, with those who are discovering that loved ones have died, that homes and jobs are gone. Embrace them in your everlasting arms.
"Be present, O God, with those who suffer today in shelters, hot and weary from too little sleep and too much fear. Let them know they are not alone."
No doubt those are appropriate things to be asking of God at a time like this, but they rather raise the question of where He was when Katrina burst the levee walls. I must say I prefer clergymen who, like Leibniz, at least address the issue of why God allows such horrors to happen.
Voltaire's answer was a classic statement of the atheist position. Disasters happen because there is no God. As he wrote to a friend, the Lisbon earthquake was "a cruel piece of natural philosophy! We shall find it difficult to discover how the laws of movement operate in such fearful disasters in the best of all possible worlds - where a hundred thousand ants, our neighbours, are crushed in a second on our ant-heaps, half dying, undoubtedly in inexpressible agonies, beneath debris from which it was impossible to extricate them. What a game of chance human life is!
"What will the preachers say?" asked Voltaire and he went on to express the hope that mankind might learn a lesson from the indiscriminate cruelty of the earthquake. It ought, he wrote, "to teach men not to persecute men: for, while a few sanctimonious humbugs are burning a few fanatics, the earth opens and swallows up all alike".
That, unfortunately, was wishful thinking. On the contrary, the most common human response to a natural disaster is to reaffirm rather than to repudiate religious faith. Religion, after all, has its prehistoric origins in man's desire to discern some purposeful agency in the workings of nature. The Old Testament, I need hardly remind you, interprets the flood of Noah's time as a divinely ordained purge of a sinful world. Perhaps predictably, the Methodist John Wesley attributed the Lisbon earthquake to "sin … that curse that was brought upon the earth by the original transgression of Adam and Eve".
In much the same way, religious and secular commentators alike have rushed to attach moral significance to the destruction of New Orleans.
Natural disasters, after all, are not like terrorist attacks. In the wake of 9/11 or the 7/7 bombings in London, we could focus our minds on the human perpetrators, struggling to make sense of their homicidal motives. With a hurricane, we need to be more creative. The banal response was, of course, to blame the city, state or federal authorities for sins of omission - a charge that prompted one of the city's former planning officials to declare defensively: "We are all responsible." For a hurricane?
The old-time hellfire and brimstone reaction would have been to interpret the inundation, John Wesley style, as a judgment on the city that brazenly calls itself "Party Town". But few Christian Churches risk such strong moral medicine these days.
No such inhibitions constrain today's Islamic extremists. The Associated Press reported that they "rejoiced in America's misfortune, giving the storm a military rank and declaring in internet chatter that 'Private' Katrina had joined the global jihad. With God's help, they declared, oil prices would hit $100-a-barrel this year."
It would be hard to get more tasteless. Yet the same underlying impulse - to interpret the disaster as confirmation of one's own ideological position - was at work among many American liberals too. Opponents of the war in Iraq were not slow to point out that National Guardsmen who should have been on hand to rescue hurricane victims were instead failing to prevent lethal stampedes in far-away Baghdad. The usual suspects could not resist pointing out that most of the people trapped in the flooded city were poor African-Americans, who lacked the means to flee the hurricane.
And, inevitably, environmentalists rushed to portray the storm as retribution for the Bush administration's refusal to sign the Kyoto Protocol. After all, they argued, our consumption of fossil fuels causes global warming, and global warming leads to more frequent "extreme weather events", not to mention rising sea levels. Could the prospect of even higher gasoline prices, as a direct result of storm damage, finally bring Americans to their senses about climate change?
Having recently shown one of my classes a map projecting the effects of rising sea levels on the eastern seaboard of the United States (guess which city disappears first?), I must confess that this was also my initial reaction. Only last week, after all, I was fulminating in this column about the way we pollute the world's oceans. It was only with difficulty that I banished the thought of Katrina as Neptune's vengeance.
The reality is, of course, that natural disasters have no moral significance. They just happen, and we can never exactly predict when or where. In 2003 - to take just a single year - 41,000 people died in Iran when an earthquake struck the city of Bam, more than 2,000 died in a smaller earthquake in Algeria, and just under 1,500 died in India in a freak heatwave. Altogether, at least 100 Americans were killed that year as a result of storms or forest fires.
Natural disasters - please, let's not call them "Acts of God" - killed many more people than international terrorism that year (according to the State Department, total global casualties due to terrorism in 2003 were 4,271, of whom precisely none were in North America). On the other hand, disasters kill many fewer people each year than heart disease (around seven million), HIV/Aids (around three million) and road traffic accidents (around one million). No doubt if all the heart attacks or car crashes happened in a single day in a single city, we would pay them more attention than we do.
As Voltaire understood, hurricanes, like earthquakes, should serve to remind us of our common vulnerability as human beings in the face of a pitiless Nature. Too bad that today, just as in 1755, we prefer to interpret them in spurious ways that divide rather than unite us.