I stopped reading science fiction when I turned 17. I thought reading history would give me better insights into the future. The trouble with sci-fi is that it always predicts 10 out of the next three technological innovations. The future is never as weird in reality as it is in sci-fi. Yet there was a flaw in my strategy. If the future is actually being made by sci-fi readers, ignorance of the genre may be a fatal blind spot.
A couple of weeks ago I had dinner in San Francisco with a bright young crypto crowd. Crypto is short for crypto-currency, the best-known use for blockchain, or distributed ledger, technology. The most famous crypto-currency is bitcoin. Crypto is the cool thing these days. The cool people who were at Facebook when it was cool, and who before that were at Google when it was cool, are now into crypto.
Steering the conversation away from blockchain and towards my uncool comfort zone, I asked if crypto people still read books. Yes, of course, but mostly science fiction. Such as? Well, obviously, Snow Crash, my host replied, in the way that I might once have said: “Well, obviously, The Day of the Triffids” — or Fahrenheit 451 or A Clockwork Orange, which were my favourite works of sci-fi when I was a kid.
I’d never heard of Snow Crash because it was published in 1992, long after I kicked the sci-fi habit. It turns out that the novel — by the American author Neal Stephenson — was once required reading for new Facebook recruits. I’ve now read it. So should you. And so should all those senators and representatives who wasted two days last week asking Mark Zuckerberg questions that were either easy for him to answer or easy for him to duck.
Snow Crash is set in an unspecified but not too distant future, so perhaps around now, more than a quarter-century after its publication. As with all sci-fi, quite a bit of the predicted future hasn’t come about (you guessed it: flying cars — still waiting). But an unusually large proportion has.
There are two worlds: reality and the Metaverse. Reality is a decaying, Balkanised America, a land in which the federal government has ceded much of its power to corporations, foreign interests (“Mr Lee’s Greater Hong Kong”) and organised crime. The rich inhabit fortified “burbclaves”. The poor live in containers. The privatised highways are so clogged with traffic that deliveries are made by cyber-punks on skateboards. Everyone’s armed to the teeth. There’s even a character with a nuclear torpedo in his Harley-Davidson’s sidecar.
The Metaverse, by contrast, is the next iteration of the internet: a multiplayer, virtual-reality mega-game populated by avatars, accessible through special goggles.
The defining characteristic of both worlds is that everybody’s private information is readily available, not only to the CIC (Central Intelligence Corporation), the for-profit organisation formed by the CIA’s merger with the Library of Congress, but also to whoever is willing to pay the CIC.
In other words, even if the highways of 2018 are not quite as Mad Max or Blade Runner as Stephenson imagined, much else about Snow Crash is eerily familiar. Globalisation having hollowed out most of the US economy, there are now “only four things we do better than anyone else: music, movies, microcode (software), high-speed pizza delivery”.
With astounding prescience, Stephenson imagines not only virtual reality — those VR goggles now exist and are on sale from Facebook-owned Oculus — but also Google Earth: “a piece of CIC software called, simply, Earth . . . [used] to keep track of every bit of spatial information that it owns — all the maps, weather data, architectural plans and satellite surveillance stuff”.
Artificial intelligence is here too. “For the most part I write myself,” explains an automated online search engine. “That is, I have the innate ability to learn from experience. But this ability was originally coded into me by my creator.”
The plot of Snow Crash hinges on the utter failure of the government to keep up with technology and the ability of bad actors to infect people’s brains (not just their avatars) with malware.
The character of L Bob Rife personifies Big Tech unbound. “Y’know, watching government regulators trying to keep up with the world is my favourite sport. Remember when they busted up Ma Bell [as the US telephone monopoly used to be known]? . . . They were in the same business as me. The information business. Moving phone conversations around on little tiny copper wires, one at a time. Government busted them up — at the same time when I was starting cable TV franchises in thirty states. Haw! Can you believe that? It’s like if they figured out a way to regulate horses at the same time the [Ford] Model T and the airplane were being introduced.”
L Bob’s dastardly masterplan is to infect the minds of all network users with Snow Crash, which is the mental equivalent of a complete hard-drive failure. (Stephenson got the idea when his early Apple Mac “crashed and wrote gibberish”, producing “something that looked vaguely like static on a broken television set”.)
What exactly is Snow Crash, asks the lead male character, Hiro Protagonist. “Is it a virus, a drug or a religion?”
“What’s the difference?” replies his former girlfriend.
I’m not saying that Zuckerberg is L Bob Rife. Nor am I saying Facebook is the CIC. I am saying that the people who questioned Zuckerberg so ineffectually on Capitol Hill last week need to read Snow Crash. And maybe they also need to stop taking campaign contributions from Facebook. Since 2014 Facebook has contributed a total of $641,685 to all but 16 of the 105 members of Congress that Zuckerberg faced last week.
Senator Orrin Hatch (Republican): “How do you sustain a business model in which users don’t pay for your service?”
Zuckerberg: “Senator, we run ads.”
Senator Lindsey Graham (R): “You don’t feel like you have a monopoly?”
Zuckerberg: “It certainly doesn’t feel like that to me.” [Laughter.]
Here are four reasons why much tougher questions were warranted last week. If you ever signed up for Facebook, hundreds of advertisers have your contact information, and Facebook has your entire contacts list, not to mention a complete list of all the times you logged in, which device you used and where you were. If you are an Android user, Facebook also logs your call and text history. If you log off Facebook, it can still track your browsing activity. And even if you never signed up for Facebook, the company may still have a “shadow profile” of you.
“Congress is good at doing two things,” Congressman Billy Long told Zuckerberg last week. “Doing nothing and overreacting.” He forgot about regulating horses.
In my naivety I thought last week that the game was finally up for Facebook. Having effectively turned a blind eye to the antics of both the Russians and Cambridge Analytica, Zuckerberg seemed certain to be roasted alive. Instead we got the political equivalent of Snow Crash.
I love diaries. Unlike memoirs, which are written long after the fact — with the benefit of hindsight and nearly always in a way that flatters the author — diaries are history in real time, as it was lived. True, the diarist also tends to make himself or herself the centre of events and does not always own up to sins of omission and commission, not to mention acts of downright stupidity. But writing soon after the heat of the moment, the diarist crystallises the reality of the human condition — a reality that historians too often understate — that the future is mostly unknowable.
Unlike the biographer, the diarist does not know which side will win the war, which titan will tumble, which bit player will rise to power. As the great historian of English law Frederic William Maitland observed: “We should always be aware that what now lies in the past once lay in the future.” Reading diaries is the best way to appreciate that.
For this reason, I loved Tina Brown’s The Vanity Fair Diaries, which cover the period between 1983 and 1992 when she edited the glossy magazine Vanity Fair. They tell, in prose that veers from the dazzling to the dutiful — depending on her mood and level of fatigue — several gripping stories.
There is the story of one of the first female editors of a top publication; the story of an immigrant making it big in the Big Apple; and the story of the central trilemma of the feminist era — that it is impossible simultaneously to be successful (and contented) as a professional, a mother and a wife. (In a trilemma, you can have two out of three, but not all three.) All of this kept me turning the pages.
The obvious historical value of Brown’s diaries is that they capture, vividly, the high noon and then the dusk of the Reagan era, as viewed from Manhattan. On Wall Street the 1980s were a time of deal-making and high-rolling, of junk bonds and boorish traders who gloried in the nickname “big swinging dicks”. Yet just a short distance away in bohemian neighbourhoods such as Greenwich Village the HIV/Aids epidemic was decimating the gay community.
Brown was an acute observer of such incongruities. Her account of the wedding of her friend Arianna Stassinopoulos to the Texan millionaire Michael Huffington perfectly captures the spirit of the age, with Henry Kissinger and Mort Zuckerman wisecracking their way through the over-the-top nuptials. (“The service swerved from High Church to Greek Orthodox, with crowns held aloft over the bride and groom. ‘What will the psalms be in? Aztec?’ muttered Henry Kissinger.”)
Nevertheless, the true gold buried in Brown’s diary is her portrait of a “sulky, Elvisy” property developer whom she features in her first Vanity Fair Christmas edition in 1984 simply “because he’s a brass act. And he owns his own football team. And he thinks he should negotiate arms control agreements with the Soviet Union.”
Three years later she decides to publish extracts of his autobiography, “which has a crassness I like . . . there is something authentic about [his] bullshit . . . it feels, when you have finished it, as if you’ve been nose to nose for four hours with an entertaining conman and I suspect the American public will like nothing better”.
The conman in question is of course Donald Trump, who flits through Brown’s diary like the Ghost of Christmas Yet to Come. Her first encounter with The Donald is at a dinner on September 25, 1987. “He was all over me,” she writes, “hoping to charm me into favourable presentation in the mag. At dinner [he] started bombarding me with interest. ‘Tina,’ he shouted, ‘what do you think of the Newsweek cover story on me?’
“‘I haven’t read it,’ I told him. ‘You know, Tina, I could have had Time. They wanted me and I saw them, too. But Newsweek scooped them. Who do you think’s better, Tina, Newsweek or Time?’ ‘Time,’ I said mischievously. ‘You really think so, Tina, you really think so?’ His pouty Elvis face folded into a frown of self-castigation. ‘I guess it sells more,’ he said in a tormented tone. ‘I guess it does.’”
It gets better. “‘You know what?’ Trump continued shouting across to me. ‘Went to the opening of the Met last night. Ring Cycle. Placido Domingo. Five hours. Dinner started at twelve. Beat that. I said to Ivana, what, are you crazy? Never again.’”
Brown’s account of the party Trump threw for his book The Art of the Deal is one that future historians will savour. All she gets wrong is to call it “the gaudy postscript to an era’s boom and crash”. For we now know that Trump’s party was the preamble to an even gaudier, boomier, crashier era.
“Trump himself looked sleek and starry as a prosperous young seal in his tux and white evening scarf. ‘Can you believe this party!’ he kept exclaiming. ‘No, seriously, can you believe it? Love your magazine! Beautiful piece on Ivana. Byoodiful!’”
We next encounter Donald and Ivana in the throes of divorce, an event covered in depth by Vanity Fair. The journalist Marie Brenner sides with Ivana, portraying Trump as “a brutish, philandering husband” so addicted to “lying and loud-mouthing . . . that it’s incredible he still prospers and gets banks to loan him money”.
“He’s like some monstrous id creation of his father,” Brown writes, “a cartoon assemblage of all his worst characteristics mixed with the particular excesses of the new media age. The revelation that he has a collection of Hitler’s speeches at the office is going to make a lot of news.”
Trump took his revenge in characteristic fashion. It was December 10, 1991, and the occasion a black-tie gala to mark the opening of Barbra Streisand’s latest film. Brenner was “sitting demurely . . . when she felt something cold and wet running down her back. Unwilling to embarrass the waiter, she didn’t turn around. Until the other guests at the table started pointing and yelping, ‘Oh my God! Look what he just did!’” She was just in time to spot Trump’s “familiar Elvis coif making off across the Crystal Room”. He had poured a glass of wine over her.
For the diarist, this is the last nail in the coffin of Trump’s reputation. “The sneaky, petulant infant,” she scribbles indignantly. “What a coward!” The historian can only doff his cap in gratitude. For here we see, unvarnished and unmistakeable, the man who would be president. And here we also see the fateful inability of the New York elite to foresee his irresistible rise.
It was on September 2, 1987, that Trump took out a full-page newspaper advertisement with the headline: “There’s nothing wrong with America’s foreign defence policy that a little backbone can’t cure.” In it he called on the United States to cease spending money on Middle Eastern peacekeeping that mainly benefited the Saudis and Japanese, to “end our vast deficits by making Japan, and others who can afford it, pay” and to “reduce our taxes, and let America’s economy grow”.
How Vanity Fair laughed derisively at what is now (with China taking the place of Japan) the foreign policy of the 45th president of the United States of America, Donald J Trump: the Elvisy revenge of the 1980s.
Niall Ferguson is the Milbank Family senior fellow at the Hoover Institution, Stanford
It is not very fashionable to be a man these days, especially a white one. After the exposure of Harvey Weinstein’s record of alleged sexual assault and harassment, The New York Times ran a piece entitled “The unexamined brutality of the male libido” by the Canadian writer Stephen Marche.
“The masculine libido and its accompanying forces and pathologies drive so much of culture and politics and the economy,” wrote Marche, adding that “the point of Freud was not that boys will be boys. Rather the opposite . . . If you let boys be boys, they will murder their fathers and sleep with their mothers.” Right.
“Masculinity, not ideology, drives extremist groups,” was another recent headline that caught my eye, this time in The Washington Post. This was a review of a book by a sociologist named Michael Kimmel, whose theory is that masculinity is both “the psychological inspiration” that leads young men to join Islamist and neo-Nazi groups “and the social glue that keeps them involved”. Got it.
I have had to listen to a variation on this theme rather too much in recent weeks. Last month I organised a small, invitation-only conference of historians who I knew shared my interest in trying to apply historical knowledge to contemporary policy problems. Five of the people I invited to give papers were women, but none was able to attend. I should have tried harder to find other female speakers, no doubt. But my failure to do so elicited a disproportionately vitriolic response.
Under a headline that included the words “Too white and too male”, The New York Times published photographs of all the speakers as if to shame them for having participated. Around a dozen academics took to social media to call the conference a “StanfordSausageFest”.
So outraged were Stanford historians Allyson Hobbs and Priya Satia that they demanded “greater university oversight” of the Hoover Institution, where I work. Other Stanford institutions had embraced diversity, but Hoover had “proved impervious to the demographic changes transpiring in the academy.” It was “an ivory tower in the most literal sense”. The most literal sense?
Now let’s be clear. As I recently and rather vehemently explained to the novelist Will Self, I was raised to believe in the equal rights of all people, regardless of sex, race, creed or any other difference. That the human past was characterised by discrimination of many kinds is not news to me. But does it really constitute progress if the proponents of diversity resort to the behaviour that was previously the preserve of sexists and racists? Publishing the names and mugshots of conference speakers is the kind of thing anti-semites once did to condemn the “over-representation” of Jewish people in academia. Terms such as “SausageFest” belong not in civil academic discourse but in the pages of male-chauvinist comics such as Viz.
What we see here is the sexism of the anti-sexists; the racism of the anti-racists. In this Through the Looking Glass world, diversity means homogeneity. I was struck by the objection of professors Hobbs and Satia that, whereas Stanford has the “high-minded purpose” of “fostering education, research and creativity for the benefit of humanity”, the Hoover Institution’s values are “very different . . . economic freedom, private enterprise, and commitment to facts and reason”. Good grief, not those discredited tenets of white patriarchy!
“The whitesplaining of history is over,” declared another heated article by Satia last week. The historian’s role, she explained, was not to help improve policy but to be a “critic of government . . . to speak to the public, so that people may exert pressure on their elected representatives”. Her exemplar in this regard? Step forward the very white, very male British social historian EP Thompson.
Hideous Newspeak terms such as “whitesplaining” and “mansplaining” are symptoms of the degeneration of the humanities in the modern university. Never mind the facts and reason, so the argument runs; all we need to know — if we don’t like what we hear — are the sex and race of the author.
Speaking up against this kind of thing is a risky business. Questioning the new orthodoxy on the identity of the sexes can get you fired — ask James Damore, who lost his job as an engineer at Google for doing just that. Asserting that there may actually be scientifically identifiable differences between the sexes will get you “no-platformed” — ask Christina Hoff Sommers, whose book The War Against Boys (2000) pointed out the growing discrimination against male students in American education, and whose lectures are regularly disrupted by radical feminists.
The process of indoctrination starts early. My six-year-old son stunned his parents the other day when we asked what he had been studying at school. He replied that they had been finding out about the life of Martin Luther King Jr. “What did you learn?” I asked. “That most white people are bad,” he replied. This is America in 2018. The point, as I tried to explain to him, is rather different. Quite a lot of people of all skin colours are bad, but probably not a majority. Most people of all skin colours want to be good, but they are in various ways weak. And a few people of all skin colours are brave.
Courage is not gender-specific, either. My son’s mother, a true feminist, is the bravest person I know. But it is worth pondering why, for most of history, men were encouraged to show physical courage. As the Harvard political theorist Harvey Mansfield has pointed out, the ideal of manliness is perhaps best described as “confidence in a situation of risk”. He argues that the feminist campaign against manliness is misguided. “We are in the process,” Mansfield writes, of making manliness “the essence of the . . . evil we are eradicating”.
Yet manliness has its uses. Just over a week ago, a 44-year-old French policeman, Lieutenant-Colonel Arnaud Beltrame, confronted an Islamist who had burst into a supermarket in southern France, killed two people and taken a 40-year-old checkout worker hostage. Beltrame offered to take her place. He gave up his weapon and put himself in the murderer’s power. After a standoff, the terrorist shot him several times, whereupon police stormed the store. Although Beltrame was flown to hospital, he died a few hours later, his wife at his side.
A graduate of the elite military school St Cyr and a special forces gendarme who served with distinction in Iraq, Beltrame was a product of just the kind of education we are supposed now to disdain. But as his wife said after his death: “He was motivated by very high moral values, the values of service, generosity, giving oneself, abnegation.”
Beltrame died that another might live. His sacrifice was not at the level of Christ’s — who “died for all, that they which live should not henceforth live unto themselves” (2 Corinthians 5:15) — but it was and is the essence of heroism. This Easter, let us turn away from all discrimination — the new kind as well as the old — and remember that true courage is remarkable not for its colour, or its sex, but for its rarity.
In George Orwell’s Nineteen Eighty-Four, the telescreen is the primary tool of totalitarian surveillance. It is, in Orwell’s words, “an oblong metal plaque like a dulled mirror which formed part of the surface of the wall . . . The instrument could be dimmed, but there was no way of shutting it off completely.
“The telescreen received and transmitted simultaneously. Any sound that Winston made, above the level of a very low whisper, would be picked up by it; moreover, so long as he remained within the field of vision which the metal plaque commanded, he could be seen as well as heard. There was of course no way of knowing whether you were being watched at any given moment . . . You had to live — did live, from habit that became instinct — in the assumption that every sound you made was overheard and, except in darkness, every movement scrutinised.”
Winston Smith (“6079 Smith W”) knows to keep his back to the telescreen as much as possible (“though, as he well knew, even a back can be revealing”) and, when facing it, to wear an “expression of quiet optimism”. Under its unblinking gaze he has to participate in mandatory physical exercise — and the telescreen shrieks at him if he bends his knees when touching his toes.
“It was terribly dangerous to let your thoughts wander when you were in any public place or within range of a telescreen . . . To wear an improper expression on your face was itself a punishable offence. There was even a word for it in Newspeak: FACECRIME, it was called.”
For most of my life, ever since I read Orwell as a teenager, I have thanked God that I didn’t end up as a citizen of Airstrip One, living my life as a helot in thrall to Big Brother. It was not long after the actual year 1984, when I made my first visit to the Soviet Union, that I realised a significant part of humanity was in precisely that situation.
The Soviets lacked the technological skill to create the telescreen, but their system of surveillance — based on countless concealed microphones and cameras — did the job. Everything else about Soviet life was straight out of Nineteen Eighty-Four, particularly the disconnect between the strident propaganda (the pig-iron statistics and the military parades) and the dispiriting shabbiness of everyday life. How relieved I felt to return to capitalism and democracy.
Little did I know that the freest society in history — that of northern California — was already hard at work on the technology that would not only match but exceed the telescreen as a tool of surveillance.
The internet and the worldwide web, according to Silicon Valley pioneers such as John Perry Barlow, were supposed to create a libertarian paradise where netizens could roam free, beyond the reach of Big Brother and his ilk. As for making money . . . dude, the whole idea was just to connect the world.
“Facebook was not originally created to be a company,” wrote Mark Zuckerberg, its chairman and chief executive, on the eve of its initial public offering. “It was built to accomplish a social mission — to make the world more open and connected.” He had told The Harvard Crimson in 2004, only five days after the launch of Thefacebook, that his aim was not to make money: “I’m not going to sell anybody’s email address.”
Five years later, by which time Facebook had about 200m users, Zuckerberg was asked by the BBC: “So who is going to own the Facebook content? The person who puts it there or you?” He replied: “The person who puts the content on Facebook always owns the information.”
BBC: “And you won’t sell it?”
MZ: “No, of course not.”
This was a disingenuous reply. To be sure, Zuckerberg has not — strictly speaking — sold Facebook users’ data. But he did not become a multibillionaire because all 2bn users mailed him 20 bucks to say, “Thanks for making the world more connected!”
In 2007 Facebook opened its platform, allowing outside developers to build apps within its site — a decision that proved hugely popular as Facebook-based games proliferated. Developers could also sell their own sponsored advertisements within those apps.
Zuckerberg’s pursuit of advertising revenue nearly backfired with the introduction of Beacon, which broadcast users’ purchases on partner websites to their friends without clear consent. It fell to Sheryl Sandberg to make the transition to an advertising revenue model a success, as she had already done at Google.
The crucial difference was that Google simply helped people find the things they had already decided to buy, whereas Facebook enabled advertisers to deliver targeted messages to users, tailored to meet the preferences they had already revealed through their Facebook activity. Once adverts were seamlessly inserted into users’ news feeds on the Facebook mobile phone app, the company was on the path to vast profits, propelled by the explosion of smartphone usage.
The smartphone is our telescreen. And, thanks to it, Big Zucker is watching you — night and day, wherever you go. Unlike the telescreen, your phone is always with you. Unlike the telescreen, it can read your thoughts, predicting your actions before you even carry them out. It’s just that Big Zucker’s 24/7 surveillance isn’t designed to maintain a repressive regime. It’s simply designed to make money.
The only law of history is the law of unintended consequences. Is anyone — apart from Zuckerberg, that is — really surprised that, during the eight-year period when app developers had free access to Facebook users’ data, unscrupulous people downloaded and used as much as they could? Do we seriously believe that Aleksandr Kogan and Cambridge Analytica are the only ones who did this? Can you give me one good reason why, after President Barack Obama and his minions smugly boasted about their use of Facebook in his 2012 re-election campaign, Donald Trump’s campaign was not entitled to try similar methods four years later?
So it goes, Mark. You set out to make the world more connected. You end up helping to elect President Trump, whose goal is — as we saw last week — the exact opposite. And all because you got greedy. You took Trump’s money. And you took Vladimir Putin’s, too.
The reputational damage has now been done. Regulation is coming, not to mention hefty fines. (As Zuckerberg himself said last week: “I actually am not sure we shouldn’t be regulated.”) But the big question is how many people will actually leave Facebook.
In Nineteen Eighty-Four, only members of the party elite are allowed to turn off their telescreens. For everyone else they are compulsory. But our iTelescreens are different, for we are addicted to them. As Sean Parker, the company’s first president, recently admitted, Facebook was set up to exploit “a vulnerability in human psychology” by delivering “a little dopamine hit every once in a while”.
It took torture — followed by copious amounts of gin — finally to convince Winston Smith that “he loved Big Brother”. In that respect, too, Zuckerberg has gone one better than Orwell: “It was all right, everything was all right, the struggle was finished. He had won the victory over himself. He liked Big Zucker.”
Niall Ferguson’s The Square and the Tower will soon be available in paperback from Penguin