24 March 2021

The 1990s: An age without qualities

Often heralded as the best decade ever, the 1990s brought dark warnings about the future – and many have come to pass.  

By Gavin Jacobson

If the Roaring Twenties, the Swinging Sixties and the Decadent Eighties are easily described, the 1990s have proven more elusive. The decade resists characterisation partly because it is such recent history, but contemporaries were also confused about what it was all really about. In 1995, the New York Times invited readers to label the era in which they were living. The paper argued that, more than five years after the fall of the Berlin Wall, the term “post-Cold War era” was tentative and carried an air of “self-doubt”.

The suggestions readers submitted were overwhelmingly pessimistic and included the “Age of Uncertainty”, “Age of Fragmentation”, the “Age That Even Historians from Harvard Can’t Name” and the “Era of Interregnum, an age that cannot last”.

Such mockery and despair now seem excessive when set against the historical record. Squeezed between the class wars of the 1980s and the conflicts of the new millennium, the 1990s were, by historical standards, mild and orderly. In a 1994 essay, “Brave New World”, the sociologist Anthony Giddens proclaimed societies to be increasingly cosmopolitan, individualist and socially progressive. This was the mantra Tony Blair embraced as he reformed the Labour Party and cruised to power in 1997.

In international affairs, the decade witnessed the diplomatic codas to bloodier, more divisive times: Germany was reunified (1990); Nelson Mandela was released from prison (1990) and apartheid ended in South Africa (1994); the Oslo Accords (1993) seemed to have established a framework for peaceful co-existence between Israel and the Palestinians; and the Good Friday Agreement was signed (1998).



The first Gulf War – a spectacle of laser-guided bombs and Apache gunships – is remembered less as a conflict in which 100,000 Iraqi soldiers were killed than as a showcase of America’s “smart” materiel. It also turned Norman Schwarzkopf into the first US military celebrity in a generation (“Talk to me, General Schwarzkopf, tell me all about it,” Madonna sang during her performance at the 1991 Academy Awards).

The 1990s heralded an international order based on privatisation and deregulation of national economies. “Global governance” became the new shibboleth of international relations, reflecting the diminished status of national sovereignty. There were hopes that an order of rules and assemblies, created under the guardianship of US supremacy (what the New York Times columnist Thomas Friedman called “the hidden fist”), would manage interdependencies across borders, as well as create a perpetual peace.

The term “third world” was dropped for “developing world”, signifying the possibility of economic uplift in the Global South. The extension of free markets sustained by efficiencies in transport and communications led to a surge in capital mobility and economic growth.

Speaking before Congress on 11 September 1990, George Bush claimed a new world order was emerging, one that was “freer from the threat of terror, stronger in the pursuit of justice, and more secure in the quest for peace. An era in which nations of the world, east and west, north and south, can prosper and live in harmony.” 

***

It was all premature. Once the euphoria of 1989 had faded, the West became burdened by nightmares of crisis, purposelessness and confusion. Rather than being a moment of triumphant release after a prolonged period of extreme tension, the 1990s became a decade of disorientation and division; a period without any of the anchoring coordinates or intellectual focus of the Cold War. Looking back at that time, there is a distinct split between the decade’s self-understanding – how it was imagined and anticipated – and the material ways in which it was actually experienced.

In Don DeLillo’s epic novel Underworld (1997), Marvin Lundy, a collector of baseball memorabilia, tells Brian Glassic that the Cold War was good because “it’s the one constant thing. It’s honest, it’s dependable. Because when the tension and rivalry come to an end, that’s when your worst nightmares begin.” Lundy says that once the threat of superpower confrontation disappeared, Glassic, like everyone else, would simply become a “lost man of history”.

The West may have cheered the fall of the Berlin Wall. But the image of “the lost man of history” captured a more basic truth about the 1990s. The overwhelming sense was that the new world order bore no resemblance to those dreamlands promised by the Cold War pursuit of freedom. Democracy and the free market may have prevailed, but “what, in the wake of this great ideological victory”, the strategist Zbigniew Brzezinski asked in 1991, “is today the substance of our beliefs?” The fear was: not much.

In the 1980s, despite the prosperity of the Ronald Reagan years and Margaret Thatcher’s dominance, Anglo-American intellectuals were haunted by the prospect of decline. This was especially true in the United States, where the stock market crash of 1987 had brought the New York Stock Exchange to the brink of closure and scarred the economy. The warning signs of a major economic downturn were also visible: declining competitiveness, rising foreign debt, and underinvestment in public services. In Day of Reckoning (1988), the economist Benjamin M Friedman warned that the US could expect not only a decline in international influence but, with a national debt of $2.8trn (most of it foreign-owned), the end of its sovereignty, too. Friedman predicted “dangerous frictions” in society, as the “resentments of renters against landlords and workers against owners increasingly take on nativist dimensions”.


The historian Paul Kennedy’s The Rise and Fall of the Great Powers, in which he argued that the US was cracking under the expense of maintaining an empire, became a surprise bestseller when it was published in 1987 and was read by panjandrums in Washington. Its cover depicted what Kennedy described as “medieval wheels of fortune”, with Uncle Sam at the top about to be supplanted by “an Oriental-looking gentleman bearing the flag of the rising sun”.

Eight months before the fall of the Berlin Wall in 1989, Kennedy asked if the US could remain on top. “The end of the century is coming,” he wrote, “and serious citizens of the world’s number one power are beginning to get worried.” The US faced a more competitive international environment. The European Community, with its larger population and gross national product, was preparing to deepen and extend its integration under the Maastricht Treaty (signed in 1992). Japan, the financial and high-tech centre of the world, had a faster-growing economy, while China’s growth rate was accelerating. 

***

In the 1990s this declinism gave way to doomsterism. “We have new visions of choking, collapsing, crime-and-drug ridden cities,” wrote the English critic and novelist Malcolm Bradbury in 1993. It was a time of “wasted landscapes, fundamentalist conflicts and genocidal wars, shrinking ice caps, the widening of the ozone hole. If sensations of transition and nameless uncertainty regularly afflict the ending of centuries and the great turnings of the historical clock, then our own times are no exception.”

In all areas of life – technological, cultural, and political – progress was seen to unfold hand in hand with barbarism. Web browsers such as Netscape, portals such as Yahoo! and the development of cybernetics gave people unprecedented ways to communicate and access information. Cyber-utopians believed cyberspace would release people from the oppressions of government, eradicate inequalities, boost democratic participation, strengthen associational life and end war. In 1997 Nicholas Negroponte, the head of MIT’s Media Lab, assured people that the internet would bring peace by eliminating national borders. Twenty years from now, he declared, children who are used to discovering other countries through surfing the web “are not going to know what nationalism is”.

But cyberspace also produced concerns about surveillance, conspiracy theories, the unpoliced gathering of far-right extremists, and corporate power. The web became publicly available in 1991, provoking a kind of existential bewilderment. The traditional master narratives through which societies had made sense of the world – enlightenment and counter-enlightenment, revolution and reaction, left and right, communist and capitalist, modern and postmodern – were deposed by something unprecedented; the laws of history now felt less prescriptive, less tangible and reassuring. The internet represented a global monoculture that exerted greater hegemonic mastery than any previous innovation or ideology since the Enlightenment.

The fear was that individuals now existed in a timeless in-between world, detached from history. People were more connected than ever before, yet the infinite horizons of the web led to feelings of loneliness and alienation. For Marc Augé, the French anthropologist and author of Non-Places (1992), modern life occurred in netherworlds beyond history and social relations. Along with supermarkets, airports, hotels and highways, the internet was another “non-place” that gave the illusion of being part of a global community that was never there.

Openness was increasingly thought of as being tantamount to imprisonment – the idea that the internet was not so much liberating as it was a network from which no one could escape. In 1990 the philosopher Gilles Deleuze warned that the Global Information Society was really a society of control: “We may come to see the harshest confinement as part of a wonderful happy past. The quest for ‘universals of communication’ ought to make us shudder.”

People also began to recognise that cyberspace would not replace political hierarchies with networked communities, but would lead us to sell ourselves.  In 1994 Carmen Hermosillo, a research analyst, denounced the rise of chat rooms and platforms such as AOL and CompuServe. It was “fashionable”, she wrote, “to suggest that cyberspace is some island of the blessed where people are free to indulge and express their individuality. This is not right. I have seen many people spill out their emotions – their guts – online and I did so myself until I started to see that I had commodified myself.” Cyberspace, she argued, was simply where people’s thoughts became commodities for the very websites on which they posted things.

Literature and film became preoccupied with the threats that virtual reality and the web posed to society, from the surveillance terrors in Neal Stephenson’s novel Snow Crash (1992) to machines using virtual reality to imprison humans in the Wachowskis’ The Matrix (1999).

Other technological breakthroughs invited their own horrors. The cloning of Dolly the sheep in 1996 and the Human Genome Project raised ethical questions about experiments with evolution. Anxieties shifted from nuclear war to genetic manipulation, as science fiction about man’s perfectibility became a nightmarish possibility. In 1992, heeding Rosa Luxemburg’s famous slogan, Eric Hobsbawm warned that “the real alternative of 20th-century history was ‘socialism or barbarism’. We don’t have socialism: let us beware of the rise of barbarism, especially barbarism combined with high technology.” 

***

Technology was key to the speed and scale of economic growth during the decade. In The Roaring Nineties (2003), the Nobel Prize-winning economist Joseph Stiglitz recalled how the recovery from a recession in 1991 “seemed to defy what was universally taught in economics courses”. The boom times had returned, as financial services replaced manufacturing and any lingering Keynesianism from the 1980s was erased.

In the US, Bill Clinton declared in his 1996 State of the Union address that “the era of big government is over”. From the creation of the North American Free Trade Agreement (Nafta) in 1994 to the repeal in 1999 of the Glass-Steagall Act (which had separated commercial banking from investment banking), Clinton became the executor of the age of market-driven prosperity.

In Britain, New Labour embarked on a programme of investment in schools, the health service and benefit and tax credit entitlements for low-income families. But it did all this as the heir to Thatcher. The dominance of finance over manufacturing led to decayed industrial heartlands as economic and political power was concentrated in the south. The City of London became a redoubt of shadow banking and obscure financial alchemy, and a place from which American financiers could operate beyond the reach of US law.

The economic boom of the 1990s made some people rich, and others very rich. But it was hardly stable. In False Dawn: The Delusions of Global Capitalism (1998), John Gray condemned “the permanent revolution of the free market”. So too did the strategist Edward Luttwak, who despaired of the idea that “turbo-capitalism” had become the capstone of human achievement.

The emerging-market failures of the decade – Mexico (1995); Malaysia, Korea, Thailand, Indonesia (1997); Russia (1998); Brazil (1999); and Argentina (2001) – showed that however much globalisation was, from the point of view of economic growth, a success story, it was blighted by crises. The more countries were wired into the global economic system, the more unstable things became.

This created the conditions for political reaction. In India, a programme of neoliberal reforms from 1991 exacerbated the deep inequities and discriminations in society and helped give rise to Hindu nationalism. The novelist Anita Desai wrote that a wave of resentment “swelled even as cities flourished, skyscrapers rose into the sky, and streets resounded with traffic… To live in India today,” she said, “is to live in a constant state of tension, conscious of the explosive forces building up under a surface no longer calm and likely to erupt at any moment.”

The same was true in Europe. Political parties doubled down on the neoliberal policies of the 1970s and 1980s. The detritus of postwar social democracy was cleared away for the privatised initiatives of the market. The gap between rich and poor grew, stable jobs became harder to come by, and fears over immigration and open borders intensified. In a 1988 survey 18 per cent of respondents in the EC countries had wanted the rights of immigrants restricted; by 1991, it was 33 per cent.

New populist parties such as Lega Nord in Italy (founded in 1991), Ukip (founded in 1993), Greece’s Golden Dawn (which registered as a political party in 1993) and the Danish People’s Party (formed in 1995) drew support from this growing disapproval of immigrants and asylum seekers. They attracted some working-class voters who had formerly supported social democrats and socialists. Jean-Marie Le Pen’s National Front began winning support in blue-collar towns in northern France (where the communists had once been strong) and in the industrial valley of the Loire. “We are the party of the working class,” Le Pen bragged in 1995. The same was true in Austria – in the 1986 elections 10 per cent of the far-right Freedom Party’s voters were blue-collar workers; by 1999, 47 per cent were.

The populist backlash in the US was arguably more significant for its lasting effects. From the early 1990s, Washington fell into a state of perma-scandal, partisan attack and obstruction. Under the leadership of Newt Gingrich, the Republican Party abandoned the idealism of late-stage Reaganism. Instead, it channelled reactionary visions of national decline and fall that had been metastasising in America’s churches, gun associations, radio shacks, veteran societies, anti-tax parties and nationalist groups.

Writing in the National Interest in 1990, the conservative pundit and politician Pat Buchanan demanded “a new nationalism… that puts America first and, not only first, but second and third as well”. This was his campaign message during the 1992 Republican primaries. “When we take America back,” he promised supporters, “we are going to make America great again, because there is nothing wrong with putting America first.” 

***

In the US, political life was defined by what James Davison Hunter termed Culture Wars (1991).  As Buchanan described it, the culture war was the war “for the soul of America”. It pitted those who saw morality as progressive and universal against those who saw it as fixed and indigenous.

Violence and polarisation followed: anti-abortion extremists blew up clinics and murdered physicians; there were violent showdowns between federal agents and survivalists such as Randy Weaver, and a nationwide manhunt for Ted Kaczynski (the Unabomber); in 1995 the Gulf War veteran Timothy McVeigh committed the deadliest act of domestic terrorism in US history when he blew up the Alfred P Murrah Federal Building in Oklahoma City, killing 168 people; the National Rifle Association (NRA) went from being a sporting organisation to a radical conservative pressure group; and Bill Clinton was impeached. Far from cruising on the calm waters of post-history, American life seethed with fury and division.

So, too, did life in Britain, where New Labour’s “fizzling rhetoric about change and modernisation”, as the theorist Tom Nairn put it, triggered culture wars over everything from fox hunting and crime to homosexuality and the European Union.

These culture wars were the result not of ideological polarisation but of a political consensus that, in the US, neither leading Democrats nor Republicans did much to disturb. What the American journalist Steve Kornacki has described as the “tribalism” of the decade was actually fratricide.

Economically, Democrats and Republicans had little to disagree on. Writing in his diary in 1994, the journalist Alexander Cockburn noted that, “on issue after issue – welfare, military spending, crimes – they’re all in sync, which is why… the right have to invent or recycle all the personal gossip about Clinton to show there’s a devil in the White House rather than someone who’s basically doing what they want”.

Culture war and political tribalism were merely the surface noise masking deeper harmonies between left and right within the American system. As Christopher Hitchens put it, the parties resembled “two cosily fused buttocks of the same giant derrière”.

The new populism was not just a reaction to the dislocating effects of globalisation. It was also a response to what Hobsbawm, writing in the New Statesman, called “lost horizons”. The socialist left had been destroyed and with it the ability of societies to imagine progressive alternatives that would fulfil the revolutionary promise of liberty, equality and fraternity.


Similarly, on the international scene, Kantian dreams of bringing peace and justice to bear on the Earth – expounded by liberal thinkers such as Jürgen Habermas and John Rawls – proved to be nothing more than chimeras. The Rwandan and Bosnian genocides, in 1994 and 1995 respectively, are only the best-known slaughters in a decade that brought civil wars in Sri Lanka, Algeria and Liberia, wars in the Congo, massacres in Indonesia, coups in Thailand and Haiti, and other conflicts around the world.

But the lost horizons Hobsbawm lamented were manifest in another sense, as people increasingly fled the commons and withdrew into private worlds – private health insurance; private schools; private pensions; private computers – and gated communities. The novelist JG Ballard’s Cocaine Nights (1996) – a story set in the Spanish resort of Estrella de Mar – captured the retreat into affectless realms of “entropic drift” and private security. As one of the protagonists puts it, “we are moving into the age of security and grilles and defensive space. As for living, our surveillance cameras can do that for us. People are locking their doors and switching off their nervous systems.”

To overcome the boredom of their lives, the community of Estrella de Mar was “Valiumed out of its mind”. The age of political and social disengagement – what Robert Putnam termed “bowling alone” (1995) – was not limited to physical flight behind barbed-wire walls, telesurveillance and VR headsets. As in the 1960s, drug-taking was also a way to find liberation from the dead hand of corporate culture and the sense of historical finitude.

Sales of prescription drugs such as Zoloft, Ritalin and Prozac also flourished. In England, antidepressant prescriptions rose from nine million a year in 1991 to 24 million in 2001. Prozac was especially popular – a Newsweek article published in 1994 said it had attained “the familiarity of Kleenex and the social status of spring water” – and achieved a certain trendiness after the publication of Elizabeth Wurtzel’s bestselling memoir Prozac Nation (1994).

Drugs were also a prominent feature of rave culture. Clubs and outdoor festivals became havens of sonic rapture and chemical intoxication. As Jeremy Deller documented in his film Everybody in the Place (2019), raves were “nothing less than a death ritual to mark the transition of Britain from an industrial to a service economy”. 

***

The progressive left’s total defeat in 1989 did not preclude acts of political defiance. On 30 November 1999, tens of thousands protested against the World Trade Organisation (WTO), which was holding its ministerial conference in Seattle, Washington. The “Battle of Seattle”, as it became known, inspired progressive movements across the world, from Quebec City to Genoa and Cancún. No book captured the prevailing mood of that period among those arrayed against corporate power better than Naomi Klein’s No Logo (1999).

Klein wrote of how “it may be the torch of authoritarianism that is being carried by those determined to go global”. Super-brands such as Nike, Starbucks, McDonald’s and Tommy Hilfiger seemed to be assuming the power and responsibility of governments. As brands and commercial interests started to make incursions into the most intimate recesses of people’s lives, the dividing line between corporate domination and individual consent was no longer clear. People became willing participants in their own consumerist enslavement.

Klein offered a panorama of the “new branded world” and examined places where goods were made, especially the industrial slums of the Philippines, where workers were herded into free trade zones and obliged to work in conditions that resembled the darkest years of the Industrial Revolution. She also showed how logos had transcended the products themselves. What was emerging in the 1990s, but is now a near-universal experience, was self-branding: the commodification of the self and the soul, deliberately curated in response to the demands of the market.

***

How should we understand the 1990s? One way is through a particular metaphor. Writing in the 19th century, the French historian Alexis de Tocqueville assessed the modernising efforts of Frederick the Great in the previous century. “Beneath this completely modern head,” he concluded, “we will see a totally gothic body.”

The new modernity that many heralded at the start of the 1990s – technocratic consensus, hi-tech invention, global governance, economic uplift, international interdependency, the free movement of people, goods, services and capital – constituted the modern head, below which stood a gothic body. This was the primal scene of our present discontents – an age of reactionary populism, unbounded corporate power, economic and international instability, culture war, mass depression, surveillance capitalism and an obsession with celebrity culture (brought together in perfect unity by the launch of Channel 4’s Big Brother in 2000), as well as the illusory faith that only one economic system was compatible with modern life.

For some, it was also a time of boredom and philosophical desolation. The assumption that all political questions – about the distribution of power and resources, and the struggle for equality and justice – had been solved led to a numbing sense of pointlessness.

Writing in Marxism Today in 1998, the cultural theorist Stuart Hall described the New Labour project as “The Great Moving Nowhere Show”. This captured something of the mood of the decade: behind all the talk about change, progress, modernisation and “youthism”, and for all the disruption caused by the free market, life was depthless, static and restive. The decade was supposed to mark what Francis Fukuyama called “the end of history”. A 36-year-old US state department official, Fukuyama was unknown to the public when he published his essay “The End of History?” in 1989. But three years later, his book The End of History and the Last Man (the question mark was dropped) became the most debated work of non-fiction of the decade. After the conflicts of the 20th century, the absolute victory of liberalism over all competitors meant not just the passing of a particular period of history, but “the end of history as such: that is, the end point of mankind’s ideological evolution and the universalisation of Western liberal democracy as the final form of human government”.

But the end of history, it turned out, was not the hour of humanity’s triumph. The dominant sentiment on the part of Western elites was that the past had nothing to teach the present. But they also possessed no vision of the future beyond maintaining the new status quo. As Fukuyama himself warned, it was a “very sad time” and the “prospect of centuries of boredom at the end of history will serve to get history going again”.

On the eve of the millennium, JG Ballard noted how “everything is clean and shiny but oddly threatening”. In retrospect, the 1990s was the triumph of surface over substance, of replication not creation and of PR over probity. Nothing epitomised this more than the Millennium Dome, a super-totem to elite superficiality and focus group politics; an edifice without function beyond providing a space for corporate sponsorship.

For Tom Nairn, the dome resembled the Austro-Hungarian empire of Robert Musil’s novel The Man Without Qualities (1930-43). Musil depicted a Vienna preparing, on the eve of the First World War, for a grand celebration of its empire, unaware of its impending demise. Inside the dome on 31 December 1999, dignitaries from the worlds of politics, business and media crossed arms and sang “Auld Lang Syne”, lost in the spectacle of a world created in their image. Outside, expectations of a global techno-meltdown caused by the millennium bug gripped the social imagination. “Nothing happened on the millennium night,” recalled the Italian philosopher Franco Berardi some years later, “but the global psyche teetered on the brink of an abyss.” Twenty years on, we are gripped by similar fears of the abyss – far from the decade passing into obsolescence, we are all still living in the 1990s, trapped at the end of history.


This article appears in the 24 Mar 2021 issue of the New Statesman, Spring special 2021