28 August 2024

How to be human in an age of AI

New technologies cannot replace the pleasure and self-expression of living.

By Ed Smith

The metaphor of the mirror, which gives this book its title and defining analogy, is designed to show that AI remains miles away from being anything like real human intelligence. By attending to that gap, between reality and reflection, we can find essential clues about what makes us human. Shannon Vallor, a professor in the ethics of data and AI at the University of Edinburgh, invites us to consider the image that appears in the bathroom mirror every morning:

“The body in the mirror is not a second body taking up space in the world with you. It is not a copy of your body. It is not even a pale imitation of your body. The mirror-body is not a body at all. A mirror produces a reflection of your body. Reflections are not bodies. They are their own kind of thing. By the same token, today’s AI systems trained on human thought and behaviour are not minds. They are their own new kind of thing – something like a mirror.”

It’s a useful metaphor, because it’s easy to mistake clever computational tools – which can rapidly scour enormous data sets, discern patterns and then make projections – for true intelligence. Conflating the two both underplays the complexity of the human mind and encourages the creeping misunderstanding that we are getting close to replicating it.

Worse still, in our fascination with the AI mirror, we risk becoming intoxicated not by an exciting new kind of intelligence but by an incomplete image of ourselves. Vallor recounts the tale of Narcissus, who becomes entranced and imprisoned by his own reflection in the water, as told in Ovid’s Metamorphoses: “He is seized by the vision of his reflected form. He loves a bodiless dream. He thinks that a body, that is only a shadow.”

Today’s AI tools rely on assimilating and aggregating data from the past, which can then be projected forward in the form of probabilities about the future. But is a collection of data anything like complete reality? Vallor says not. “A world is an open-ended, dynamic, and infinitely complex thing,” she concludes. “A data set, even the entire corpus of the internet, is not a world. It’s a flattened, selective digital record of measurements that humans have taken of the world at some point in the past.” The difference between a data set and the real world is like the difference between what Narcissus thought he saw and what he actually found.


A paradox of AI is that it sounds progressive but is really conservative. AI can’t easily account for the possibility that someone – a genius, for example – might recast the whole nature of a discipline in an unexpected way. Because it sees only patterns, AI is skewed against “anti-patterns”. And as Vallor correctly points out, we have an old and inspiring concept for an anti-pattern within the story of an individual life: an epiphany.

This is at the heart of why AI feels so unsettling. It stirs anxieties about the imminent decline of free will: what if AI, superficially innovative and progressive, ends up hitching us to a digital reinvention of the medieval Wheel of Fortune? “When we do catch sight of our past in the AI mirror,” Vallor warns, “it is essential that we do not mistake those patterns for destiny.”

Vallor’s theme is only partly AI; it is also our much wider vulnerability to living badly with new technologies and the resulting disconnection from our humanity. As tech has become big business, the human-machine imbalance has intersected with the wider conflation of wealth and value. Big Tech, which is powered by frictionless payments and data donations by the unsuspecting masses, has become a classic tale of savvy corporatism fleecing people by selling them things they don’t need. (“Down here it’s just winners and losers,” in Bruce Springsteen’s phrase in the song “Atlantic City”. “Don’t get caught on the wrong side of that line.”)

Vallor is surely right to place AI within the context of our unhappy relationship with digital “progress”. But, specifically with AI, I think there may be less to worry about than she fears. It’s an observation rather than a criticism that her book derives more from thinking about AI than from using it – she’s a professor in a philosophy department, and that’s her job. But among people I know who have used AI tools to tackle complex, “real world” problems, none thinks AI can entirely replace human intelligence.

In the pursuit of effective decisions, first in cricket and now in football, my interest in data and AI keeps bringing me back to the (many) things that only human intelligence can do. Overall, I am less gloomy than Vallor. The real opportunity, as I see it, is how human intelligence and AI can augment each other. And the most imaginative types of human intelligence have the least to fear in that interaction.

The AI Mirror is strongest on the more general problem of technology impoverishing life by encouraging the delusion that “efficient” is a synonym for “good”. Far from it. And the growth of this fallacy is even faster than the development of AI.

In 1909, EM Forster predicted what Covid protocols would formalise in 2020: the triumph of Zoom meetings over real life. Forster’s short story “The Machine Stops” begins with a son asking to meet his mother because he wants to see her. As they are already video-conferencing across the globe over plate-sized “receivers”, the mother is baffled. “But I can see you!” she replies. “What more do you want?”

“I want to see you not through the Machine,” the son counters. The mother tries to close him down, reminding him that he “mustn’t say anything against the Machine”.

The son, close to Forster’s own voice, presses on: “You talk as if a god had made the Machine… I believe you pray to it when you are unhappy. Men made it, do not forget that… I see something like you in this plate, but I do not see you. I hear something like you through this telephone, but I do not hear you.”

The mother has accepted the technological mirror. (“There was the button that produced literature. And there were of course buttons by which she communicated with friends. The room, though it contained nothing, was in touch with all that she cared for in the world.”) The son is still searching for life. Vallor, who opens her book with a quote from Forster’s story, summarises his allegory: “We strangle in the garments we have woven.”

Almost 80 years after the publication of “The Machine Stops”, Vikram Seth’s verse novel The Golden Gate – written from the front line of Silicon Valley just as it was tightening its stranglehold – anticipates tech morphing into a kind of deity:

Thus files take precedence over friends,
Labour is lauded, leisure riven.
John kneels bareheaded and unshod 
Before the Chip, a jealous god.

These days, we tend to locate a fightback against tech in making new laws and rules about the digital domain. But there is another way of looking at the same challenge: we could think harder about making the real world more alluring, and being more openly confident about what makes us human. In the pre-internet world that I grew up in, I hoped to find brilliant people in interesting rooms set in beautiful places. Now, given our collective retreat into digital narcissism and dissolving texts, the concept of place as the essential context for excitement is at risk of extinction. There is no “there” there any more.

So one paradoxical response to the AI mirrors Vallor describes is to think urgently about improving the physical world around us. Doubtless a clever AI engine could explore the correlation more deeply, but for some unfortunate university students today, incarcerated beside city ring-roads in loveless, lumpen Lego-block halls of residence, is it any wonder that AI-generated essays prove so tempting? If you don’t invest care in people, they don’t care back.

As I’ve thought more about digital technology in recent years, the metaphor of a renaissance has taken a stronger hold on me. And it does again now. I finished reading The AI Mirror in Vicenza, overlooking Palladio’s famous basilica (a cathedral of kinds, but without the religious bits). Surveying the thrilling public space as a whole – barring some inevitable iPhone snaps (mine included) – I noticed, instead of the common spectacle of over-injected faces scrunched up over social media apps, a high proportion of natural-looking people enjoying walking, talking and existing in the real world. As you would, of course. It’s a place set up for people.

Without careful direction, the digital world is dull, flat and sad. Twitter/X will never be our public square. It has its value and utility (I use it). But a Twitter handle isn’t a person, any more than a tiny glass screen is a basilica. Smartphones – which distract our eyes and halve our dexterity – are the most unsexy things imaginable.

Which is why I’m not convinced that the occasionally bossy moments in this wise and humane book strike the most effective tone. Our best means of countering today’s techno-vacuums might be pleasure and self-expression. Their feeble digital reflection is not a poor relation but a pitiable sideshow. Being online unnecessarily, far from being a brave new dawn, should be recast as self-imprisonment – a voluntary Dark Ages.

An important part of living better is having more fun – including by taking an agreeably superior and amused attitude towards anti-social technologies. In my early twenties, with the partial excuse of being a professional sportsman, I spent a lot of time in gyms. My elder sister (more widely read than me) was baffled and contemptuous. “I always feel pity for gym bunnies,” she explained. “If aliens were looking down on human gyms, they’d assume we’d imprisoned the poor victims and forced them to power our electricity on those boring, cruel machines.” To extend her dystopia: those aliens really got us with the smartphones.

Gyms aren’t real sport, glass screens aren’t real life, and AI isn’t real intelligence. And it’s the unpredictable brilliance of the real thing that we should trust. The bleak vortex of collective digital dependency goes like this: by living worse (spending more time online) we provide ever greater food to bulk up the self-learning algorithms (thanks to the data we carelessly scatter around) while simultaneously neglecting the real world that we are slowly leaving behind. People falling in love with AI chatbots represent just the absurd fringe of a general phenomenon that almost all of us, to a greater or lesser extent, are much too indulgent about. In her final chapter, Vallor explores the virtues of regulation and compares the future of AI to the regulatory advances in late-20th-century commercial aviation. But instead of “we need a more responsible regulatory framework”, there is the simpler and more fundamental rallying cry that might prove our best line of defence against tech hegemony: “Get a life – it’s more fun!”

Ed Smith is director of the Institute of Sports Humanities and the author of “Making Decisions” (William Collins)

The AI Mirror: How to Reclaim Our Humanity in an Age of Machine Thinking
Shannon Vallor
Oxford University Press, 272pp, £22.99




This article appears in the 28 Aug 2024 issue of the New Statesman, Trump in turmoil