6 September 2019

If AI is going to have any future in health care, it’s going to have to get a lot smarter

I suspect it will gradually dawn on people quite how hard it would be to replicate what a doctor does.

By Phil Whitaker

For the past 40 years, the UK has been a world leader in developing patient-centred models of medical care. The paternalistic doctor of the postwar era has gradually given way to an expert collaborator – someone who expects patients to have their own ideas, concerns and expectations (ICE) about their health, and seeks to incorporate these, along with the realities of medical practice, into a shared understanding of the problem and what to do about it.

Some patients are upfront about their ICE, but many are reticent about articulating them. They may worry that the doctor will think them stupid, or that they'll cause offence by seeming to tell a professional how to do their job.

There are various ways of drawing these issues out. Prior to consulting a doctor, most people will already have sought multiple opinions from elsewhere – Google, of course; also family and friends, colleagues, newspapers and magazines. Questions such as: “What have you been reading about this kind of problem?”, or “What does your husband/mum say about it all?” can open up fruitful conversations. By letting people know we expect them to have their own model of what’s going on with their health, we give them permission to disclose it.

Some patients aren’t even consciously aware of their ICE. Asking: “Do you know anyone who’s had anything similar?” can bring out the story of the dimly remembered aunt whose brain tumour presented with headaches, or the workmate who was dead six weeks after his backache turned out to be lung cancer. The lid lifted, the connections between their symptoms and subconscious fears can begin to be made.

ICE matter. If a patient can explain their own model, it dramatically increases their confidence in the care they’re receiving. And from a doctor’s point of view, understanding a patient’s ICE is vital to making correct diagnoses, as well as to tailoring management to the individual.

Against this background, the current vogue for “artificial intelligence” in health seems distinctly retrograde. Apps such as Babylon Health’s GP at Hand recreate a bygone era where a somewhat remote-seeming doctor would subject the patient to a barrage of narrow questions about their symptoms, at the end of which they would pronounce their opinion as to what the trouble was and what to do about it.

If AI is to have any future in health care, it’s going to have to get a lot smarter. Algorithms will have to be programmed to draw out patients’ ICE, correctly interpret how they affect the picture being presented, and incorporate them into a mutually acceptable way to proceed.


And that only covers the words spoken. Most of what we communicate as human beings comes across non-verbally. Our body language – how we sit, what our hands are doing as we talk, where our gaze rests – all underscore what we’re saying. Forget face recognition technology: AI is going to have to be able to analyse facial expression; to detect the rimming of tear fluid in an eye, for example, that speaks of a painful memory having been stirred. And often, the most important communication comes through what is not said – the question not answered, the jump to an apparently unconnected topic. Lots for the coders to get to grips with there too.

Then there are the subtle physical signs. It's easy enough to plug someone's pulse and temperature into an algorithm, but what about equipping the AI of the future to detect the shuffling gait of Parkinson's, the pallor of anaemia, the grunting respiration of the baby with pneumonia?

Will 2019 see artificial intelligence beginning to replace doctors? No doubt, spurred by the Labrador-like enthusiasm of Matt Hancock, the Secretary of State for Health, the app developers will be putting in a determined effort. But sometime between now and 2090, I suspect it will gradually dawn on people quite how hard it would be to replicate what a doctor does. Then the NHS might start investing properly in its human resources again. 
