1.
Martin Bromiley is a modest man with an immodest ambition: to change the way medicine is practised in the UK.
I first met him in a Birmingham hotel, at a meeting of the Clinical Human Factors Group, or CHFG. Hospital chief executives, senior surgeons, experienced nurses and influential medical researchers met, debated and mingled. Keynote speakers included the former chief medical officer for England, Sir Liam Donaldson. In the corridors and meeting rooms, rising above the NHS jargon and acronyms and low-level grumbling about government reforms, there floated a tangible sense of purpose and optimism. This was a meeting of believers.
A slow transformation in the way health care works is finally gaining traction. So far, it has gone largely unnoticed by the media or the public because it hasn’t been the result of government edict or executive order. But as Suren Arul, a consultant paediatric surgeon at Birmingham Children’s Hospital, put it to me: “We are undergoing a quiet revolution and Martin Bromiley will, one day, be recognised as the man who showed us the way.”
Although I knew whom to look for, Bromiley was hard to spot at first. He wasn’t on stage, and he didn’t address the full conference. He was, I discovered, sitting at a table at the edge of the hall, in the suburbs of the meeting. You would hardly have guessed that the CHFG was a group he’d founded, or that everyone at the meeting that day was there because of him. Bromiley doesn’t fit with our preconceived ideas of a natural leader. He speaks with a soft voice. He doesn’t command your attention, though you find yourself giving it.
Neither is he a doctor, or a health professional of any kind. Bromiley is an airline pilot. He is also a family man, with a terrible story to tell.
2.
Early on the morning of 29 March 2005, Martin Bromiley kissed his wife goodbye. Along with their two children, Victoria, then six, and Adam, five, he waved as she was wheeled into the operating theatre and she waved back.
Over Christmas, Elaine had suffered a swelling of her face, connected to sinus problems that had troubled her for years. She was advised by a consultant that the only way to deal with the problem once and for all was to undergo a minor operation to straighten the inside of her nose. Bromiley knew of colleagues who had undergone the operation – the sinuses of pilots take a beating from sharp changes in air pressure – so he didn’t feel overly concerned, that morning, as he drove Victoria and Adam back to the family home in a peaceful Buckinghamshire village.
At about 11 that same morning he received a call from the ear, nose and throat consultant. “Elaine isn’t waking up properly from the anaesthetic,” said the doctor. “Can you come back in?” At the hospital, Bromiley was met by the consultant, who explained that there had been a problem keeping Elaine’s airway open after she had been anaesthetised, and that her oxygen levels had fallen dangerously low. A decision had been taken to move her to the intensive-care unit.
Grasping for medical knowledge from half-remembered episodes of Casualty, Bromiley asked if the doctors had attempted a tracheotomy – a cut to the throat to allow air in. They explained that the safer option had been to let her wake up naturally. He made his way to the intensive-care unit. When he got there, the first person to approach him was the consultant anaesthetist, who, without saying anything, gave him a hug. Bromiley found himself trying to console him. “I said, ‘I know these things happen.’ ”
He took a seat and waited for news. After ten minutes, two doctors emerged and took seats opposite him. In sombre tones, they told Bromiley that Elaine had been without oxygen for a long period of time and, as a consequence, had suffered severe brain damage. He could hardly process what they were saying. “I just thought, ‘Fuck. What? How?’ I was stunned. My whole world changed.”
An hour later, Bromiley was allowed to see his wife. “She didn’t look any different,” he told me. But she was different. After finally stabilising her oxygen levels, the doctors had put her into a coma to prevent her brain from swelling to the point where it crushed itself against the top of her spine.
It soon became apparent that it was a coma from which she would never recover. Days later, after a series of discussions with the doctors, he consented to having her life support switched off. The doctors were surprised at the strength of her heart, which continued to beat for another week until, on 11 April 2005, Elaine Bromiley died.
3.
How could this have happened? When he surfaced from the shock, that was the question to which Bromiley wanted an answer. At first, he accepted the word of the ENT consultant, who told him that the doctors had made all the right decisions but had simply come up against an emergency for which nobody could have planned: the exceptional difficulty of getting a tube down Elaine’s throat.
Still, he assumed that the next step would be an investigation – standard practice in the airline industry after every accident. “You get an independent team in. You investigate. You learn.” When he asked the head of the intensive-care unit about this, the doctor shook his head. “That’s not how we do things in the health service. Not unless somebody complains or sues.”
This doctor was privately sympathetic to Bromiley’s question, however. Shortly after Elaine’s death, he got in touch with Bromiley to say that he had asked a friend of his, Professor Michael Harmer, an eminent anaesthetist, if he would be prepared to lead an investigation. Harmer had said yes. After Bromiley gained the hospital’s consent, Harmer set to work, interviewing everyone involved, from the consultants to the nursing team.
In July that year, he submitted his report. As Bromiley read it, his mind went back to one of the last nights he had spent in the hospital during his wife’s coma, and to something the duty nurse had said to him: “It’s terrible. I can’t believe that happened.” With hindsight, that was a hint.
Harmer’s minute-by-minute narrative of the operation revealed a different story from the one Bromiley had heard when he spoke with the ENT surgeon. The truth was that Elaine had died at the hands of highly accomplished, technically proficient doctors with 60 years of experience between them, in a fine, well-equipped modern hospital, because of a simple error.
4.
Doctors make mistakes. A woman undergoing surgery for an ectopic pregnancy had the wrong tube removed, rendering her infertile. Another had her Fallopian tube removed instead of her appendix. A cardiac operation was performed on the wrong patient. Some 69 patients left surgery with needles, swabs or, in one case, a glove left inside them. These are just some of the incidents that occurred in English hospitals in the six months between April and September 2013.
Naturally, we respect and admire doctors. We believe that health care is scientific. We think of hospitals as places of safety. For all these reasons, it comes as something of a shock to realise that errors still play such a significant role in whether we leave a hospital better or worse, alive or dead.
The National Audit Office estimates that there may be 34,000 deaths annually as a result of patient safety incidents. When he was chief medical officer, Liam Donaldson warned that the chances of dying as a result of a clinical error in hospital are 33,000 times higher than the chances of dying in an air crash. This isn’t a problem peculiar to our health-care system. In the United States, medical error is estimated to be the third most common cause of death, after cancer and heart disease. Globally, there is a one-in-ten chance that, owing to preventable mistakes or oversights, a patient will leave a hospital in a worse state than when she entered it.
There are other industries where mistakes carry grave consequences, but the mistakes of doctors carry a particular moral charge because their job is to make us better, and we place infinite trust in the expectation that they will do so. When you think about it, it’s extraordinary we’re prepared to give a virtual stranger permission to cut us open with a knife and rearrange our insides as we sleep.
Perhaps because of the almost superstitious faith we need to place in surgeons, we hate to think of them as fallible; to think that they perform worse when they are tired, or that some are much better at the job than others, or that hands can slip because of nerves, or that bad decisions get taken because of overconfidence, or stress, or poor communication. But all of these things happen, because doctors are human.
5.
Within two minutes of Elaine Bromiley’s operation beginning, the consultant anaesthetist realised that the patient’s airway had collapsed, blocking her supply of oxygen. After repeatedly trying and failing to ventilate her, he issued a call for help. An ENT surgeon answered it, as did another senior anaesthetist. The three consultants struggled to get a tube down Elaine’s throat, a procedure known as intubation, but encountered a mysterious blockage. So they tried again.
“Can’t ventilate, can’t intubate” is a recognised emergency in anaesthetic practice, for which there are published guidelines. The first instruction in one version of the guidelines is this: “Do not waste time trying to intubate when the priority is oxygenation.” Deprived of oxygen, our brains soon find it hard to function, our hearts to beat: ten minutes is about the longest we can suffer such a shortage before irreversible damage is done. The recommended solution is to carry out a form of tracheotomy, puncturing the windpipe to allow air in. Do not waste time trying to intubate.
Twenty minutes after Elaine’s airway collapsed, the doctors were still trying to get a tube down her throat. The monitors indicated that her brain was starved of oxygen and her heart had slowed to a dangerously low rate. Her face was blue. Her arms periodically shot up to her face, a sign that her brain tissue was being irritated. Yet the doctors ploughed on. After 25 minutes, they finally intubated their patient. But by then it was too late for Elaine.
If the severity of Elaine’s condition in those crucial minutes wasn’t registered by the doctors, it was noticed by others in the room. The nurses saw Elaine’s erratic breathing; the blueness of her face; the swings in her blood pressure; the lowness of her oxygen levels and the convulsions of her body. They later said that they had been surprised when the doctors didn’t attempt to gain access to the trachea, but felt unable to broach the subject. Not directly, anyway: one nurse located a tracheotomy set and presented it to the doctors, who didn’t even acknowledge her. Another nurse phoned the intensive-care unit and told them to prepare a bed immediately. When she informed the doctors of her action they looked at her, she said later, as if she was overreacting.
Reading this, you may be incredulous and angry that the doctors could have been so stupid, or so careless. But when the person closest to this event, Martin Bromiley, read Harmer’s report, he responded very differently. His main sensation wasn’t shock, or fury. It was recognition.
6.
Shortly after 5pm on the clear-skied evening of 28 December 1978, United Airlines Flight 173 began its descent to Portland International Airport. The plane had taken off from New York that morning and, after making a pre-scheduled stop in Denver, it was reaching its final destination with 189 souls on board.
As the landing gear was lowered there was a loud thump and the aircraft yawed slightly to the right. The flight crew noticed that one of the green landing gear indicator lights wasn’t lit. The captain radioed air-traffic control at Portland, telling them, “We’ve got a gear problem.”
Portland’s controllers agreed that the plane would orbit the airport while the captain, first officer and flight engineer worked out what to do. The passengers were told that there would be a delay. The cabin crew began to carry out checks, and the flight attendants were instructed to check the visual indicators on the wings, which suggested that the landing gear was locked down.
Nearly half an hour after the captain told Portland about the landing gear problem, he contacted the United Airlines maintenance centre, informing the staff there that he intended to continue the holding pattern for another 15 or 20 minutes. He reported 7,000lbs of fuel aboard, down from 13,000 when he had first spoken to Portland.
United’s controller sounded a mild note of concern. “You estimate that you’ll make a landing about five minutes past the hour. Is that OK?” The captain’s response was ostentatiously relaxed: “Yeah, that’s a good ball park. I’m not gonna hurry the girls [the cabin crew].” United 173 had 30 minutes of fuel left.
The captain and his two officers continued to debate the question of whether the landing gear was down. The captain asked his crew how much fuel they would have left after another 15 minutes of flying. The flight engineer responded, “Not enough. Fifteen minutes is gonna – really run us low on fuel here.” At 18.07, one of the plane’s engines lost power. Six minutes later, the flight engineer reported that two more engines had gone. The captain, as if waking up to the situation for the first time, said: “They’re all going. We can’t make Troutdale [a small airport on the approach route to Portland].” “We can’t make anything,” said the first officer. At 18.13, the first officer sent the plane’s final message to air-traffic control: “We’re going down. We’re not going to be able to make the airport.”
7.
This story of United 173 is known to every airline pilot, because it is studied by every trainee. To the great credit of the aviation industry, it became one of the most influential disasters in history. Galvanised by it and a handful of other crashes from the same era, the industry transformed its training and safety practices, instituting a set of principles and procedures known as CRM: crew resource management.
It worked. Although we usually notice only the high-profile exceptions, crashes are at the lowest level they have ever been, and flying is now one of the safest ways you can spend your time. As they are fond of saying in aviation, these days the most dangerous part of a flight is the journey to the airport.
CRM was born of the realisation that, by the late 20th century, the most frequent cause of crashes was not technical failure but human error. Its roots go back to the Second World War, when the US army assigned a psychologist called Alphonse Chapanis to investigate a curious phenomenon. B-17 bombers kept crashing on to the runway on landing, even though there was no apparent mechanical problem with the planes. Rather than blaming the pilots, Chapanis pointed to the instrument panel. The lever that controlled the landing gear and the lever that operated the flaps were next to each other. Pilots, weary after long flights, were confusing the two, retracting the wheels and causing the crash. Chapanis suggested attaching a wheel to the handle of the landing-gear lever and a triangle to the flaps lever, making each easily distinguishable by touch alone. Problem solved.
Chapanis had recognised that human beings’ propensity to make mistakes when they are tired is much harder to fix than the design of levers. His deeper insight was that people have limits, and many of their mistakes are predictable effects of those limits. That is why the architects of CRM defined its aim as the reduction of human error, rather than pilot error. Rather than trying to hire or train perfect pilots, it is better to design systems that minimise or mitigate inevitable human mistakes.
In the 1990s, a cognitive psychologist called James Reason turned this principle into a theory of how accidents happen in large organisations. When a space shuttle crashes or an oil tanker leaks, our instinct is to look for a single, “root” cause. This often leads us to the operator: the person who triggered the disaster by pulling the wrong lever or entering the wrong line of code. But the operator is at the end of a long chain of decisions, some of them taken that day, some taken long in the past, all contributing to the accident; like achievements, accidents are a team effort. Reason proposed a “Swiss cheese” model: accidents happen when a concatenation of factors occurs in unpredictable ways, like the holes in stacked slices of Swiss cheese lining up.
James Reason’s underlying message was that because human beings are fallible and will always make operational mistakes, it is the responsibility of managers to ensure that those mistakes are anticipated, planned for and learned from. Without seeking to do away altogether with the notion of culpability, he shifted the emphasis from the flaws of individuals to flaws in organisation, from the person to the environment, and from blame to learning.
The science of “human factors” now permeates the aviation industry. It includes a sophisticated understanding of the kinds of mistakes that even experts make under stress. So when Martin Bromiley read the Harmer report, an incomprehensible event suddenly made sense to him. “I thought, this is classic human factors stuff. Fixation error, time perception, hierarchy.”
8.
It’s a miracle that only ten people were killed after Flight 173 crashed into an area of woodland in suburban Portland; but the crash needn’t have happened at all. Had the captain attempted to land, the plane would have touched down safely: the subsequent investigation found that the landing gear had been down the whole time. But the captain and officers of Flight 173 became so engrossed in one puzzle that they became blind to the more urgent problem: fuel shortage. This is called “fixation error”. In a crisis, the brain’s perceptual field narrows and shortens. We become seized by a tremendous compulsion to fix on the problem we think we can solve, and quickly lose awareness of almost everything else. It’s an affliction to which even the most skilled and experienced professionals are prone.
Imagine a stalled car, stuck on a level crossing as a distant train bears down on it. Panic rising, the driver starts and restarts the engine rather than getting out of the car and running. The three doctors bent over Elaine Bromiley’s throat were intent on finding a way to intubate, just as the three pilots in the cockpit of United 173 were determined to establish the status of the landing gear. In neither case did these seasoned professionals look up and register the oncoming train: in the case of Elaine, her oxygen levels, and in the case of United 173, its fuel levels.
When people are fixating, their perception of time becomes highly erratic; minutes stretch and elongate. One of the most striking aspects of the transcript of United 173’s last minutes is the way the captain seems to be under the impression that he has plenty of time, right up until the moment the engines cut out. It’s not that he didn’t have the correct information; it’s that his brain was running to a different clock. Similarly, it’s not that the doctors weren’t aware that Elaine Bromiley’s oxygen supply was a problem; it’s that their sense of how long she had been without it was distorted. When Harmer interviewed him, the anaesthetic consultant confessed that he had no idea how much time had passed.
Imagine, for a moment, being one of those doctors. You have a patient who has stopped breathing. The clock is ticking. The standard procedure isn’t working, but you have employed it dozens of times before and you know it works. Each of the senior colleagues around you is experiencing the same difficulty, which reassures you. You cling to the belief that, between the three of you, you will solve the problem, if it is soluble at all. You vaguely register nurses coming into the room and saying things but you don’t really hear what they say. Perhaps it occurs to you to step back from the patient and demand a rethink, but you don’t want your peers to see you as panicky or naive. So you focus on the one thing you can control: the procedure. You repeat it over and over, hoping for a different result. It is madness, but it is comprehensible madness.
9.
In the months after Elaine’s death, as Bromiley tried to rebuild his family life, he couldn’t stop wondering about the difference between the way people in health care treated accidents and the way his industry dealt with them. So he would phone people in and around the National Health Service and ask them about it.
He discovered that many others – an anaesthetist in Scotland, a medical researcher in London – had been wondering the same thing. Eventually, he accumulated a long list of like-minded people, none of whom was talking to any of the others. So he booked a room in a hotel, called a meeting and invited them all, along with experts from other industries and academics, including James Reason. Everyone agreed that when it came to safety, health care was languishing in the Dark Ages. Hospitals more or less pretended that mistakes didn’t happen, failed to learn from them and, as a result, repeated them. If we don’t like to think that doctors make mistakes, doctors like to think about it even less.
One of the biggest problems identified was the unwritten but entrenched hierarchy of hospitals. Bromiley, who has worked with experts from various “safety-critical” industries, including the military, told me that the hospital is by far the most hierarchical workplace he has come across. At the top of the tree are consultant surgeons, the rock stars of the hospital corridors: highly driven, competitive, mostly male and not the kind who enjoy confessing to uncertainty. Then come anaesthetists, often quieter of disposition and warier of risk. Further down are nurses, valued for their hard work but not for their brains.
A key principle of human factors is that it is the unspoken rules of who can say what and when that often lead to crucial things going unsaid. The most painful part of the transcript of Flight 173’s final hour is the flight engineer’s interjections. You can sense his concern about the fuel situation, and his hesitancy about expressing it. Fifteen minutes is gonna – really run us low on fuel here. Perhaps he’s assuming the captain and his officers know the urgency of their predicament. Perhaps he’s worried about being seen to speak out of turn. Whatever it is, he doesn’t say what he feels: This is an emergency. We need to get this plane on the ground – NOW. Similarly, the nurses who could see the urgency of Elaine Bromiley’s condition didn’t feel able to tell the doctors that they were on the verge of committing a grave error. So they made tentative suggestions that were easy to ignore.
John Pickles, an ENT surgeon and former medical director of Luton and Dunstable Hospital NHS Foundation Trust, told me that usually when an operation is carried out on the wrong part of the body (a class of error known as “wrong-site surgery”), there is at least one person in the room who knows or suspects a mistake is being made. He recalled the case of a patient in South Wales who had the wrong kidney removed. A (female) medical student had pointed out the impending error but the two (male) surgeons ignored her and carried on. The patient, who was 70 years old, was left with one diseased kidney, and died six weeks later. In other cases nobody spoke up at all.
The pioneers of crew resource management knew that merely warning pilots about fixation error was not sufficient. It is too powerful an instinct to be repressed entirely even when you know about it. The answer lay with the crew. Because even the most experienced captains are prone to human error, the entire aircraft crew needed to act as a collective intelligence, vigilant for problems and responsible for solutions. “It’s the people at the edge of the room, standing back from the situation, who can often see it best,” Bromiley said to me.
He recalled the case of British Midland Flight 92, which had just taken off for its flight from London to Belfast on 8 January 1989 when the pilots detected that one of the engines was failing. Believing the fault lay with the right engine, they followed procedure and shut that engine down. Over the PA, the captain explained that because of a problem with the right engine he was making an emergency landing. The cabin staff, who – like the passengers, but unlike the cockpit crew – could see smoke and flames coming from the left engine, didn’t pass this information on to the cockpit. Having shut down their only functioning engine, the pilots could not keep the plane in the air, and British Midland 92 crashed into the embankment of the M1 motorway near Kegworth in Leicestershire. Forty-seven of the 126 people on board died; 74 sustained serious injuries.
The airline industry pinpointed a major block to communication among members of the cockpit crew: the captain. The rank of captain retained the aura of imperial command it inherited from the military and from the early days of flying, when pilots such as Chuck Yeager, immortalised in Tom Wolfe’s book The Right Stuff, were celebrated as audacious mavericks. The pioneers of CRM realised that, in the age of mass air travel, charismatic heroism was precisely the wrong stuff. The industry needed team players. The captain’s aura was a force field, stopping other crew members from speaking their mind at critical moments. It wasn’t just the instrument panel that had to change: it was the culture of the cockpit.
Long before they started doing more good than harm, surgeons were revered as men of genius. In the 18th and 19th centuries, surgical superstars performed operations in packed amphitheatres before hushed, admiring audiences. A great surgeon was a virtuoso performer with the hands of a god. His nurses and assistants were present merely to follow the great man’s commands, much as the planets in an orrery revolve around the sun. The advent of medical science gave this myth a grounding in reality: at least we can be confident that doctors today make people better, most of the time. But it reinforced a mystique that makes doctors, and especially surgeons (who, of course, still perform in operating theatres), hard to question, by either patients or staff.
Better safety involves bringing doctors off their pedestal or, rather, inviting them to step down from it. Modern medicine is more reliant than ever on teamwork. As operations become more complex, more people and procedures are involved. Operating rooms swarm with people; various specialists pronounce judgement or perform procedures, and then leave. Surgical teams are often made up of individuals who know each other only vaguely, if at all. It is a simple but unavoidable truth that the more people are involved in something, and the less well they know each other, the more likely it is that someone will make an error.
The most significant human factors innovation in health care in recent years is surprisingly prosaic: the checklist. Borrowed from the airline industry, the checklist is a standardised list of procedures to follow for every operation, and for every eventuality. Checklists compensate for the inbuilt tendency of human beings under stress to forget or ignore what is important, including the most basic things (the first item on one aviation checklist is FLY THE AIRPLANE). They also empower the people at the edges of the room: before the operation and at key moments during it, the whole team goes through each point in turn, including emergencies, which gives a cue to more reserved members of the team to speak up.
Checklists are most effective in an atmosphere of informality and openness: it has been shown that simply using the first name of the other team members improves communication, and that giving people a chance to say something at the beginning of a case makes them more likely to speak up during the operation itself.
Naturally, this spirit of openness entails a diminishment of the surgeon’s power – or a dispersal of that power around the team. Some doctors don’t mind this – indeed, they welcome it, because they realise that their team can save them from career-ruining mistakes. Others are more resistant, particularly those who treasure their independence; mavericks don’t do checklists. Even those who see themselves as evolved team players may overestimate their openness. J Bryan Sexton, a psychologist at Johns Hopkins University in the US, has conducted global surveys of operating-room staff. He found that while 64 per cent of surgeons rated their operations as having high levels of teamwork, only 28 per cent of nurses agreed.
The lessons of human factors go far beyond the status of surgeons. From his earliest conversations with insiders, Bromiley realised that the NHS needed to undergo a profound cultural change if it was to reach the level of the aviation industry in terms of safety. Hospitals gave little or no thought to how their teams functioned. Doctors underestimated the effects of tiredness on their own performance. Medical schools taught doctors that technical excellence trumped everything else and spent little or no time teaching communication or team management skills. Specialists saw their job as fixing parts of the body, rather than helping a person (at this year’s Clinical Human Factors Group conference, Peter Jaye, a consultant surgeon at Guy’s Hospital in London, remarked: “At medical school I was trained to think of a patient as a pair of kidneys”). There was little or no data on which hospitals and doctors were making mistakes, and therefore which required the most urgent improvement.
Safety risks were routinely misperceived. The “can’t ventilate, can’t intubate” emergency occurs roughly once in every 20,000 cases, a frequency anaesthetists consider a remote possibility. Yet as Bromiley told me: “In aviation, when we find out there’s a one-in-a-million chance of an engine failing, we worry. To me, one in 20,000 means a regular occurrence.”
As James Reason showed, mistakes arise out of coincidence. Suren Arul, the consultant paediatric surgeon in Birmingham, told me that “when mistakes happen, it’s almost never one person’s fault. It’s usually a whole series of things, some of them tiny.” Bromiley asks hospital boards to consider their procurement of marker pens, used to mark the part of the body about to be operated on (best practice is for the surgeon to make the mark and to sign it with his initials). “I tell them, ‘I understand your need to cut budgets. But do you realise that because you didn’t buy those marker pens you’ve just trebled your likelihood of having a wrong-site surgery case?’ ”
There is now greater awareness of the complexity of safety than ever. The body that Bromiley founded, the Clinical Human Factors Group, has no official status within the NHS, but its influence has been felt right across that sprawling and multifarious institution. At the meeting I attended, everyone to whom I spoke seemed to believe that things are moving, albeit too slowly, in the right direction.
10.
It was a trauma situation: an 18-month-old baby boy had fallen down the steps at Euston Square station, smashing his head and injuring his leg. Through a one-way mirror, I watched as a young doctor entered the operating room and was greeted by the baby’s distressed mother. The woman sitting next to me turned to her team. “Why don’t you play the orthopaedic surgeon this time, Dave?” she said. “Clare, can you be the anaesthetic consultant?”
In recent years, simulations have started to become part of the training of doctors. Sarah Chieveley-Williams, the consultant anaesthetist who is director of clinical simulation at University College London Hospitals (UCLH), had invited me to watch junior paediatricians being put through their paces. The room before us was a near-perfect replica of an operating theatre, with an anaesthesia machine, various equipment, monitors and a stock of medication. The dummy baby had pupils that dilate, a heartbeat and a noisy cry.
Chieveley-Williams and her team were interested in the doctor’s ability to identify the right priorities: first, stabilise the baby’s condition by anaesthetising it; second, get a neurological consultant to look at its head injuries, in order to prevent or minimise brain damage. On our side of the glass, one of the team sat by a computer from where she could manipulate the baby’s vital signs. She slowed down its heart rate, ratcheting up the urgency of the situation.
Chieveley-Williams turned to another colleague: “Dave, see if you can get her fixated on the leg.” Dave left us and a moment later reappeared in the operating room wearing a white coat. After examining the patient, he proposed, with the air of someone used to being agreed with, that an X-ray be taken of the injured leg. Chieveley-Williams watched intently. In a quiet but firm voice, the young doctor said, “Right now, the priority is his head injury. The leg will have to wait.”
Chieveley-Williams turned to me with a grin. “That told him,” she said.
Over the next 20 minutes, a succession of people entered and left the room. Specialists were summoned, medications ordered and procedures arranged. At times the impression was one of near-chaos. A trauma incident, Chieveley-Williams explained to me, presents an acute management challenge, as well as a medical one. Because it often involves injuries to different parts of the body, many specialists come into the treatment room, each with his or her own agenda. “The doctor needs to establish leadership and keep everyone focused on the big picture – the patient’s health.”
In the case of Elaine Bromiley, there was too much hierarchy and too little. On the one hand, the nurses didn’t assert themselves. On the other hand, nobody was taking ultimate responsibility for the patient’s safety. As John Pickles remarks, “You had three very senior people in the room and no one in charge.”
Hierarchy, in the sense of clear leadership, is a good thing, as long as the leaders are confident enough to confess uncertainty. A common problem, Chieveley-Williams said, is young doctors being reluctant to say they don’t know what the answer is because they are so eager to project competence. A member of her team told me, “We tell them that when you’re stuck, ask everyone in the team for their view. One of them probably has the answer, but until you speak up they’ll assume you have it, too.”
These sorts of lessons weren’t being given ten years ago. Like UCLH, Great Ormond Street children’s hospital in London is at the forefront of the new thinking about patient safety, and is absorbing lessons from other industries. The transfer of patients from surgery to the intensive-care unit is a complex process that has to be accomplished at speed, and it involves several people. Unsurprisingly, it is a well-known danger zone: things get dropped, tubes are left unattached, and patients suffer. In collaboration with the human factors researcher Ken Catchpole, the hospital studied Formula 1 pit stops, learning the importance of allocating a precisely defined task to every individual on the team. Mistakes fell.
11.
Martin Bromiley has rebuilt his life. Happily remarried, he is stepdad to his second wife’s two children, as well as still dad to Victoria and Adam. He is not haunted by the tragedy of Elaine’s death but driven by it. Between flying commitments, he talks to doctors, nurses, researchers and NHS boards, connecting the like-minded; telling his story to those, whether managers or medical students, who most need to hear it. It is a heavy workload. I wondered if he was ever tempted to leave it behind, now that the CHFG has its own momentum. He shook his head. “This is a duty.”
Improving the safety of patients in health care doesn’t necessarily require spending on expensive new technologies, or complex structural reorganisation. It requires forethought, empathy, humility and a willingness to learn from mistakes. Which, after all, is a duty to those who have suffered from them. Bromiley insisted that the Harmer report be made public, as happens with air accident reports, and he chose for it an epigraph borrowed from aviation: “So that others may learn, and even more may live.” All of the medical staff involved in Elaine’s operation are back at work. That, says Bromiley, is exactly what he wanted, because they will be better clinicians for their experience, and advocates for the cause.
There are two approaches to reforming a large institution. You can impose change from outside by invoking the will, or the wrath, of the public – or you can persuade those inside to let you in and to listen to your message. Both can work. When Julie Bailey exposed the gross malpractices of staff at Stafford Hospital, she shook the health service from top to bottom. Bromiley greatly admires what Bailey has achieved, but he has taken a different path. Rather than using his story as a club – and nobody would have blamed him for doing that – he has deployed it as you would wish a surgeon to apply the knife to someone you love: with skill, subtlety and precision.
“From the moment something went wrong with Elaine, it was different, because they knew my profession,” he says. Responding to his calmness and extraordinary ability to empathise with their situation, the team rose to their best. As Elaine lay in a coma, they involved him in every decision they took, right up until the last one. It was proper teamwork, and a model for the long campaign that followed. “I’m an outsider who is also an insider,” he says.
Martin Bromiley has reminded clinicians that not everything is or should be clinical. His legacy, says Professor Jane Reid, a researcher in nursing at Queen Mary’s Hospital in south London, is “a new safety culture” in the NHS. He has no desire to take up any official position. “I’m not an expert on medical practice,” he told me. “I’m just a guy who flies planes.”
Ian Leslie is the author of Curious: The Desire to Know and Why Your Future Depends on It (Quercus, £10.99)