  Further studies on both sides of the Atlantic have revealed similar results. Investigators working for the inspector general of the Department of Health and Human Services in the United States analyzed 273 hospitalizations and found that hospitals had missed or ignored 93 percent of events that caused harm.26 A European study discovered that although 70 percent of doctors accepted that they should disclose their errors, only 32 percent actually did.27 In a different study of 800 patient records in three leading hospitals, researchers found more than 350 medical errors. How many of these mistakes were voluntarily reported by clinicians? Only 4.28

  Think back to the way Dr. Edwards talked about the incident. “Look, Martin, there were some problems during anesthesia,” he said. “It is one of those things. The anesthetists did their very best, but it just didn’t work out. It was a one-off. I am so sorry.”

  This was not an out-and-out lie. Indeed, he may even have believed what he was saying. After all, the doctors were unlucky. It is unusual for a patient to have tight jaw muscles. It is also unfortunate that Elaine had a blocked airway that was resistant to attempts at tracheal intubation. They had done their best, hadn’t they? What more is there to say?

  This kind of reasoning represents the essential anatomy of failure-denial. Self-justification, allied to a wider cultural allergy to failure, morphs into an almost insurmountable barrier to progress.*

  For many people, traumatized by the loss of a loved one, this might have been the end of the story, particularly in the UK, where doctors are rarely challenged. It is not easy for a grieving family to insist on an investigation when the experts are telling them it is not necessary.

  But Martin Bromiley wouldn’t give up. Why? Because he had spent his entire professional life in an industry with a different—and unusual—attitude to failure. He is a pilot. He had flown for commercial airlines for more than twenty years. He had even lectured on system safety. He didn’t want the lessons from a botched operation to die along with his wife.

  So he asked questions. He wrote letters. And as he discovered more about the circumstances surrounding his wife’s death, he began to suspect that it wasn’t a one-off. He realized that the mistake may have had a “signature,” a subtle pattern that, if acted upon, could save future lives.

  The doctors in charge of the operation couldn’t have known this for a simple but devastating reason: historically, health-care institutions have not routinely collected data on how accidents happen, and so cannot detect meaningful patterns, let alone learn from them.

  In aviation, on the other hand, pilots are generally open and honest about their own mistakes (crash landings, near misses). The industry has powerful, independent bodies designed to investigate crashes. Failure is not regarded as an indictment of the specific pilot who messes up, but as a precious learning opportunity for all pilots, all airlines, and all regulators.

  A quick example: in the 1940s the famous Boeing B-17 bomber was involved in a series of seemingly inexplicable runway accidents. The U.S. Army Air Forces responded by commissioning Alphonse Chapanis, a psychologist with a PhD from Yale, to undertake an investigation. By studying the crashes—their chronology, dynamics, and psychological elements—Chapanis identified poor cockpit design as a contributing factor.29

  He found that the switches controlling the flaps in B-17s were identical to the switches controlling the landing gear (the wheels) and were placed side by side. This was not a problem when the pilots were relaxed and flying conditions were perfect. But under the pressure of a difficult landing, pilots were pulling the wrong lever. Instead of retracting the flaps to reduce speed, they were retracting the wheels, causing the plane to belly flop onto the runway, with catastrophic results.

  Chapanis came up with the idea of changing the shape of the levers so that they resembled the equipment they were linked to. A small rubber wheel was attached to the landing-gear switch and a small flap shape to the flaps control. The levers now had an intuitive meaning, easily identified under pressure. What happened? Accidents of this kind disappeared overnight.30

  This method of learning from mistakes has been applied to commercial aviation now for many decades, with remarkable results.

  Success in aviation has many components, of course. The speed of technological change has helped as has the fact that airlines, worried about reputational damage, competition from other providers, and insurance costs, have a strong commercial incentive to improve safety. Aviation has also benefited from the use of high-resolution simulators and effective training, as we’ll see.

  However, the most powerful engine of progress is to be found deep within the culture of the industry. It is an attitude that is easy to state, but whose wider application could revolutionize our attitude to progress: instead of denying failure, or spinning it, aviation learns from failure.

  And yet how does this happen in practice? How is learning institutionalized in the aviation system (given that pilots, regulators, engineers, and ground staff are dispersed across the world), how is an open culture created, and, most important of all, how can we apply the lessons beyond aviation?

  To find out, we’ll examine one of the most influential crashes of recent times, perhaps in the entire history of powered flight. We will see how investigators go about their business, excavate the lessons, and turn tragedies into learning opportunities.

  The name of the flight was United Airlines 173.

  Chapter 2

  United Airlines 173

  I

  United Airlines Flight 173 took off from JFK International Airport in New York on the afternoon of December 28, 1978, with Portland, Oregon, as its final destination. The sky was clear, the flying conditions close to perfect.1

  Malburn McBroom, a fifty-two-year-old with silver-gray hair and a clipped voice, was the captain. A veteran of World War II, he had more than twenty-five years of flying experience, and lived with his wife in Boyd Lake, Colorado. His ambition to become a pilot had been ignited as a child when he saw traveling barnstormers while walking with his mother. “I’m going to fly airplanes, Mom,” he said.

  McBroom’s first officer was Rodrick Beebe, a forty-five-year-old who had been with United Airlines for thirteen years and had logged more than five thousand hours of flying time. The third person in the cockpit was Flight Engineer Forrest Mendenhall, a forty-one-year-old who had been with the airline for eleven years. He had clocked 3,900 flying hours. The passengers were in safe hands.

  After a brief stopover in Denver, United Airlines 173 departed for Portland at 14:47. It was three days after Christmas and the majority of the 181 passengers were returning home after the holidays. Up on the flight deck, the crew members chatted happily as the plane reached its cruising altitude. The planned flying time was 2 hours and 26 minutes.

  At around 17:10, as the plane was given clearance to descend by air traffic control at Portland Approach, McBroom pulled the lever to lower the landing gear. Normally this is followed by a smooth descent of the wheels and undercarriage, and an audible click as the gear locks into place. On this occasion, however, there was a loud thud, which reverberated around the airplane, followed by a shudder.

  In the cabin the passengers looked around anxiously. They began to speculate on the cause of the noise. Up in the cockpit the crew were also perturbed. Had the landing gear locked into place? If so, what was the loud thud? One of the lights that would normally be glowing if the landing gear was safely in place hadn’t illuminated. What did that mean?

  The captain had no choice. He radioed to air traffic control and asked for some additional flying time so he could troubleshoot the problem. Portland Approach instantly came back to advise United Airlines 173 to “turn left heading one zero zero.” In effect, the plane had been put into a holding pattern to the south of the airport, over the Portland suburbs.

  The crew made various checks. They couldn’t see beneath the plane to determine whether the landing gear was in place, so they made some proxy checks instead. The engineer was sent into the cabin to see whether a couple of bolts, which shoot up above the wingtips when the landing gear is lowered, were visible. They were. They also contacted the United Airlines Systems Line Maintenance Control Center in San Francisco. Everything seemed to indicate that the gear was safely down.

  The captain was still worried, however. He couldn’t be certain. He knew that landing the plane without the gear lowered carried serious risks. The statistics show that gear-up landings typically cause no fatalities, but they are still dangerous. McBroom, a responsible pilot, wanted to be sure.

  As the plane circled over Portland, he searched for an answer. He pondered why one of the landing gear lights had failed to turn green. He wondered if there was some way of checking the wiring. He searched his mind for other ways to troubleshoot the problem.

  While he deliberated, however, another problem was looming. At first, it was just a metaphorical speck in the distance, but as United Airlines 173 continued in its holding pattern, it became ever more real. There were 46,700 pounds of fuel on board the aircraft when it departed Denver, more than enough to reach its destination. But a DC-8 burns fuel at around 210 pounds per minute. The plane could not circle indefinitely. At some point McBroom would have to bring the plane in to land.
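
  To put rough numbers on it (a back-of-envelope estimate, assuming the quoted burn rate held roughly constant): 46,700 pounds divided by 210 pounds per minute gives about 222 minutes of total flying time. Against a planned flight of 146 minutes, that left a margin of a little over an hour, and the clock had been running since the departure from Denver.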

  At 17:46 local time, the fuel level dropped to 5 on the dials (5,000 pounds remaining). The situation was still under control, but the margin for error was shrinking. Time was becoming ever more critical. The engineer became agitated. He informed the pilot about the state of the fuel, warning about flashing lights in the fuel pump. The cockpit voice recording transcript reveals his growing anxiety.

  But McBroom didn’t respond in the way the engineer expected. The pilot is ultimately in charge of the flight, with primary responsibility for those on board. The 189 passengers and crew were under his protection. He knew the dangers if he came in to land without the landing gear lowered. He was adamant that wouldn’t happen. He had to find out what was wrong. He had to be certain.

  He continued to focus on the landing gear. Was it down? Were there any further checks they hadn’t thought of? What more could they do?

  At 17:50 Engineer Mendenhall tried again to alert the captain to the dwindling reserves. The captain replied that there were still “fifteen minutes” of fuel in the tank, but he was wrong. He seemed to have lost track of time. “Fifteen minutes?” the engineer replied, a tone of incredulity in his voice. “Not enough . . . Fifteen minutes is gonna really run us low on fuel here.”

  With each second, the reserves of fuel were diminishing. A holding pattern had now become a potential catastrophe, not just for the passengers, but also for the residents of southern Portland. A 90-ton aircraft was circling above a city with its energy draining away.

  The first officer and engineer could not understand why the pilot was not heading directly to the airport. Fuel was now the principal danger. The landing gear hardly mattered anymore. But the captain was the authority figure. He was the boss. He had the experience and the seniority. They called him “sir.”

  At 18:06, the fuel ran so low that engine number four flamed out. “I think you just lost number four, buddy, you . . .” came the warning. Thirty seconds later it was repeated: “We’re going to lose an engine, buddy.”

  Even now the pilot was oblivious to the catastrophic situation. His awareness of time had all but disintegrated. “Why?” he replied, seemingly incredulous at the loss of an engine. “Fuel,” came the emphatic response.

  United Airlines 173 was perfectly capable of landing. The landing gear, it was later established, was in fact down and secure. Even if it hadn’t been, an experienced pilot could have landed the plane without loss of life. The night was crystal clear and the airport had been in sight since the initial descent had been aborted.

  But now, to the horror of the crew, they were eight miles short of the runway, over a major city, and the fuel had all but disappeared.

  It was too late now. As the remaining engines flamed out, all hope vanished. The plane was losing altitude at more than 3,000 feet per minute and they were not going to make it.

  McBroom strained his eyes across the horizon in a desperate search for a field or open space amid the mass of homes and apartment blocks stretching beneath the plane. Even now, he couldn’t understand what had happened. Had the fuel vanished into the ether? Where had the time gone?

  The last few moments of the transcript reveal their desperation as the flight careered down into suburban Portland:

  1813:38 Captain: They’re all going [i.e., all the engines are flaming out].

  1813:41 Captain: We can’t make Troutdale [another airport in Portland].

  1813:43 Co-Pilot: We can’t make anything.

  1813:46 Captain: Okay, declare a Mayday.

  1813:50 Co-Pilot (to Tower): Portland tower, United one seventy three heavy, Mayday, we’re . . . the engines are flaming out, we’re going down, we’re not going to be able to make the airport.

  1813:58 Tower: United one . . .

  1814:35 (impact with transmission lines)

  (end of tape)

  United Airlines 173 was chosen as a vehicle to explore the aviation system for two reasons. First, it was a watershed event in aviation safety. That much is widely acknowledged. But for our purposes, it has an additional significance: it mirrors, in an intriguing way, the tragedy of Elaine Bromiley. While one accident happened in the skies and another in an operating theater, they share the same basic signature.

  Even on a cursory inspection the similarities are striking. Like Captain McBroom, who had become fixated on the landing gear problem, Dr. Anderton had become fixated on accessing the airway via the mouth. Perception had narrowed. Like McBroom, who had lost any sense of the dwindling reserves of fuel, the doctors overseeing Elaine Bromiley had lost any sense of how long she had been deprived of oxygen. While McBroom was trying to solve the landing gear problem and the doctors were frantically trying to place the tracheal tube into the airway, the real disaster was all but ignored.

  Like Engineer Mendenhall, who had warned the captain but hadn’t gotten a response, Jane, the nurse, had alerted Dr. Anderton. They had both issued strong hints, had agonized about making their concerns more explicit, but had been intimidated by the sense of hierarchy. Social pressure, and the inhibiting effects of authority, had destroyed effective teamwork.

  But what is important for our purposes is not the similarity between the two accidents; it is the difference in response. We have already seen that in health care, the culture is one of evasion. Accidents are described as “one-offs” or “one of those things.” Doctors say: “We did the best we could.” This is the most common response to failure in the world today.

  In aviation, things are radically different: learning from failure is hardwired into the system.

  All commercial aircraft must carry two black boxes. One, the flight data recorder, logs instructions sent to the on-board electronic systems. The other is the cockpit voice recorder, enabling investigators to get into the minds of the pilots in the moments leading up to an accident. Instead of concealing failure, or skirting around it, aviation has a system in which failure is data rich.

  In the event of an accident, investigators, who are independent of the airlines, the pilots’ union, and the regulators, are given free rein to explore the wreckage and to interrogate all other evidence. Mistakes are not stigmatized, but regarded as learning opportunities. The interested parties are given every reason to cooperate, since the evidence compiled by the accident investigation branch is inadmissible in court proceedings. This increases the likelihood of full disclosure.

  In the aftermath of the investigation the report is made available to everyone. Airlines have a legal responsibility to implement the recommendations. Every pilot in the world has free access to the data. This practice enables everyone—rather than just a single crew, or a single airline, or a single nation—to learn from the mistake. This turbocharges the power of learning. As Eleanor Roosevelt put it: “Learn from the mistakes of others. You can’t live long enough to make them all yourself.”

  And it is not just accidents that drive learning; so, too, do “small” errors. When pilots experience a near miss with another aircraft, or have been flying at the wrong altitude, they file a report. Provided that it is submitted within ten days, pilots enjoy immunity. Many planes are also fitted with data systems that automatically send reports when parameters have been exceeded. Once again, these reports are de-identified before they enter the reporting system.*

  In 2005, for example, a number of reports were filed in rapid succession alerting investigators to a problem with the approach to Lexington Airport in Kentucky. Just outside the airport, local authorities had installed a large mural on an empty expanse of land, as a way of brightening it up. At the top of the mural, they had placed lamps to illuminate it at night.

  But the lights were playing havoc with the perception of pilots. They were mistaking the mural lights for lights on the runway. They were coming in too low. Fortunately nobody crashed, but the anonymous reports revealed a latent problem before it was given a chance to kill anyone. Shawn Pruchnicki, an aviation safety expert who attended the meeting, told me: “We saw a whole bunch of reports in a single week. We instantly realized there was a problem, and that we had to act.”

  Within minutes an e-mail was sent out to all flights scheduled to land at Lexington warning of a potential distraction on approach. Within days the mural and its lights had been removed (this would have happened far sooner had the land been under the jurisdiction of the airport). An accident had been prevented.