Winter 2019

Of Pilots and Physicians, Passengers and Patients

The importance of maintaining situational awareness when the stakes — and the stress — are high

Artificial Intelligence Issue

  • by Tamara Fountain
  • 10-minute read

Woody Fountain, Tamara's father, stands next to a 747 he flew during his career as a pilot for Northwest Airlines

I am the daughter of an aviator. Before I could even sit up, my mom was pushing my stroller on the tarmac to visit my dad, an instructor pilot in the U.S. Air Force. By the time I was in kindergarten, he had left the military and settled into civilian life as a commercial pilot for Northwest Airlines, which has since merged into Delta. When Dad wasn’t out on a trip, he and I would drive out to the Minneapolis–Saint Paul International Airport and park along the runway access road. I’d scramble up on the hood of our burgundy ’65 Mustang, and we’d spend whole afternoons watching planes take off and land. Amid the roar of the engines and between blasts of jet fuel exhaust (the smell of which I loved, by the way), my dad would give me age-appropriate lessons in aerodynamics, air traffic control, and how to recognize types of airplanes by their fuselage and engine placement. When other kids declared that they wanted to be a fireman (usually boys) or a nurse (always girls) when they grew up, I would reply with similar conviction, and usually to a few teasing giggles, that I was going to be a pilot in the U.S. Air Force.

Aviator glasses

Those career plans were crushed one brisk fall day when I was about seven years old. I was flying a kite with my dad at one of Minnesota’s 10,000 lakes. We were under the glide path for planes landing at an airport nearby. A low, distant rumble announced a jet coming in on final approach. As it came into view, Dad quizzed me on the type of aircraft.

The author as an infant being held by her father, then an instructor pilot for the U.S. Air Force.

I recognized the distinct engine-in-the-tail profile.

“That’s a 727,” I chirped proudly.

“Yes, it’s a United 727,” he replied.

I squinted at the plane and turned to my dad. “How can you tell it’s United?”

He furrowed his brow. “I can read it. Can’t you?”

Tests would later confirm that I was not the victim of some blinding, degenerative disease. I was simply nearsighted. In the early 1970s, however, anyone with designs on a career in military aviation had to have 20/20 vision without glasses or contacts. “Aviator glasses,” I realized, was a cruel misnomer, meant to describe an off-duty fashion accessory, not a refractive aid.

From that day on, I could be a fireman. I could even be a nurse. I could not, however, be a U.S. Air Force pilot.

Flight lessons

I got over this early disappointment (mostly) and discovered a rewarding career in medicine as an ophthalmologist. How’s that for irony? Yet, I remain fascinated by aerodynamics. My dad has retired, but my mom will attest to the fact that he and I still talk about planes. A lot.

Over the years, I’ve come to recognize a number of parallels between medicine and aviation: Both are highly regulated professions with steep barriers to entry. Both require long training periods. And, for both, the cost of failure is great.

In medicine, one of the greatest and costliest failures is misdiagnosis, defined as the failure to establish an accurate and timely explanation of the patient’s health problem and to communicate that explanation to the patient. A landmark 2015 report from the Institute of Medicine, now known as the National Academy of Medicine, estimated that twelve million people in the United States, roughly 5 percent of adults who seek outpatient care, are misdiagnosed each year in the outpatient setting alone. That makes misdiagnosis a more common medical error than medication errors or wrong-site surgery.

In aviation, the greatest failure is the crash of an airliner. Fortunately, such crashes are rare. When one does happen, I am always eager to call my dad and get his pilot’s take on what might have gone wrong in the cockpit.

From our discussions, one crash in particular may have lessons for us in medicine.

Missed signals

Late at night on the last day of May 2009, Air France flight 447, a wide-body Airbus A330, took off from Rio de Janeiro bound for Paris. About three hours into the flight, while the plane was cruising at 35,000 feet, its speed indicator malfunctioned.

By itself, this malfunction would not cause the pilot to lose control of the craft. In fact, the airplane responded as it should: The autopilot disengaged because it was no longer receiving critical data on how fast the airplane was traveling. This instrument response is similar to what happens in a car if the speedometer malfunctions while cruise control is engaged. Cruise control cuts off, and the driver needs to take over.

The pilot in command, now forced to fly the plane manually, inexplicably executed a series of maneuvers that put the plane into a stall. Warnings and sirens blared as the plane began to plummet through the nighttime sky.

The flight deck, according to the craft’s flight recorder, or black box, was thrown into chaos. Checklists that should have been followed were not. Communication that should have taken place between the pilots did not. Direction, altitude, and other essential flight parameters that should have been monitored were not. The plane was falling toward the ocean at more than 10,000 feet per minute.

The most senior pilot, who had been on a rest break in the cabin when the speed indicator was lost, rushed back to the flight deck. Within seconds, he recognized the airplane was in an aerodynamic stall.

Recovery from a stall involves putting the plane into a dive to recover both airspeed and lift, then pulling out to level flight. It’s a maneuver that should be instinctive for any pilot. Yet, by the time the crew made the right diagnosis, it was too late. They had run out of room. The last transmission from the flight deck was “F---, we’re dead!”

At a descent rate of more than 10,000 feet per minute, 35,000 feet of altitude buys less than four minutes. A little more than four minutes after the autopilot disengaged, Air France 447 slammed belly-first into the Atlantic Ocean. All 228 aboard perished.

Signs and signifiers

My dad identifies a common theme in airplane crashes: the pilots, he says, actually “forget to fly the airplane.” A relatively minor distraction triggers a fatal cascade of events. The pilots lose situational awareness and, amid the ensuing calamity, forget to ask three basic questions: Where am I? How fast am I going? In what direction am I headed?

Tamara Fountain with her father, Woody, in the cockpit of an airplane, circa 1985

Those three questions led me to think about how physicians fail to properly diagnose patients. The diagnoses we miss most often are common ailments, not exotic “zebras.” In my field of ophthalmology, among the top diagnoses we miss are glaucoma, retinal detachment, and intraocular infection. A first-year resident would easily recognize all three. Yet, like the pilots of that ill-fated Air France flight, physicians who fail to connect the dots and recognize what should be obvious commit the medical equivalent of forgetting to fly the airplane. We physicians lose clinical situational awareness and, as a consequence, forget to treat the patient.

Time is in short supply during most patient encounters. Physicians must make quick triage and diagnostic assessments based on the constellation of signs and symptoms in front of them. This “fast,” or intuitive, thinking, described by psychologist Daniel Kahneman in his bestseller Thinking, Fast and Slow, is what allows doctors to navigate a forty-patient clinic day efficiently.

It is said that there are more than ten thousand diseases of the human body but only two hundred to three hundred symptoms. Diagnosing can be a challenge; it’s understandable that our initial assessments are sometimes wrong. When they are, that’s when what Kahneman describes as “slow,” or deliberative, thinking should take over. When a patient doesn’t respond to treatment as expected, the physician must step back and call a hard stop. To regain situational awareness, the following questions must be asked and answered: Why isn’t this patient getting better? What do history, exam, and tests show? What else could this be?

Docs’ black box

Commercial aviation is the safest mode of travel in the United States. More than nine million flights carry nearly one billion domestic passengers annually, yet there has not been a fatal crash of a U.S. commercial airliner in nearly ten years. Because of the efforts of multiple federal agencies, including the Federal Aviation Administration and the National Transportation Safety Board, U.S. air travel benefits from a culture of safety, continuous improvement, and regular feedback to all stakeholders. Pilots are no more immune to cognitive biases or human error than physicians are; rather, improved technology in aircraft design and air traffic control has largely removed the opportunity for pilot error to bring down a plane.

It’s when we consider the issue of patient safety that the parallels between aviation and medicine begin to break down. There are fundamental differences between the two that help explain why medicine’s safety record lags behind aviation’s.

One difference is sheer numbers: There are vastly more physician-patient encounters every year than there are flights. Although electronic medical records and clinical registries amass volumes of patient-care data, we still lack the medical equivalent of the black box and the investigative transparency that allow federal agencies to pinpoint the root cause of every plane crash, and even every near miss. The culture of medicine largely discourages transparency and critical analysis of mistakes. Diagnostic errors often go undetected, and feedback that could improve the performance of individual physicians remains generally unavailable.

Where is the technology that could help minimize physician error as it has for pilot error? Test tracking, warning systems, and clinical prompts should, in theory, be among the advantages conferred by electronic medical records. Yet any health care professional who has worked on an electronic platform knows how much more difficult and distracting that platform can make finding, filtering, and interpreting the clinical data needed for optimal decision making.

I like to think electronic medical records are in their nascent, floppy-disk phase. It’s my hope that technology will one day succeed in making the computerized patient interface as intuitive and interoperable as the digital devices we all now hold in our hands. Only when this happens will we be able to reliably harness the power of artificial intelligence to support the diagnostic process and improve patient safety.

The day when technology saves physicians from themselves is not yet here, and it may never come. I expect no machine will ever match the nuanced faculties a human physician brings to a patient encounter. As complex as a multimillion-dollar modern aircraft is, it is simple compared with the human body, in all its individual variation, and the myriad ways disease can afflict it.

This complexity makes our role as physicians that much more critical and challenging. One piece of good news is that common diseases present commonly and in (mostly) common ways. The clues to the correct diagnosis are often hiding in plain sight if we just take the time to look for them.

The other good news? Physicians usually have more time to connect the dots than pilots do.

Tamara Fountain, MD ’88, an ophthalmic plastic and reconstructive surgeon, is a professor in the Department of Ophthalmology at Rush University Medical Center in Chicago and a principal in Ophthalmology Partners, Ltd., in Deerfield, Illinois.

Images courtesy of Woody Fountain