“One man may hit the mark, another blunder; but heed not these distinctions. Only from the alliance of the one, working with and through the other, are great things born.”
– Antoine de Saint-Exupéry
Practically anyone who has even so much as soloed an airplane has heard of Saint-Exupéry, the early 20th-century French writer and aviator. Many of his writings coalesced the wonder of flight with truisms of life, but perhaps none is so germane to the field of anesthesiology as the one above. Those of us who have made anesthesiology our careers and aviation our avocations have long appreciated the parallels between the two fields, and even non-aviators have been known to apply aviation terminology to the phases of anesthesia: a detailed and careful “preflight” (preanesthetic visit), a smooth and controlled “takeoff” (induction), a “cruise” portion marked by constant vigilance, and a well-executed “landing” (emergence) characterize both a successful flight and a successful anesthetic.
These comparisons, apt as they are, only scratch the surface of the lessons we can learn from the transformation the commercial aviation industry has undergone over the past three decades in becoming the safest method of transportation in the world. Let’s take a brief look at the standard practices of airlines 30 years ago, compare them with current practices, and see what we can glean from the lessons they learned – often fatally and always expensively – over this period.
Until the early 1980s, the captain was indeed the “captain” in the most classical sense. All-too-true stories abounded of his unquestionable authority (for it was not until 1986 that the first woman moved into the captain’s seat, at American Airlines).1 Not-so-subtle jokes referred to the primary duty of the first officer as seeing that the captain’s coffee was served as he liked it, or worse, gave the second officer the moniker “170 pounds of dumb.”2 This is not to say that the captain’s exalted status was unearned, for he was always a highly experienced aviator by the time he ascended into the left seat, but he had often come from a military background where he had learned from his earliest days of flight training to function independently and take full responsibility for his actions when he was pilot in command.
As a consequence of this dictatorial authority, however, the usefulness of the second and third brains in the cockpit was severely curtailed, and, depending on the receptiveness of the captain to suggestions (criticism in sheep’s clothing), those two additional brains could have some, little, or no influence on his decisions. But then a series of aircraft accidents occurred, and safety authorities from both government and the airline industry began to dissect – in true root-cause-analysis fashion – the reasons for the disasters. As they did, a recurring scenario began to unfold in the cockpits of the ill-fated airliners. Often, the precipitating cause would be a single circumstance (or small series of circumstances) that, taken in a different context, could have resulted in a minor interruption in the flight, a delay, or perhaps a precautionary landing or second approach.
One such event occurred in my home state of Oregon on the night of December 28, 1978.3 A United Airlines DC-8 from Denver was cleared for approach into Portland after an uneventful flight. When the landing gear was lowered on final approach, a loud thump and a yaw of the aircraft to the right occurred, accompanied by a failure of the right landing gear down indicator to illuminate. The captain, concerned that the gear was still up, aborted the approach and was vectored to a holding area where he began to analyze the situation. For over an hour, the captain went through all required procedures to ascertain the position of the landing gear and prepare the cabin for a potential gear failure on landing. What he did not do, however, was assign the responsibility for watching the fuel level to either of his other crewmembers. When he finally began to turn toward the airport to begin what he thought would be a controlled approach and landing with every precaution in place, the engines, one by one, began to fail from fuel starvation. The aircraft crashed six miles from the airport. Although eight passengers and two crewmembers were killed, miraculously no one on the ground was injured, though the plane destroyed two unoccupied homes.
The National Transportation Safety Board’s final and most significant action resulting from its investigation was to “... ensure that ... [the airline] ... flight crews are indoctrinated in principles of flight deck resource management, with particular emphasis on the merits of participative management for captains and assertiveness training for other cockpit crewmembers.”3 Numerous other airline crashes were increasingly attributed to the same lapse in communications, both in the United States and throughout the world. Three years later, United Airlines was the first to incorporate the concept of “cockpit resource management” (later changed to “crew resource management,” or CRM, to acknowledge the potential contribution of all members of the crew, including flight attendants).4
The “translational research” of recognizing the seminal work of the airlines has slowly but surely moved into other industries, including health care. Together with another aviation advance – detailed, regular, realistic simulation – anesthesiologists can now learn both CRM techniques and the management of critical situations, developing and maintaining their skills for emergencies. Medicine, especially anesthesiology, has further redefined CRM as “crisis resource management” to bring it into phase with the principles of simulation. (Both of these concepts are the subjects of recent ASA NEWSLETTER articles by Hannenberg and Cammarata5 and Telesz and Telesz.6) Critical to this process is learning the principle of situational awareness and projecting it into the immediate future: the rapid assimilation of every available source of information in formulating a response to what is – or may become – a critical event. CRM teaches, among other principles, the behaviors necessary to encourage all contributors to feel both obligated and comfortable in adding their knowledge and concerns to the resolution of a crisis.
Although anesthesia and operating room tragedies rarely involve more than one life, they are no less preventable. Several years ago, a surgical “crewmember” (a cell-saver technician) watched an anesthesiologist apply an external pressure device to a cell-salvage reinfusion bag containing a substantial amount of air above the blood. Although she knew the action was potentially dangerous, she did not voice her concerns. The result was the death of a young mother and a multimillion-dollar verdict against the anesthesiologists and the cell-saver service vendor – with half of the damages apportioned to the vendor for its employee’s failure to exhibit sufficient assertiveness in pointing out the error.7
CRM has saved lives, both in airplanes and operating rooms. The following are the words of Al Haynes, captain of United Airlines Flight 232, which suffered an engine fan failure that ruptured all three hydraulic systems, causing a total loss of hydraulic pressure, and was forced to make an emergency landing in Sioux City, Iowa on July 19, 1989 – eight years after United put CRM training into place: “... the preparation that paid off for the crew was something ... called Cockpit Resource Management ... Up until 1980, we kind of worked on the concept that the captain was THE authority on the aircraft. What he said goes. And we lost a few airplanes because of that. Sometimes the captain isn’t as smart as we thought he was. And we would listen to him, and do what he said, and we wouldn’t know what he’s talking about. And we had 103 years of flying experience there in the cockpit, trying to get that airplane on the ground, not one minute of which we had actually practiced – any one of us. So why would I know more about getting that airplane on the ground under those conditions than the other three? So if I hadn’t used [CRM], if we had not let everybody put their input in, it’s a cinch we wouldn’t have made it.”8
If you would like to learn more about this topic, consider attending Practice Management 2014, to be held in Dallas on January 24-26. The conference will include a session on strategies to improve clinical safety using proven methods of communication, team building and crew resource management.
1. Female pilots make history. American Airlines website. http://www.aa.com/i18n/amrcorp/corporateInformation/facts/femalepilots.jsp. Revised January 2011. Accessed November 11, 2013.
2. Nance JD. What can cardiac surgical teams really learn from aviation? Proc Am Acad Cardiovasc Perfusion. 2001;22:14-29.
3. National Transportation Safety Board. Aircraft accident report: United Airlines, Inc., McDonnell-Douglas, DC-8-61, N8082U, Portland, Oregon, December 28, 1978 [report no. NTSB-AAR-79-7]. Washington, DC: NTSB; June 7, 1979.
4. Helmreich RL, Merritt AC, Wilhelm JA. The evolution of crew resource management training in commercial aviation. Int J Aviat Psychol. 1999;9(1):19–32.
5. Hannenberg AA, Cammarata BJ. All hands on deck. ASA Newsl. 2013;77(9):54-55.
6. Telesz JB, Telesz B. Embracing simulation training. ASA Newsl. 2013;77(9):72-73.
7. Michael Anthony Jones v Richard E. Crawforth and Anesthesiology Consultants of Treasure Valley, PLLC. (Idaho Supreme Court 2009). http://www.isc.idaho.gov/opinions/jones%20%20b&%20b%20final%20opn.pdf. Accessed November 11, 2013.
8. Haynes A. The crash of United Flight 232. Presented at: NASA Dryden Flight Research Facility; May 24, 1991; Edwards, CA.