May 1, 2013
Volume 77, Number 5
Cognitive Errors and Cognitive Aids in Anesthesiology
Marjorie Podraza Stiegler, M.D.
Sara N. Goldhaber-Fiebert, M.D.
Committee on Patient Safety and Education
What Is Cognitive Error?
What does it feel like to be wrong? How can we recognize a mistake when we are making it? We are oblivious to our errors at the time that we make them, because being wrong feels exactly like being right. This is the idea of “error blindness.”1 The time that elapses between making a wrong decision and subsequently recognizing it ranges from very short (a few seconds) to very long (months to years, or never at all). As physicians, we have an obligation to prevent as many errors as possible, and to recover quickly and optimally when errors do occur. An important component of diagnostic and therapeutic decision-making mistakes is cognitive error – thought process errors made despite adequate knowledge and skill, and often in the setting of good intentions. Cognitive errors may be rooted in bias, heuristic decision-making, overconfidence, illogical thought preferences and other subconscious phenomena, leading to mistakes in anesthesiology decision-making.2 Here, we will discuss just a few examples of cognitive errors and the circumstances that might surround them. A catalog is presented in Table 1, and more comprehensive discussions are available elsewhere.3 In addition, we will discuss how cognitive aids can assist with prevention of and recovery from these errors.
Example #1: Feedback Bias
Feedback Bias in anesthesiology describes the process by which our outcome data are elusive, untimely, reported in aggregate or never reported at all. There are numerous potential injuries that we may cause or contribute to that are likely referred away to a treating physician but not reported back to us (unless the injury is very grave or legal action is pursued).
These include nerve injury, airway trauma, corneal abrasions and so on. In these cases, it is possible that the patient or primary team doesn’t consider these to be “anesthesia complications” and simply refers the problem to a consultant in neurology, otolaryngology or ophthalmology. Additional examples may include unintended awareness, central venous catheter infection and other complications that may be revealed after the usual “post-op visit” timeline. While some hospitals collect this kind of data in aggregate (i.e., “last year, our department placed XX central lines that were linked to catheter-associated infection”) it is uncommon for an email or phone call to alert the anesthesiologist on the day the diagnosis is made.
Our brains interpret this absent feedback as positive feedback (“No news is good news!”), and thus our habits are reinforced in a subconscious way, potentially leading us to continue suboptimal practice.
Example #2: Availability Bias
Availability Bias describes the phenomenon by which certain diagnoses come readily to mind, particularly under the time-pressured circumstances of clinical urgency (in contrast to a deliberate, Bayesian exploration of a theoretical problem). When availability bias causes cognitive error, it is typically the result of a trigger event – an emotionally memorable prior experience, usually negative. Because of these indelible experiences, certain diagnoses pop into mind with great frequency when cues that resemble the trigger event recur. However, the brain subconsciously discounts important differences between the current presentation and that prior event, and the ease with which the diagnosis comes to mind results in an overestimation of the pretest probability. When a colleague says, “I’ve been burned” and goes on to describe an event that impacts his current practice, he is describing the effect of Availability Bias.
Example #3: Anchoring
Anchoring, also called Fixation or Tunnel Vision, describes the process during which attention is focused on a particular feature or condition at the expense of comprehending the situation in total. This may manifest as a distraction in the operating room, such as a persistent infusion pump alarm. Attention is easily diverted from the clinical case to troubleshoot the alarm and the device, and important changes in patient condition may go undetected for a period of time. Similarly, persistent fixation on hypovolemia as a (common) cause of hypotension can prevent adequate consideration of a full differential diagnosis.

In all three examples, the physician is knowledgeable, skillful and well-intentioned. He or she wants to improve his or her practice, make the correct diagnoses, be vigilant and implement the correct treatment plans in a safe way that minimizes risk to the patient. Cognitive errors may result in behaviors consistent with various types of errors, defined elsewhere,4 e.g., slips, lapses, mistakes and even deliberate violations. Subconscious thought processes influence each of these errors. Thus, even when anesthesiologists have learned all the correct “medical and technical” knowledge, patients can be harmed. Certain strategies can be effective in reducing these kinds of errors, including metacognition (“thinking about thinking”), de-biasing exercises, training on specific situational pitfalls and using checklists, decision-support resources and other cognitive aids.5
Cognitive Aids: Emergency Manual Implementation
Cognitive aids are tools that help people efficiently remember and act upon important information they likely already know but that may be latent. Checklists are one type of cognitive aid, though the name implies linearly “checking off” items; for complex events, they often require supplementary sections.
An emergency manual is a tool that contains relevant sets of cognitive aids for a specific clinical context. Emergency manual implementation, by institutions and practitioners, can help health care teams to effectively perform known best practices during critical events. Vital elements for success include not only the content and format but also access, training, roles and a culture that supports their use. There have been multiple simulation-based studies supporting the value of cognitive aids, each emphasizing one or some of these vital elements. Research on cognitive aid use for intraoperative events began about a decade ago, and there is growing recognition of the importance of this field, including a recent trial published in the New England Journal of Medicine.6-8
In aviation and other high-stakes industries, emergency manuals are routinely available and integrated into recurrent training, and their use is not only allowed but rather expected. These practices stem from human factors and cognitive psychology research demonstrating both common cognitive errors and frequent memory retrieval problems under stress, even for well-trained professionals.
Use of emergency manuals can catch cognitive errors stemming from, e.g., premature closure, anchoring or availability biases before patients are harmed. As a debriefing tool, emergency manuals can prevent the formation of other cognitive errors, such as outcome bias, after critical events. Once a “diagnosis” is chosen, appropriate use of a cognitive aid helps high-stakes teams to accurately and efficiently perform necessary management actions.
In health care, however, cognitive aids for critical events have not yet been widely adopted. Until recently, only a handful were readily available, e.g., advanced cardiac life support (ACLS) algorithms from the American Heart Association, a malignant hyperthermia (MH) poster from the MH Association of the United States (MHAUS), the Difficult Airway Algorithm from ASA, and Local Anesthetic Systemic Toxicity (LAST) management from the American Society of Regional Anesthesia and Pain Medicine (ASRA).
In 2003, the Veterans Affairs (VA) system was one of the first nationally to place useful sets of cognitive aids in each VA operating room, in work that originated from the book Crisis Management in Anesthesiology.8,9
Building upon these sources, the Stanford Anesthesia Cognitive Aid Group developed content and format with the goal of improved usability, refined through many years of iterative simulation-based testing. Stanford installed these emergency manuals, containing 23 perioperative critical events, in each O.R. as well as other relevant locations, and launched interdisciplinary staff training. Local practitioners now often utilize emergency manuals for pre-event review and education. There are growing reports of use during certain critical events, as well as for post-event debriefing. The Kaiser system and other institutions have similarly implemented various O.R. cognitive aids, and Brigham and Women’s Hospital is currently planning its operating room launch, after showing successful use of crisis checklists during simulated crises.
As a specialty, anesthesiology is moving beyond simulation testing into broader clinical implementation of emergency manuals. This spring, the Emergency Manual Implementation Collaborative was formed, including local champions from Harvard, Stanford, Kaiser and other institutions, with the mission of fostering effective use to enhance our patients’ safety. Questions remain, including scalable methods for effective familiarization and who can be effective “readers.”10
In the meantime, if you have a critical event during your own surgery tomorrow, would you want your anesthesiologist and team to have access to an emergency manual and training in its use?*
1. Schulz K. Being Wrong: Adventures in the Margin of Error. 1st ed. New York: Ecco; 2010:viii, 405.
2. Stiegler MP, Neelankavil JP, Canales C, Dhillon A. Cognitive errors detected in anaesthesiology: a literature review and pilot study. Br J Anaesth. 2012;108(2):229-235.
3. Stiegler MP, Ruskin KJ. Decision-making and safety in anesthesiology. Curr Opin Anaesthesiol. 2012;25(6):724-729.
4. Reason J. Understanding adverse events: human factors. Qual Health Care. 1995;4(2):80-89.
5. Graber ML, Kissam S, Payne VL, et al. Cognitive interventions to reduce diagnostic error: a narrative review. BMJ Qual Saf. 2012;21(7):535-557.
6. Arriaga AF, Bader AM, Wong JM, et al. Simulation-based trial of surgical-crisis checklists. N Engl J Med. 2013;368(3):246-253.
7. Harrison TK, Manser T, Howard SK, Gaba DM. Use of cognitive aids in a simulated anesthetic crisis. Anesth Analg. 2006;103(3):551-556.
8. Neily J, DeRosier JM, Mills PD, Bishop MJ, Weeks WB, Bagian JP. Awareness and use of a cognitive aid for anesthesiology. Jt Comm J Qual Patient Saf. 2007;33(8):502-511.
9. Gaba DM, Fish KJ, Howard SK. Crisis Management in Anesthesiology. New York: Churchill Livingstone; 1994.
10. Burden AR, et al. Does every code need a “reader?” Improvement of rare event management with a cognitive aid “reader” during a simulated emergency: a pilot study. Simul Healthc. 2012;7(1):1-9.
* For free implementation resources, including content and tips: The Stanford emergency manual, containing 23 critical events, is freely available at http://emergencymanual.stanford.edu. Development by Stanford Anesthesia Cognitive Aid Group: Steve Howard, Larry Chu, Sara Goldhaber-Fiebert, David Gaba, and Kyle Harrison (listed in random order), with other collaborators. Crisis checklists developed by a Harvard team, for 12 critical events, are available at www.projectcheck.org/crisis. Various cognitive aids for critical events are also available for sale as publications and applications.
† To report critical event experiences and lessons learned, go to www.aqiairs.com. The Anesthesiology Quality Institute’s Anesthesia Incident Reporting System is a secure portal for reporting, anonymously if desired. Reflection on contributing factors and impact of cognitive aids is included.
Marjorie Podraza Stiegler, M.D. is Assistant Professor of Anesthesiology and Director of Anesthesiology Simulation for the University of North Carolina, Chapel Hill.

Sara N. Goldhaber-Fiebert, M.D. is Clinical Assistant Professor of Anesthesiology and Co-director of the Evolve Simulation Program for Stanford University School of Medicine, Stanford, California.