
August 1, 2013, Volume 77, Number 8
Simulation and MOCA®: ASA and ABA Perspective, After the First Three Years

Randolph H. Steadman, M.D., M.S., Editor-in-Chief, ASA Editorial Board for Simulation-Based Training

Arnold J. Berry, M.D., M.P.H., ASA Vice President for Scientific Affairs

Douglas B. Coursin, M.D., ABA President

J. Jeffrey Andrews, M.D., ABA Secretary



Physicians are being asked by local, state and federal agencies to do more to maintain their licensure and medical staff credentials in order to demonstrate their commitment to lifelong learning. In general, the requirements to practice our profession are increasing. These additional demands may seem overwhelming and, at times, unacceptable. On the other hand, our patients deserve physicians who can deliver the most appropriate evidence-based care and document the quality of their practice. Maintenance of Certification in Anesthesiology (MOCA®) consists of four parts, the fourth being Practice Performance Assessment and Improvement (PPAI). One requirement of MOCA® Part 4 is participation in a PPAI activity at an endorsed simulation center, and many anesthesiologists have recently asked whether meaningful practice improvement is occurring as a result of this simulation requirement.


As a leader in patient safety, the specialty of anesthesiology pioneered the development of medical simulation. The American Board of Anesthesiology (ABA) initially proposed simulation as a mechanism for practice improvement for several reasons. In a 2006 survey of ASA members, 81 percent of respondents indicated an interest in simulation-based CME, 89 percent felt simulation would enhance their skills in managing infrequent, difficult events, and 79 percent felt it would enhance their skill in crisis resource management. Simulation’s ability to engage and stimulate participants is likely to improve self-assessment and to help physicians identify gaps in their practice. Simulation is also an excellent way to provide clinical practice for critical events that are life-threatening but rarely occur. During management of these crisis situations, anesthesiologists can practice leading the team responding to care for the simulated patient.


The ASA Committee on Simulation Education was formed in 2006 to foster access to high-quality simulation-based training for ASA members. The committee developed endorsement criteria for simulation programs, and 34 programs now meet these criteria. Simulation became a Part 4 MOCA® requirement in 2010, and in the ensuing three years more than 1,600 ABA diplomates have participated in simulation courses to fulfill one of their Part 4 activities. The simulation activity consists of three parts, all of which must be completed for the physician to receive MOCA® credit. First, the diplomate participates in a one-day simulation course at one of the 34 ASA-endorsed simulation centers. Typically, four to six anesthesiologists take part in the simulation exercises, with each having at least one turn as the anesthesiologist-in-charge managing the care of the patient. A debriefing and review of performance follows each simulation scenario. Second, physicians are asked to reflect on the simulation activity and to identify areas in their practice that could be improved (practice gaps) as a result of the experience; the resulting list of three practice improvements is submitted to ASA. Third, several weeks later, ASA contacts the individual to find out whether he or she was able to implement any of the changes in his or her practice or, if not, what barriers prevented changes. Once this information is submitted, the individual is awarded credit for completing the simulation requirement. MOCA® simulation is thus not a test but a practice assessment and improvement activity.


The initial experience with the simulation activity has been extremely positive. In follow-up surveys, 95 percent of participants said they would recommend simulation to their colleagues, and 98 percent felt the course was relevant to their practice. Course participants have identified relevance as the most important element of the program. On follow-up within three months of the course, 95 percent of participants had successfully implemented changes in their practice based on what they identified during the course. This is a remarkably high level of implementation.


Data from the first two years of simulation for MOCA® were recently published.1 In their report, McIvor et al. describe the first 583 participants’ view of the process and outline the format as an experiential learning opportunity (not a pass/fail test) designed to stimulate reflection that leads to practice improvement. Based upon the results, this process has accomplished what it set out to do – promote practice improvement.


The nature and impact of the improvements have been impressive in many instances. One participant learned intraosseous insertion during a MOCA® simulation course and subsequently used the skill to save a patient’s life.2 This is not surprising, given that simulation was credited with improving the response during the recent Boston Marathon bombings.3


Recently evaluated follow-up results for more than 1,800 self-identified practice improvement plans revealed many compelling, impactful initiatives that overcame barriers and/or exceeded the scope of the original plan. Examples include plans that produced direct benefits for patients by improving teamwork and communication skills through directed, closed-loop communication, designated team leaders, standardized handoffs and other proven communication techniques. Other compelling plans involved the widespread dissemination of management guidelines (treatment checklists) across departments and across a hospital network. Half of the 1,800 submitted plans involved activities that extended beyond the submitting anesthesiologist to include benefits (improved access to knowledge, equipment and/or medications) for other anesthesia practitioners within the practice. Inter-professional collaboration was remarkable in many instances.


These results are not surprising, since simulation can demonstrate gaps between ideal and actual performance. In one study using simulation, fully trained anesthesiologists performed less than 20 percent of the indicated key actions during two (hyperkalemia and malignant hyperthermia) of 12 scenarios.4 Simulation allows practitioners to identify important gaps as they are uncovered in a contextual setting (the scenario) and to reflect, during the debriefing, on what is needed to achieve improvements (closing the gaps).


Research supports the transfer of simulation-based training to improved patient care during routine, complex events (such as cardiopulmonary bypass),5 during procedures (such as central line insertion),6,7 and during life-threatening events that require teamwork and communication (adult and pediatric codes).8,9


Because simulation courses emphasize the management of challenging but infrequent clinical occurrences, it will be difficult, and perhaps impossible, to acquire level 1 evidence (randomized trials) of simulation’s effectiveness in improving patient (population) outcomes.10 However, in other intrinsically hazardous industries in which the public depends on the skilled performance of professionals (aviation, nuclear power), level 1 evidence of simulation’s effectiveness does not exist, yet simulation training is mandatory.


Despite the benefits, simulation courses have been characterized by some as needlessly expensive and/or designed to provide a profit to the host institution. Although fees for the use of the specialized facilities can be significant, the primary cost of conducting MOCA® simulation courses is the labor expense for course personnel (clinical instructors and technical staff). Each program sets the price of its courses based upon its expenses, and the majority of programs do not seek (or make) a substantial profit when all costs and overheads are included. We appreciate that the combined financial burden on the diplomate from program tuition, travel and missed work can be significant. Programs strive to conduct courses at times convenient for their MOCA® learners, and many centers in the ASA Simulation Education Network are located in large metropolitan areas that are home to many anesthesiologists, eliminating the need for travel for these individuals. The ASA Simulation Editorial Board is working to expand the number of endorsed simulation centers, which will give diplomates more choices of location and course dates.


In summary, the data collected so far indicate that MOCA® simulation courses have been well received by participants, that they have led to widespread practice improvements, and that patients have benefited from the skills and knowledge learned. We are currently analyzing the impact of the performance improvement plans generated during the courses, and a detailed report will be forthcoming. From the analysis performed to date, however, it is clear that participants in the MOCA® simulation courses have embraced personal and system-wide performance improvements while including anesthesiology colleagues, surgeons, nursing staff, pharmacists and hospital administrators in a wide range of meaningful patient safety initiatives. This is what our patients and their families expect of us.



Randolph H. Steadman, M.D., M.S., is Professor and Vice Chair, Department of Anesthesiology, David Geffen School of Medicine at the University of California, Los Angeles. 


Arnold J. Berry, M.D., M.P.H., is Professor of Anesthesiology, Director, Office of Continuing Medical Education, and Assistant Dean for Education, Emory University School of Medicine, Atlanta.


Douglas B. Coursin, M.D., is Professor of Anesthesiology and Medicine, University of Wisconsin School of Medicine and Public Health, Madison.


J. Jeffrey Andrews, M.D., is the R. Brian Smith Professor and Chair, Department of Anesthesiology, University of Texas Health Science Center at San Antonio.


References:

  1. McIvor W, Burden A, Weinger MB, Steadman R. Simulation for maintenance of certification in anesthesiology: the first two years. J Contin Educ Health Prof. 2012;32(4):236-242.
  2. Anson JA. MOCA saves a life [letter]. ASA Newsl. 2013;77(1):47.
  3. Presutti C. World-famous hospitals helped Boston cope with bombing. Voice of America. http://www.voanews.com/content/world-famous-hospitals-helped-boston-cope-with-bombing/1643942.html. Published April 18, 2013. Accessed May 24, 2013.
  4. Murray DJ, Boulet JR, Avidan M, et al. Performance of residents and anesthesiologists in a simulation-based skill assessment. Anesthesiology. 2007;107(5):705-713.
  5. Bruppacher HR, Alam SK, LeBlanc VR, et al. Simulation-based training improves physicians’ performance in patient care in high-stakes clinical setting of cardiac surgery. Anesthesiology. 2010;112(4):985-992.
  6. Barsuk JH, McGaghie WC, Cohen ER, O’Leary KJ, Wayne DB. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37(10):2697-2701.
  7. Burden AR, Torjman MC, Dy GE, et al. Prevention of central venous catheter-related bloodstream infections: is it time to add simulation training to the prevention bundle? J Clin Anesth. 2012;24(7):555-560.
  8. Wayne DB, Didwania A, Feinglass J, et al. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: a case-control study. Chest. 2008;133(1):56-61.
  9. Thomas EJ, Taggart B, Crandell S, et al. Teaching teamwork during the Neonatal Resuscitation Program: a randomized trial. J Perinatol. 2007;27(7):409-414.
  10. Gaba DM. The pharmaceutical analogy for simulation: a policy perspective. Simul Healthc. 2010;5(1):5-7.
