
A randomized controlled trial of simulation training in teaching coronary angiographic views

Abstract

Introduction

Simulation technology has an established role in teaching technical skills to cardiology fellows, but its impact on teaching trainees to interpret coronary angiographic (CA) images has not been systematically studied. The aim of this randomized controlled study was to test whether structured simulation training, in addition to traditional methods, would improve CA image interpretation skills in a heterogeneous group of medical trainees.

Methods

We prospectively randomized a convenience sample of 105 subjects comprising medical students (N = 20), residents (N = 68) and fellows (N = 17) from the University of Arizona. Subjects were randomized in a stratified fashion into a simulation training group, which received simulation training in addition to didactic teaching (n = 53), and a control training group, which received didactic teaching alone (n = 52). The change from pre- to post-test score (delta score) was analyzed by two-way ANOVA for education status and training arm.

Results

Subjects improved in their post-test scores with a mean change of 4.6 ± 4.0 points. Subjects in the simulation training arm had a higher delta score than controls (5.4 ± 4.2 versus 3.8 ± 3.7, p = 0.04), with the greatest impact among residents (6.6 ± 4.0 versus 3.5 ± 3.4; p = 0.02 for the interaction of training arm and education status).

Conclusions

Simulation training complements traditional methods to improve CA interpretation skill, with the greatest impact on residents. This highlights the importance of incorporating high-fidelity simulation training early in cardiovascular fellowship curricula.


Introduction

Clinical educators are constantly innovating to address the challenge of effectively teaching students with ever increasing time constraints, medical complexity, and performance evaluation expectations [1]. In recent years, technology-enhanced simulation training in the health sciences has been shown to have large effects on outcomes of knowledge, skills and behaviors, with modest effects on patient-related outcomes, in comparison with traditional learning [2,3,4]. The Accreditation Council for Graduate Medical Education (ACGME) has therefore recommended simulation training, with mandatory fellow participation, as a condition of accreditation for general cardiovascular and all cardiovascular sub-specialty programs. However, the exact methodology, implementation and educational objectives for simulation remain undefined [5, 6].

In the cardiac catheterization laboratory, several small randomized and non-randomized trials have shown that simulation training improves global and technical performance scores encompassing multiple elements of performance in diagnostic coronary angiography [7, 8], percutaneous coronary intervention [9, 10] and trans-septal puncture [11]. Mentored simulation training may also have the potential to reduce procedural errors [7, 8, 12], contrast use, fluoroscopy and total procedure times [13,14,15]. Small randomized controlled studies have also demonstrated its potential role in percutaneous coronary intervention training [9, 13, 16]. There have, however, also been studies reporting contradictory findings, primarily regarding clinical benefit, casting doubt on the overall advantages of simulation and on its use as a tool for assessing competency [10, 17, 18].

We therefore designed a randomized study, focused on early trainees, to examine whether simulation (manipulating a virtual fluoroscopy C-arm with continuous coronary anatomy overlay visualization), added to a traditional method of learning via a lecture on coronary angiographic views, improves the achievement of a beginner competency in correct coronary angiographic view interpretation. A beginner competency was defined in this study as the ability to correctly identify at least one third of angiographic views. In contrast to prior studies, we decided to focus on a single, important, granular competency with which many early trainees traditionally struggle, and to utilize a function of endovascular simulation that has not been previously well studied: the representation of 3-dimensional virtual anatomy as a teaching tool in addition to standard 2-dimensional simulated fluoroscopic imaging. We hypothesized that the greatest impact of the protocol would depend on the level of training and therefore chose to study a convenience sample of medical students, residents and fellows, with a particular emphasis on medical students and residents with little or no prior exposure to the cardiac catheterization laboratory.

Methods

Study population

One hundred and five volunteer trainees, comprising a spectrum of medical students in their third and fourth years of training, internal medicine residents, family medicine residents, emergency medicine residents, and cardiovascular fellows from the University of Arizona, met eligibility for the study and were consecutively enrolled from August 2016 to March 2017. Inclusion criteria: 3rd or 4th year medical students, residents in internal medicine, family medicine or emergency medicine, or cardiology fellows at any stage of training at the University of Arizona who provided consent and were willing to participate and complete the training session as well as the pre- and post-test. Exclusion criteria: trainees outside the defined levels of training and those who were unable to commit to the training session or testing.

Ethics

The study protocol was in compliance with the Declaration of Helsinki and was approved by the University of Arizona Human Subject Protection Program Institutional Review Board. All participants were briefed individually at the time of enrollment and they provided written informed consent.

Study design

At study entry, all participants filled out an online survey providing demographic data and baseline self-reported visuo-spatial skills, and were assigned a unique identification number. Next, they were administered a self-paced pre-intervention online test (pre-test) (see “Pre-test Coronary Angiographic Training Study”, Supplemental File). This test comprised 30 de-identified still images and video clips of real angiographic films (looping automatically every 5 s) in a multiple-choice format. Three questions tested for correct identification of each of the three coronary arteries, with three multiple-choice question (MCQ) options. Twenty-seven questions tested for correct identification of the artery and projection, with 9 MCQ options covering the six standard angiographic projections for the left coronary artery [left anterior oblique (LAO) caudal, LAO cranial, right anterior oblique (RAO) caudal, RAO cranial, anteroposterior (AP) caudal and AP cranial] and 3 for the right coronary artery (LAO cranial, AP cranial and RAO).

Study participants were then subjected to a stratified randomization process (Fig. 1) based on their education status (medical student, resident, fellow) and divided into two training arms: a simulation arm, which received simulation training in addition to didactic teaching (N = 53), and a control arm, which received didactic teaching alone (N = 52). Blinding was not performed. Convenience sampling was used.
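For illustration only, the following minimal Python sketch shows one way such a stratified allocation could be generated. It is not the authors' procedure; the function name, seed, and the tie-breaking rule (the simulation arm receiving the extra subject in odd-sized strata) are assumptions.

```python
# Hypothetical sketch: stratified randomization by education status into
# simulation vs. control arms. Not the authors' code; names and seed are assumed.
import random

def stratified_randomize(subjects, seed=2016):
    """subjects: list of (subject_id, education_status) tuples.
    Returns {subject_id: 'simulation' or 'control'}."""
    rng = random.Random(seed)
    strata = {}
    for subject_id, status in subjects:
        strata.setdefault(status, []).append(subject_id)
    assignment = {}
    for ids in strata.values():
        rng.shuffle(ids)
        cutoff = (len(ids) + 1) // 2  # simulation gets the extra subject in odd strata
        for i, subject_id in enumerate(ids):
            assignment[subject_id] = "simulation" if i < cutoff else "control"
    return assignment

# Example with the study's group sizes: 20 students, 68 residents, 17 fellows
subjects = ([(f"MS{i}", "student") for i in range(20)]
            + [(f"R{i}", "resident") for i in range(68)]
            + [(f"F{i}", "fellow") for i in range(17)])
arms = stratified_randomize(subjects)
print(sum(arm == "simulation" for arm in arms.values()))  # 53 simulation, 52 control
```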

Fig. 1

Flow diagram of simulation versus control to teach coronary angiographic interpretation skills. Subjects (medical students, residents, and fellows) were randomly assigned to simulation arm (mentored simulation training using a dedicated simulator with two-dimensional and three-dimensional virtual anatomic views and didactic teaching) versus control (didactic teaching alone with no simulation). Subjects underwent testing before and after training

Simulation interface

The Mentice VIST®-C (Gothenburg, Sweden) is a portable, high-fidelity endovascular simulator, with good construct and concurrent validity, used in catheter training for coronary and peripheral interventions. The device is connected to a monitor and a laptop running the simulation software (VIST-8). The rest of the interface comprises a dual foot switch for fluoroscopy and cine-angiography, and a syringe for simulated contrast injection (Fig. 1). The simulator includes pre-designed coronary cases with angiographic data, and a single standard case with normal coronary anatomy was selected and used uniformly throughout the study. Additionally, the simulator has buttons and joysticks that enable the operator to virtually move the patient table and the C-arm to switch between angiographic projections, akin to a real catheterization laboratory set-up. In addition to simulated standard fluoroscopic imaging, the user is able to switch to 3D visualized virtual anatomy, which overlays the coronary anatomy on the fluoroscopic image.

Simulation training

Simulation training consisted of one-on-one mentored training on the simulator for 10–40 min (median: 20 min). The duration of training was decided by the subjects themselves based on their learning pace. At the start of the exercise, they were instructed in person by a trained operator, with the aid of a standardized instruction guide, to manipulate the table and the C-arm, acquire standard angiographic projections, and recognize dynamic changes in the orientation of the coronary vessels in each of these projections, both in 2D and 3D modes. An instruction manual with conceptual checkpoints and the trained operator were available to the participants throughout the session if needed.

Didactic teaching

Didactic teaching consisted of printed instruction material and a standardized ten-minute online video tutorial by Morton Kern, MD from his DVD “Cath Lab Essentials with Dr. Morton Kern” on identifying coronary angiographic projections [19]. Subjects could take notes during this session if they desired.

Evaluation of post-training performance

At the end of the training session, trainees were administered the online post-test, which comprised the same 30 projections as the pre-test but in randomly shuffled order; the test was again self-paced. All pre- and post-test data obtained, including test performance scores, were archived in an encrypted online folder.

Statistical analysis

Statistical analysis was performed using STATA version 14.2 (StataCorp., College Station, Texas). All data in this manuscript are represented as means for parametric continuous variables and as proportions and percentages for categorical variables. Baseline characteristics (Table 1) include a 95% confidence interval for mean age (normal) and for proportions of binomial variables (Wald). Two-way ANOVA was used to analyze the effects of training arm and education status (independent variables), and the interaction between these two factors, on the change in test score (delta score). With only two time point measures (pre and post), repeated measures ANOVA was not needed. A two-sided p value ≤ 0.05 was considered significant.
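As an illustration of this analysis (a minimal sketch, not the authors' Stata code), the same two-way ANOVA with an interaction term could be run in Python with statsmodels; the file name and column names below are assumptions.

```python
# Hypothetical sketch of the two-way ANOVA on the delta score.
# Assumed columns: arm ('simulation'/'control'),
# education ('student'/'resident'/'fellow'), pre_score, post_score.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("angio_scores.csv")              # assumed file name
df["delta"] = df["post_score"] - df["pre_score"]  # change from pre- to post-test

# Main effects of training arm and education status plus their interaction
model = ols("delta ~ C(arm) * C(education)", data=df).fit()
anova_table = sm.stats.anova_lm(model, typ=2)     # Type II sums of squares
print(anova_table)                                # two-sided p <= 0.05 considered significant
```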

Table 1 Baseline characteristics

Results

Study population characteristics

A total of 105 subjects were enrolled in a convenience sample, comprising N = 20 medical students, N = 68 residents and N = 17 cardiology fellows. Participants were predominantly male (59%) with a mean age of 30 ± 4 years. Among the 68 residents, 55 were from internal medicine, 6 from family medicine and 7 from emergency medicine. A majority of participants (84%) self-reported routinely playing a musical instrument, a competitive sport, or video games over the past year. In terms of handedness, the majority were right-handed (89%), with a small number reporting mixed handedness or being ambidextrous (8%). Prior to the study, 89% of participants graded themselves as a novice or a beginner in coronary angiographic training, of whom 96% stated that they were not confident in accurately identifying coronary anatomy on an angiographic image. There were no significant differences in these baseline characteristics between the simulation and control arms (Table 1). After completion of the training experience, 50% rated the lecture as good or excellent. Among those randomized to simulation, 85% graded their satisfaction with the simulation experience as good or excellent and 98% agreed or strongly agreed that simulation training is essential in cardiovascular fellowship training. There were no adverse events.

Test performance scores

Out of a maximum score of 30, the pre-test score was 6.5 ± 2.5 for medical students, 6.5 ± 2.5 for residents, and 18.2 ± 4.1 for fellows (p < 0.0001 for the effect of education status). The change in score (delta score) from pre- to post-test is given in Table 2. Subjects improved in their post-test scores with a mean delta score of 4.6 ± 4.0, with subjects who underwent simulation training having a greater delta score (5.4 ± 4.2 vs 3.8 ± 3.7, p = 0.04). The effects of training arm (simulation versus control) and education group (medical student, resident, fellow), and their interaction, were further evaluated in a two-way ANOVA. Education status alone was not significant (p = 0.13). However, there was a significant interaction effect of training arm and education status (p = 0.02), such that residents derived the greatest benefit from simulation training compared to control (6.6 ± 4.0 versus 3.5 ± 3.4, Table 2). Pre- and post-test scores by training arm and education status are shown in Fig. 2.

Table 2 Difference in test score after control or simulation training according to education group
Fig. 2

Pre- and Post-test scores by education status and training group. Line plots of pre- and post-test scores for medical students (top row), residents (middle row) and fellows (bottom row), by control group (left panel) and simulation group (right panel). Mean values of pre- and post-test scores shown in red

The delta scores by angiographic view are shown as box plots in Supplemental Fig. 1. This figure gives the visual impression that simulation most improved the identification of the RAO projections of the right and left coronary arteries and of the LAO cranial projection of the right coronary artery.

Discussion

The results of this study demonstrate that one brief (median 20-min) training session using a high-fidelity three-dimensional simulation module, when used to supplement traditional lecture-based learning, can significantly accelerate early trainees’ attainment of the competency to correctly identify coronary angiographic projections. Although the effect was modest, the small time investment relative to the gain in knowledge is notable, as it can otherwise take weeks to months to reach the same level via traditional learning. Manipulating the C-arm in simulation allows the trainee to actively explore virtual three-dimensional coronary anatomy in real time, thereby facilitating construction of an internal mental anatomical model, developing hand–eye coordination skills, and improving confidence in troubleshooting technical challenges in a safe learning environment. The ability to continuously track the coronary arteries in these simulation training sessions is a distinct advantage for visual-spatial learning compared with the traditional, interrupted 2-dimensional representation of coronary anatomy between shots in real-world angiography.

Prior simulation-based studies have consistently shown the greatest impact in novice trainees [9, 16, 20, 21]. Correspondingly, the greatest improvement in our study was noted in early trainees, specifically residents, who best represent a new cardiology fellow with no prior catheterization laboratory experience. These findings further support the need for more studies to justify the adoption of a simulation curriculum early in undergraduate and graduate medical education programs [22].

The discrepancy between cardiology fellows and novice trainees is likely explained by the fellows’ previous attainment of the tested competency (basic anatomical identification on coronary angiograms) during their clinical training and experience. The study was conducted later in the academic year, and even our first-year cardiology fellows had already been exposed to coronary angiographic interpretation. Therefore, our described simulation training methodology should be integrated into a curriculum as part of introductory training. More broadly, our work demonstrates the importance of targeting a training protocol to the appropriate trainee. We speculate that cardiology fellows would best benefit from more advanced training protocols, such as teaching how to anticipate C-arm positioning to best visualize coronary anatomy.

To our knowledge, this is the first randomized controlled study to investigate the additive role of high-fidelity simulation training, beyond traditional methods, in teaching basic coronary angiographic view interpretation to junior physicians. A recent study from France by Fischer et al. [23] randomized 118 medical students to simulation versus traditional PowerPoint-based teaching. They reported that the simulation group did better at identifying coronary anatomy and coronary angiographic projections after a single simulation session. Although our main findings were similar, our study design differs. Firstly, we recruited a spectrum of trainees at various levels of clinical training, beyond medical students, to examine whether more experienced trainees would benefit. Next, in order to allow for different learning speeds and preferences, subjects in the simulation arm were provided one-on-one instruction at the start of the exercise and then allowed independent, unobserved practice time with no restriction on the amount of time spent on the simulator. Finally, since our subjects had varying amounts of exposure to clinical cardiology and familiarity with coronary angiography, we focused on improvement in test performance from baseline (pre-intervention to post-intervention delta scores) as our primary outcome rather than an isolated post-intervention score, as reported by Fischer et al. [23].

Limitations

There are limitations to our study inherent in our sample size and study design. Our results for fellows are likely affected by their small sample size and varied exposure to CA prior to the study. However, a study specific to cardiology fellows would require a multi-year, multi-center design, which would be limited by the general availability of coronary simulators. We did not perform a quantitative assessment of the baseline visuospatial skills of our study participants, so their influence, if any, on the study outcome is unknown. We were also unable to explore the effect of a single structured simulation session on long-term retention, and the additive effect of periodic booster training sessions on knowledge acquisition and retention was not studied. It would have been interesting to administer a repeat assessment to see whether subjects in the control arm would have benefitted from crossing over to simulation training at the end of the study. This study was not blinded, but as the outcome measure was performance on a multiple-choice question test, the lack of blinding is unlikely to have caused bias. There was no sample size calculation based on a predetermined improvement in performance.

Conclusion

In conclusion, this study highlights the growing role of simulation-based medical education in cardiology beyond acquiring procedural skill in the cath lab. It suggests that even a single, brief, targeted training session can rapidly improve early trainees’ attainment of coronary angiographic projection competency. This study strengthens the case for developing a framework for learning and competency assessment using simulation. Further large studies are needed to justify the cost of implementing simulation-based programs as part of cardiovascular fellowship training and should focus on tailored simulation training methodology for achieving specific competencies. Despite the cost barriers of integrating simulation into training, widespread adoption will hopefully decrease cost through economies of scale.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

References

1. DaRosa DA, Skeff K, Friedland JA, et al. Barriers to effective teaching. Acad Med. 2011;86(4):453–9. https://doi.org/10.1097/ACM.0b013e31820defbe.

2. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg. 2002;236(4):458–63; discussion 63–4. https://doi.org/10.1097/01.SLA.0000028969.51489.B4.

3. McGaghie WC, Issenberg SB, Petrusa ER, et al. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44(1):50–63. https://doi.org/10.1111/j.1365-2923.2009.03547.x.

4. Au K, Lam D, Garg N, et al. Improving skills retention after advanced structured resuscitation training: a systematic review of randomized controlled trials. Resuscitation. 2019;138:284–96. https://doi.org/10.1016/j.resuscitation.2019.03.031.

5. King SB 3rd, Babb JD, Bates ER, et al. COCATS 4 task force 10: training in cardiac catheterization. J Am Coll Cardiol. 2015;65(17):1844–53. https://doi.org/10.1016/j.jacc.2015.03.026.

6. Green SM, Klein AJ, Pancholy S, et al. The current state of medical simulation in interventional cardiology: a clinical document from the Society for Cardiovascular Angiography and Intervention’s (SCAI) Simulation Committee. Catheter Cardiovasc Interv. 2014;83(1):37–46. https://doi.org/10.1002/ccd.25048.

7. Casey DB, Stewart D, Vidovich MI. Diagnostic coronary angiography: initial results of a simulation program. Cardiovasc Revasc Med. 2016;17(2):102–5. https://doi.org/10.1016/j.carrev.2015.12.010.

8. Schimmel DR, Sweis R, Cohen ER, et al. Targeting clinical outcomes: endovascular simulation improves diagnostic coronary angiography skills. Catheter Cardiovasc Interv. 2016;87(3):383–8. https://doi.org/10.1002/ccd.26089.

9. Bagai A, O’Brien S, Al Lawati H, et al. Mentored simulation training improves procedural skills in cardiac catheterization: a randomized, controlled pilot study. Circ Cardiovasc Interv. 2012;5(5):672–9. https://doi.org/10.1161/CIRCINTERVENTIONS.112.970772.

10. Jensen UJ, Jensen J, Olivecrona G, et al. The role of a simulator-based course in coronary angiography on performance in real life cath lab. BMC Med Educ. 2014;14:49. https://doi.org/10.1186/1472-6920-14-49.

11. De Ponti R, Marazzi R, Ghiringhelli S, et al. Superiority of simulator-based training compared with conventional training methodologies in the performance of transseptal catheterization. J Am Coll Cardiol. 2011;58(4):359–63. https://doi.org/10.1016/j.jacc.2011.02.063.

12. Gurm HS, Sanz-Guerrero J, Johnson DD, et al. Using simulation for teaching femoral arterial access: a multicentric collaboration. Catheter Cardiovasc Interv. 2016;87(3):376–80. https://doi.org/10.1002/ccd.26256.

13. Jensen UJ, Jensen J, Ahlberg G, et al. Virtual reality training in coronary angiography and its transfer effect to real-life catheterisation lab. EuroIntervention. 2016;11(13):1503–10. https://doi.org/10.4244/EIJY15M06_05.

14. Popovic B, Pinelli S, Albuisson E, et al. The simulation training in coronary angiography and its impact on real life conduct in the catheterization laboratory. Am J Cardiol. 2019;123(8):1208–13. https://doi.org/10.1016/j.amjcard.2019.01.032.

15. Prenner SB, Wayne DB, Sweis RN, et al. Simulation-based education leads to decreased use of fluoroscopy in diagnostic coronary angiography. Catheter Cardiovasc Interv. 2018;91(6):1054–9. https://doi.org/10.1002/ccd.27203.

16. Voelker W, Petri N, Tonissen C, et al. Does simulation-based training improve procedural skills of beginners in interventional cardiology? A stratified randomized study. J Interv Cardiol. 2016;29(1):75–82. https://doi.org/10.1111/joic.12257.

17. Norman G, Dore K, Grierson L. The minimal relationship between simulation fidelity and transfer of learning. Med Educ. 2012;46(7):636–47. https://doi.org/10.1111/j.1365-2923.2012.04243.x.

18. Aeckersberg G, Gkremoutis A, Schmitz-Rixen T, et al. The relevance of low-fidelity virtual reality simulators compared with other learning methods in basic endovascular skills training. J Vasc Surg. 2019;69(1):227–35. https://doi.org/10.1016/j.jvs.2018.10.047.

19. Kern M. Cath Lab Essentials with Dr. Morton Kern. Park Bench Media; 2011.

20. Gallagher AG, Ritter EM, Champion H, et al. Virtual reality simulation for the operating room: proficiency-based training as a paradigm shift in surgical skills training. Ann Surg. 2005;241(2):364–72.

21. Hsu JH, Younan D, Pandalai S, et al. Use of computer simulation for determining endovascular skill levels in a carotid stenting model. J Vasc Surg. 2004;40(6):1118–25. https://doi.org/10.1016/j.jvs.2004.08.026.

22. Westerdahl DE. The necessity of high-fidelity simulation in cardiology training programs. J Am Coll Cardiol. 2016;67(11):1375–8. https://doi.org/10.1016/j.jacc.2016.02.004.

23. Fischer Q, Sbissa Y, Nhan P, et al. Use of simulator-based teaching to improve medical students’ knowledge and competencies: randomized controlled trial. J Med Internet Res. 2018;20(9):e261. https://doi.org/10.2196/jmir.9634.


Acknowledgements

The authors would like to thank Dr. Morton Kern for allowing us to use excerpts from his education video series titled “Cath Lab Essentials with Dr. Morton Kern” in our study. We also thank Jennifer Tsang, DO, and Jonida Krate, MD, for their assistance with recruitment of participants for the study.

Funding

None.

Author information


Contributions

Conception and design: KSL. Data acquisition: BN, WXW, WY, SK, IN. Data interpretation: KSL, KBK, DA, DF, OH. Drafting and revision: WV, JHI, JZL, KSL. Of note, the manuscript submitted is original, with no portion under simultaneous consideration for publication elsewhere or previously published. Only those who made important contributions to the study and are thoroughly familiar with the primary data are included as authors. All authors are responsible for the contents and have read and approved the manuscript for submission to BMC Medical Education.

Corresponding author

Correspondence to Julia H. Indik.

Ethics declarations

Consent for publication

Not applicable.

Competing interests

Dr. Kwan Lee discloses he is a consultant for Mentice. The other authors have no conflicts.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Work attributed to: Department of Medicine, University of Arizona, Tucson, Arizona.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Lee, K.S., Natarajan, B., Wong, W.X. et al. A randomized controlled trial of simulation training in teaching coronary angiographic views. BMC Med Educ 22, 644 (2022). https://doi.org/10.1186/s12909-022-03705-z

