Implementation of a large-scale simulation-based cardiovascular clinical examination course for undergraduate medical students – a pilot study

Abstract

Background

We report the implementation of a large-scale simulation-based cardiovascular diagnostics course for undergraduate medical students.

Methods

A simulation-based course was integrated into the curriculum of second-year medical students (> 400 students/year). The first session aimed at teaching cardiac auscultation skills on mannequins and the second at teaching blood pressure measurement, peripheral arterial examination, and the clinical examination of heart failure in a technical skill-based manner and in a scenario.

Results

A total of 414 (99.8%) and 402 (98.5%) students, as well as 102 and 104 educators, participated during the 2016–2017 and 2017–2018 academic years across both types of sessions. The proportion of positive ratings by students was high and improved from the first to the second year (session 1: 77% vs. 98%; session 2: 89% vs. 98%; p < 0.0001). Similar results were observed for educators (session 1: 84% vs. 98%, p = 0.007; session 2: 82% vs. 98%, p = 0.01). Student feedback was positive regarding the usefulness of the course, fulfillment of pedagogical objectives, quality of the teaching method, time management, and educator-student interactivity. In contrast, 95% of students criticized the quality of the mannequins during the first year, leading to the replacement of the simulation material the following year. Students most appreciated the auscultation workshop (25%), the practical aspect of the course (22%), and the availability of educators (21%).

Conclusions

Despite the need to commit significant human and material resources, the implementation of this large-scale program involving > 400 students/year was feasible, and students and educators reacted favorably.

Background

The first year of medical studies in France is exclusively theoretical, and students first reach a real patient’s bedside during the second year. At our university, the teaching of clinical examination to second-year students was previously based exclusively on theoretical lectures and short bedside teaching sessions of 3 h per week in small groups. As the time allocated to bedside teaching had markedly declined over the preceding years at our university, as in most institutions worldwide [1,2,3,4], developing alternative teaching methods had become essential. We therefore implemented a simulation-based education program, and a dedicated simulation center was created to promote simulation in healthcare education.

Simulation-based medical teaching has rapidly expanded over the past decade in the light of several studies that have consistently demonstrated its effectiveness in improving students’ skills and performance [5,6,7,8,9,10,11,12]. Although several studies have shown the benefits of such educational programs, large-scale implementations have seldom been reported [6,7,8, 13, 14], mainly because such programs represent a considerable investment, both in terms of financial and human resources.

In this article, we seek to describe the implementation and demonstrate the feasibility of a large-scale, compulsory, simulation-based cardiovascular diagnostics course for undergraduate medical students. Our main objective was to assess whether level 1 of Kirkpatrick’s model (evaluation of the degree of favorable reactions to learning events by participants) [15, 16] was reached by analyzing the perceptions of educators and undergraduate medical students.

Methods

Simulation-based cardiovascular diagnostics course description

The course was divided into 2 consecutive compulsory sessions of 75 min each, aiming to teach basic cardiovascular clinical examination skills to just over 400 students, and took place at the iLumens Paris-Diderot platform (Université de Paris, Paris, France), which is dedicated to simulation-based teaching. Our experience over 2 consecutive years is reported in the present study. Prior to the beginning of the course, a 10-min oral presentation explained the educational objectives, followed by a 10-min video showing each step of the cardiovascular clinical examination in detail and reviewing basic anatomy and physiology concepts (link to the video: https://youtu.be/MhfiDq2XePU). The objectives of the first session were to teach the examination and palpation of the precordium, followed by heart auscultation on mannequins using an electronic stethoscope, including normal and pathological heart sounds (Lifeform® Auscultation Trainer and Smartscope®). At the end of this session, 5 short clinical scenarios on previously taught valvular heart diseases (aortic stenosis, aortic regurgitation, and mitral regurgitation) were used to evaluate students’ understanding. Each student was individually exposed to all 5 cases. These scenarios consisted of brief, typical case presentations after which students had to use the stethoscope on the mannequins to diagnose and characterize valvular conditions. All scenarios were developed by cardiologists and approved by expert members of the pedagogical committee. The second session aimed to teach blood pressure measurement using a manual sphygmomanometer (Welch Allyn®), peripheral arterial auscultation as well as pulse localization and palpation, and finally the clinical examination of heart failure, both as a technical skill and in a scenario involving communication with other students acting as simulated patients.

All educators were junior attending physicians from various medical specialties, including non-cardiologists, provided with a detailed instruction manual describing how to use the mannequins, the educational objectives, and the duration of each step of the course. In addition, a senior cardiologist coordinating the education program and a technical supervisor demonstrated the operation of the simulation equipment to all educators before the course; both were present on site during the sessions. Students were divided into small groups of no more than 4 students per educator in order to facilitate educator-student interaction and to maximize the time spent practicing with the teaching equipment. A total of 16 independent rooms were available at the simulation platform, so 16 educators and up to 64 students were present simultaneously. Two administrative staff members were actively involved in the organization of the module, including session scheduling, student registration, and student orientation at the platform. A total of 8 mannequins dedicated to cardiac auscultation were used: half of the students attended the first session, which involved auscultation of mannequins, while the other half simultaneously attended the second session, which did not require auscultation mannequins. After 75 min, all students and teachers changed rooms to complete the other session.

Students’ and educators’ perception evaluation tools

In order to assess the relevance of the teaching program, we used Kirkpatrick’s model of evaluation, which is divided into 4 levels: level 1, evaluation of the degree of favorable reactions to learning events by participants; level 2, evaluation of knowledge acquisition (level 2a, attitudes/perceptions; level 2b, knowledge/skills); level 3, evaluation of behavioral changes and of the extent to which participants apply what they have learned during the training; and level 4, overall impact of training (level 4a, organizational practice; level 4b, student benefit; level 4c, patient benefit) [15, 16]. In the present study, we specifically evaluated whether the first level of Kirkpatrick’s model was reached. For this purpose, students and educators were asked to complete a course evaluation. The questionnaire, approved by the pedagogical committee of Paris-Diderot University, was mandatory and anonymous, and could be completed on a printed document or on any mobile device. The students’ evaluation consisted of 6 questions, each rated on a score ranging from 0 to 3 (0 = poor, 1 = fair, 2 = good, 3 = excellent). These questions were ordered as follows: 1) usefulness of the course, 2) fulfillment of pedagogical objectives, 3) quality of the teaching method, 4) time management during the session, 5) educator-student interactivity and relationship, and 6) quality of the simulation equipment. In addition, a free response area was provided so students could specify the most and least useful aspects of the course and make suggestions for improvement. The educators’ survey consisted of 1 question assessing overall satisfaction with the course on the same 0–3 score. Feedback was considered positive in case of a score ≥ 2.
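The scoring rule described above can be sketched as follows. This is a hypothetical illustration on made-up ratings, not the authors' actual analysis code or dataset; the item names are invented labels for the six questions.

```python
# Minimal sketch of the survey scoring scheme: six items rated 0-3
# (0 = poor ... 3 = excellent); a student's feedback counts as positive
# when the overall (mean) score is >= 2. Data and names are illustrative.
from statistics import mean

ITEMS = [
    "usefulness", "objectives", "teaching_method",
    "time_management", "interactivity", "equipment",
]

def overall_score(ratings: dict) -> float:
    """Average of the six individual item scores."""
    return mean(ratings[item] for item in ITEMS)

def is_positive(ratings: dict) -> bool:
    """Positive feedback is defined as an overall score >= 2."""
    return overall_score(ratings) >= 2

# Example: one simulated student response
example = {"usefulness": 3, "objectives": 2, "teaching_method": 2,
           "time_management": 1, "interactivity": 3, "equipment": 1}
print(overall_score(example))  # 2.0
print(is_positive(example))    # True
```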

Statistical analysis

Continuous variables are expressed as mean ± standard deviation, and categorical variables as numbers (percentages). The Shapiro-Wilk test was used to assess the normality of distributions. As continuous variables were not normally distributed in the present study, the Wilcoxon test was used for comparisons between groups. The χ2 test was used for comparisons of categorical variables. The total score presented in the students’ survey is defined as the average (mean ± standard deviation) of the 6 individual questions. The analysis of the free response area is presented as the percentage of students expressing the same opinion. The free response areas of the questionnaire were analyzed by 2 independent observers, and opinions expressed by more than 20% of the study population were reported. A p value < 0.05 was considered statistically significant. Statistical analyses were performed using JMP V.10 software (SAS Institute, Cary, North Carolina, USA).
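For readers who wish to reproduce this kind of analysis outside JMP, the pipeline above can be sketched with SciPy. The numbers below are simulated, not the study data, and the rank-sum comparison is shown in its Mann-Whitney U form (the two-independent-samples version of the Wilcoxon test).

```python
# Illustrative sketch of the described analysis on simulated data:
# Shapiro-Wilk for normality, a rank-based test for two independent
# groups, and chi-square for a 2x2 table of positive vs. non-positive
# ratings. Counts and scores are made up for demonstration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
scores_y1 = rng.integers(0, 4, size=100)  # simulated 0-3 ratings, year 1
scores_y2 = rng.integers(1, 4, size=100)  # simulated 0-3 ratings, year 2

# Normality check: a small p-value rejects the normality hypothesis,
# motivating a non-parametric comparison.
_, p_norm = stats.shapiro(scores_y1)

# Rank-based comparison of the two independent year groups
_, p_rank = stats.mannwhitneyu(scores_y1, scores_y2)

# Chi-square on hypothetical positive/non-positive counts per year
table = np.array([[77, 23],   # year 1: 77 positive, 23 not
                  [98, 2]])   # year 2: 98 positive, 2 not
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)
print(p_norm, p_rank, p_chi2)
```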

Results

During the 2016–2017 and 2017–2018 academic years, 415 and 408 second-year undergraduate medical students were enrolled at Paris-Diderot University, of whom 414 (99.8%) and 402 (98.5%), as well as 102 and 104 educators, participated in the simulation-based education program, respectively. Covering all learners required 28 h of simulation-platform use per year, with 16 educators simultaneously present on site, each supervising a small group of no more than 4 students in a dedicated room. The survey was completed by 379 (92%) and 343 (85%) students, as well as 85 (83%) and 104 (100%) educators, during the 2016–2017 and 2017–2018 academic years, respectively. Therefore, a total of 722 surveys completed by students and 189 surveys completed by educators were analyzed in the present study.

Students’ overall appreciation improved from the 2016–2017 to the 2017–2018 academic year (Fig. 1). The first session was rated positively (overall score ≥ 2) by 276 (77%) and 336 (98%) students during the 2016–2017 and 2017–2018 academic years, respectively (p < 0.0001). Similarly, the number of positive ratings for the second session improved from 325 (89%) to 337 (98%) (p < 0.0001). Educators’ overall satisfaction with the course also improved, from 37 (84%) positive ratings to 55 (98%) for the first session (p = 0.007) and from 34 (82%) to 47 (98%) for the second session (p = 0.01) (Fig. 2).

Fig. 1 Proportion of overall positive feedback from students for both sessions during the 2016–2017 and 2017–2018 academic years

Fig. 2 Proportion of overall positive feedback from educators for both sessions during the 2016–2017 and 2017–2018 academic years

Feedback from students was positive in both years and for both sessions regarding the usefulness of the course, fulfillment of pedagogical objectives, quality of the teaching method, time management, and educator-student interactivity and relationship (all mean scores ≥ 2). In contrast, the quality of the simulation equipment was considered poor during the first year and improved the following year (1.43 ± 0.78 vs. 2.11 ± 0.68, respectively; p < 0.0001). Students’ and educators’ feedback on the course is presented in Table 1.

Table 1 Students’ and educators’ feedback on the course

The analysis of the free response areas for both years and sessions revealed that students most appreciated the auscultation workshop (25%), emphasizing that it allowed them to study and compare normal as well as abnormal heart sounds. Students also acknowledged the practical aspect of the course (22%), with a particular interest in the opportunity to use mannequins for auscultation, measure blood pressure on each other, and work through short scenarios. Finally, the availability of educators was appreciated (21%), as students highlighted the benefits of working in small groups (25%), which allowed questions to be asked more easily and facilitated educator-student as well as student-student interactivity, in a calmer setting than the hospital. During the first year of implementation, the quality of the simulation material was criticized and considered the worst aspect of the course by 360 students (95%), mainly because of technical issues. The following year, the number of students criticizing the simulation material dropped to 116 (34%).

Discussion

In this study, we report an original experience of implementing a large-scale simulation-based cardiovascular diagnostics course for undergraduate medical students. Despite the participation of more than 400 students each year, the implementation proved feasible and successful. We observed a significant improvement from the first to the second year, as evidenced by the positive feedback of both students and educators.

A major issue encountered during the first year of implementation was the poor quality of the simulation mannequins, which represented by far the main complaint of students and educators. Their heart sounds differed from reality and were not in line with the national curriculum or the theoretical objectives of the local second-year program. As a consequence, the mannequins were fully refunded and replaced the following year with equipment from an alternative supplier (SAM Basic® with a SimScope® stethoscope, Cardionics, Texas, USA). The new mannequins were tested before purchase and offered a wider range of customizable heart sounds whose characteristics were much closer to reality, leading to a significant improvement in the feedback collected. This point emphasizes the importance of thorough selection of simulation material.

During the first year, the teaching sessions were scheduled mid-year, after the students had started learning the clinical examination by examining real patients in wards. Although the appreciation of students was overall positive, many of them mentioned that the simulation-based teaching program would have been much more useful if it had been scheduled before reaching a real patient’s bedside. Consequently, the sessions were scheduled at the beginning of the following year. A large proportion of students appreciated the possibility to repeat each step of the clinical examination at their own pace without feeling the pressure of facing a real patient, allowing them to gain confidence and acquire knowledge.

Despite the necessity to commit substantial resources and the poor quality of the mannequins during the first year of implementation, the overall level of satisfaction of students and educators remained high, suggesting a positive impact of the student-educator relationship during the training. Indeed, most educators were very enthusiastic about having dedicated teaching time with a restricted number of students, without having to manage multiple non-educational tasks concomitantly, as is often the case during traditional teaching sessions in hospital wards. In turn, students appreciated the opportunity to benefit from the full attention of their educators and emphasized that they could ask questions freely. Thus, the proposed format encouraged the active involvement of all students throughout each step of the course, as they were constantly questioned on the cardiovascular clinical examination. Small-group discussion and repeated auscultation of simulated heart sounds have previously been reported to improve auscultation proficiency [1, 17].

Although similar teaching programs have been described in the literature [7, 18], the implementation of such a large-scale program has seldom been reported. We showed that developing such a program is feasible and widely appreciated by students as well as educators. Despite the need to commit significant human and material resources to carry out this educational project, students reacted favorably to the training sessions. Consequently, it can be concluded that Kirkpatrick’s level 1 was reached [15, 16, 19], encouraging us to continue and improve this educational program.

Implementing a simulation-based education program for such a large number of students is a time-consuming and difficult task. The development of a dedicated platform is a key success factor, as it brings together educators, students, and technical and administrative staff in a single facility specially designed to facilitate the use of simulation equipment and the constitution of small groups of students. The work accomplished by the administrative staff is of paramount importance for scheduling sessions and coordinating the presence of students and educators. In addition, the presence of competent technical staff on site is crucial to immediately correct major and minor technical issues, so that educators can dedicate all their time to teaching.

Finally, it is worth emphasizing that overall appreciation was high and improved from one year to the next. This improvement may not only be related to the replacement of the simulation material but may also reflect the learning curve of educators, who gained experience over time in using the new educational tools.

This study has several limitations [20]. First, the quality of the education program was assessed on the basis of educators’ and students’ opinions, and students’ cardiovascular knowledge was not evaluated [21]. However, the objective of this pilot study was specifically to assess whether level 1 of Kirkpatrick’s model was reached. Second, the evaluation of educators’ opinions was based on a single question. Third, moving the sessions from mid-year to the beginning of the year may have positively impacted the survey results. Moreover, the implementation of a new educational activity may be associated with increased enthusiasm from both educators and students, which may also have influenced the results. Lastly, although the implementation of the simulation-based education program was a success, the relatively high cost of the simulation material and of creating such a platform, as well as the need to recruit a large number of educators, may limit the generalization of this teaching approach.

Conclusions

Despite the need to commit significant human and material resources to carry out this large-scale educational project, it was feasible, the students reacted favorably, and Kirkpatrick’s level 1 was reached. Further studies evaluating students’ skills acquisition following the teaching program are necessary to confirm the benefits of this method.

Availability of data and materials

The datasets generated and/or analyzed during the current study are available from the corresponding author on request.

References

  1. Gale CP, Gale RP. Is bedside teaching in cardiology necessary for the undergraduate education of medical students? Med Educ. 2006;40:11–3. https://doi.org/10.1111/j.1365-2929.2005.02357.x.
  2. Reichsman F, Browning FE, Hinshaw JR. Observations of undergraduate clinical teaching in action. J Med Educ. 1964;39:147–63.
  3. Collins GF, Cassie JM, Daggett CJ. The role of the attending physician in clinical training. J Med Educ. 1978;53:429–31.
  4. Ahmed ME-BK. What is happening to bedside clinical teaching? Med Educ. 2002;36:1185–8.
  5. Issenberg SB, McGaghie WC, Hart IR, Mayer JW, Felner JM, Petrusa ER, et al. Simulation technology for health care professional skills training and assessment. JAMA. 1999;282:861–6.
  6. Issenberg SB, Petrusa ER, McGaghie WC, Felner JM, Waugh RA, Nash IS, et al. Effectiveness of a computer-based system to teach bedside cardiology. Acad Med. 1999;74:S93–5.
  7. Ewy GA, Felner JM, Juul D, Mayer JW, Sajid AW, Waugh RA. Test of a cardiology patient simulator with students in fourth-year electives. J Med Educ. 1987;62:738–43.
  8. Butter J, McGaghie WC, Cohen ER, Kaye M, Wayne DB. Simulation-based mastery learning improves cardiac auscultation skills in medical students. J Gen Intern Med. 2010;25:780–5. https://doi.org/10.1007/s11606-010-1309-x.
  9. Norman G, Dore K, Grierson L. The minimal relationship between simulation fidelity and transfer of learning. Med Educ. 2012;46:636–47. https://doi.org/10.1111/j.1365-2923.2012.04243.x.
  10. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44:50–63. https://doi.org/10.1111/j.1365-2923.2009.03547.x.
  11. Weller JM, Nestel D, Marshall SD, Brooks PM, Conn JJ. Simulation in clinical teaching and learning. Med J Aust. 2012;196:594.
  12. Gosai J, Purva M, Gunn J. Simulation in cardiology: state of the art. Eur Heart J. 2015;36:777–83. https://doi.org/10.1093/eurheartj/ehu527.
  13. Favrat B, Pécoud A, Jaussi A. Teaching cardiac auscultation to trainees in internal medicine and family practice: does it work? BMC Med Educ. 2004;4:5. https://doi.org/10.1186/1472-6920-4-5.
  14. Swamy M, Sawdon M, Chaytor A, Cox D, Barbaro-Brown J, McLachlan J. A study to investigate the effectiveness of SimMan® as an adjunct in teaching preclinical skills to medical students. BMC Med Educ. 2014;14:231. https://doi.org/10.1186/1472-6920-14-231.
  15. Yardley S, Dornan T. Kirkpatrick’s levels and education “evidence”. Med Educ. 2012;46:97–106. https://doi.org/10.1111/j.1365-2923.2011.04076.x.
  16. Piryani RM, Dhungana GP, Piryani S, Sharma Neupane M. Evaluation of teachers training workshop at Kirkpatrick level 1 using retro-pre questionnaire. Adv Med Educ Pract. 2018;9:453–7. https://doi.org/10.2147/AMEP.S154166.
  17. Horiszny JA. Teaching cardiac auscultation using simulated heart sounds and small-group discussion. Fam Med. 2001;33:39–44.
  18. de Giovanni D, Roberts T, Norman G. Relative effectiveness of high- versus low-fidelity simulation in learning heart sounds. Med Educ. 2009;43:661–8. https://doi.org/10.1111/j.1365-2923.2009.03398.x.
  19. Leslie K, Baker L, Egan-Lee E, Esdaile M, Reeves S. Advancing faculty development in medical education: a systematic review. Acad Med. 2013;88:1038–45. https://doi.org/10.1097/ACM.0b013e318294fd29.
  20. Eva KW. Broadening the debate about quality in medical education research. Med Educ. 2009;43:294–6. https://doi.org/10.1111/j.1365-2923.2009.03342.x.
  21. Boulet JR, Durning SJ. What we measure … and what we should measure in medical education. Med Educ. 2019;53:86–94. https://doi.org/10.1111/medu.13652.

Acknowledgements

We sincerely thank Professor Guillaume Alinier for reviewing our manuscript.

Funding

None.

Author information

DA, JA, JG, PFC, PP, VDL, AL, and AF made substantial contributions to the conception and design of the work; DA, JA, JG, IE, SAR, VDL, AL, and AF made substantial contributions to the acquisition, analysis, and interpretation of data; DA, JA, PR, VDL, AL, and AF drafted the work or substantively revised it. DA, JA, JG, PFC, SAR, IE, PR, PP, VDL, AL, and AF approved the submitted version (and any substantially modified version involving their contribution to the study) and agreed both to be personally accountable for their own contributions and to ensure that questions related to the accuracy or integrity of any part of the work, even parts in which they were not personally involved, are appropriately investigated, resolved, and the resolution documented in the literature.

Correspondence to Dimitri Arangalage.

Ethics declarations

Ethics approval and consent to participate

The pedagogical committee of the University of Paris that deals with research authorizations and ethical considerations in the field of education has approved the study. Verbal consent was obtained from study participants and approved by the committee.

Consent for publication

Not applicable.

Competing interests

The authors have no conflicts of interest related to the present paper to declare.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Keywords

  • Clinical education
  • Mannequins
  • Simulation
  • Auscultation
  • Heart sounds