
Self-directed learning by video as a means to improve technical skills in surgery residents: a randomized controlled trial

Abstract

Background

With their demanding schedules, surgical residents have limited time to practice techniques. The aim of this study was to evaluate the pedagogic model of self-directed learning using video in surgery residents.

Methods

Informed consent was obtained from all the participants. A randomized controlled trial was conducted in 2018 at Hôpital Maisonneuve-Rosemont (University of Montreal). Participants were general surgery residents. There were 27 eligible residents; 22 completed the study. They were filmed performing an intestinal anastomosis on cadaveric pig bowel. The self-directed learning by video (SDL-V) group was given an expert video, which demonstrated the technique performed by an experienced surgeon. The control group continued with their regular duties. Three weeks later, participants performed a second filmed anastomosis. Two attending surgeons evaluated the residents’ filmed anastomoses using the Objective Structured Assessment of Technical Skills scale. After their second anastomosis, all participants had access to the expert video and completed a survey.

Results

Scores did not differ significantly between groups during the first (control: 23.6 (4.5) vs. SDL-V: 23.9 (4.5), p = 0.99, presented as mean (SD)) or second filmed anastomosis procedure (control: 27.1 (3.9) vs. SDL-V: 29.6 (3.4), p = 0.28). Both groups improved significantly from pre- to post-intervention (mean difference between the two anastomosis procedures, with 95% CI, for control: 3.5 [1.1; 5.9] and for SDL-V: 5.8 [3.4; 8.2]). Correlation between the evaluators’ scores was moderate (r = 0.6, 95% CI [0.3; 0.8]). The pass/fail global evaluation exhibited poor inter-rater reliability (Kappa: 0.105, 95% CI [−0.2; 0.4]). On the survey, all participants wanted more expert-made videos of specific surgical techniques.

Conclusions

Despite a higher final OSATS score for the intervention group, self-directed learning by video failed to produce a statistically significant difference on the overall OSATS scores between the two groups in this small cohort.


Background

Throughout their surgical training, residents must acquire both theoretical knowledge and technical skills. These skills derive from the psychomotor development of residents, which links cognition to movement [1]. Despite work hour restrictions, residents’ schedules remain extremely busy with both hospital and academic demands. This time-constrained schedule leaves limited time to practice surgical techniques. Thus, it is imperative to investigate optimal teaching methods for surgical residents.

Simulation is widely used in surgical residency programs. Notable examples include the “Fundamentals of Laparoscopic Surgery” (FLS) course [2], which evaluates basic laparoscopic competencies on trainer boxes, and institutions with laboratories that use animal models to learn and practice surgical procedures [3]. There are many known benefits of teaching through simulation, such as improved skill acquisition and increased safety in patient care [4, 5].

More recently, studies have investigated the use of video coaching in surgical training. During a coaching session, the novice will receive directed feedback from an expert in the specific field. These studies have shown improved technical skills in intestinal anastomosis on canine models [6] and also in the operating room [7].

Auto-didacticism refers to the process by which students organize their thoughts, behaviors and emotions to maximally reap rewards from a learning experience [8]. The learning benefit stems from the students’ motivation and autonomy. In this article, auto-regulation, auto-didacticism, and self-learning will be used interchangeably.

Two studies from the United Kingdom have compared the effects of personalized feedback from an expert versus a video made by an expert highlighting key points [9, 10]. In both cases, the medical students performed procedures (sutures, Foley catheter insertion, etc.) while being filmed. Students would then review their video with an expert or alone with a video made by an expert. The results showed that auto-didacticism was as beneficial as the presence of an expert for improving procedural skills. These studies, conducted with medical students and basic procedures, open the door to a potential novel teaching method that should be investigated with surgical residents.

With the proliferation and accessibility of new technologies, a wide range of e-learning tools have been developed to improve medical and surgical education [11]. These e-learning platforms span from online curricula to interactive modules. They offer multiple advantages, such as availability of the material independent of the presence of an instructor. Studies in surgical disciplines have looked into e-learning interventions to improve trainees’ knowledge acquisition and patient care. In gynecology, a blended learning program combining e-learning videos with face-to-face training modalities reduced the incidence of obstetric anal sphincter injuries [12]. Another study gave surgical residents access to internet-based interactive modules [13]. Residents had increased course scores after completing the module, and the tool was recommended by most who participated. Another group created instructional videos detailing laparoscopic suturing and knot tying [14]. While their findings did not demonstrate improved performance, they concluded that optimizing this method might lead to a useful teaching technique. A 2016 systematic review concluded that most surgical e-learning models are effective, but there remained significant heterogeneity in terms of application [15].

The goal of our study was to determine the efficacy of self-directed learning by video on the acquisition of a surgical procedure in surgical residents. The primary outcome was the impact of auto-didacticism on the Objective Structured Assessment of Technical Skills (OSATS) total score [16]. The secondary outcomes were the impact of auto-didacticism on the various sub-categories of the OSATS chart, the inter-rater reliability for the OSATS score, the inter-rater agreement for pass/fail, and lastly, the results of the survey.

Methods

Design, setting and participants

This project was approved by the “Comité d’Évaluation Scientifique en Santé Physique” (Scientific Evaluation Committee) affiliated with Hôpital Maisonneuve-Rosemont. Informed consent was obtained from all the participants. The inclusion criterion was being a surgical resident, post-graduate year 1 (PGY-1) to PGY-5, in the University of Montreal’s general surgery residency program who consented to participate. The exclusion criteria were not consenting or being unable to participate, due to scheduling issues, for instance. This study is a single-blinded randomized controlled trial with two arms: a self-directed learning by video group (SDL-V) and a control group (C). The study took place in 2018 at the Surgical Simulation Laboratory at Hôpital Maisonneuve-Rosemont, a hospital affiliated with the University of Montreal. All methods were carried out in accordance with relevant guidelines and regulations.

Intervention

Baseline assessment

Residents were asked to create an end-to-end, single-layer intestinal anastomosis with interrupted sutures on cadaveric pig bowel. The same surgical assistant was present for all the procedures to ensure uniformity. Participants were told to inform this person exactly how they needed to be assisted during the anastomoses; otherwise, the assistant did not take any initiative. This assistant also started and ended the recording, using a camera (SONY Handycam Exmor R) mounted on a tripod.

The intestines were harvested from pigs used in prior teaching modules (CPA 2016-NO-026, approved by Comité de Protection des Animaux du Centre de Recherche de l’Hôpital Maisonneuve-Rosemont). They were resected, divided into 15 cm long sections, and immediately frozen so that they could be thawed individually for each participant.

The Audiovisual Department of Hôpital Maisonneuve-Rosemont edited the recordings. The final videos had no sound, and repetitive tasks, such as knot tying, were removed.

Self-directed learning by video

After completing their baseline anastomosis, participants randomized to the intervention group received a Google Drive link (Mountain View, CA, USA) containing an expert video. This video was made prior to all baseline assessments; producing it took approximately 45 min. It demonstrates an end-to-end, single-layer intestinal anastomosis using interrupted sutures on pig bowel, performed by an experienced surgeon who practices at the teaching hospital. The video was finalized with voice-over narration by the same surgeon explaining key steps; its final duration was approximately 5 min. Residents were instructed to watch the video at their leisure and as many times as they wanted; they were also instructed not to share the video with other residents. Approximately 3 weeks after their baseline anastomosis, residents returned to the laboratory to perform the same procedure, which was also filmed.

Control group

After completing their baseline anastomosis, participants randomized to the control group did not have access to the expert video and were also asked not to view any videos online or elsewhere. They continued their regular clinical duties and returned to perform the same procedure 3 weeks after their baseline anastomosis. After completion of their second anastomosis, they received the Google Drive link, so they could also benefit from the expert video.

Residents, regardless of group, did not have access to the pig bowel between their baseline and second anastomosis to limit uneven skill acquisition through laboratory practice.

Outcomes measured

All participants performed two anastomoses during two separate sessions approximately 3 weeks apart. All the anastomoses were performed on a 15 cm thawed pig bowel that had been cut in half by the same assistant. The interventions were filmed in the same manner, such that only the gloved hands of the participants, with their reference number written on them, could be seen. A video bank was created on a separate Google Drive. The final videos had no sound.

The primary outcome was the total score of the Objective Structured Assessment of Technical Skills (OSATS) scale [16]. This operative performance rating scale contains the following seven categories: respect for tissues, time and motion, instrument handling, knowledge of instruments, flow of operation, use of assistants, and knowledge of specific procedure. Each category is rated from 1 (lowest) to 5 (highest), for a maximum score of 35. The evaluator then decides if the resident passed or failed the intervention; this is determined by the holistic impression and not by a cut-off score. Two experienced surgeons were given access to the video bank and scored each performance individually.

The OSATS scale has been validated for construct validity, internal consistency and inter-rater reliability for evaluating open surgical simulation procedures. Given that we had two evaluators, who independently reviewed all the videos, we also looked at the inter-rater consistency in our study.

Finally, two short surveys were created by the authors. They were sent to all the participants after they had completed the second filmed anastomosis; one was sent to the intervention group and the other to the control group. They were composed of questions with multiple choice answers (see additional file 1 for full surveys). They were designed to gauge the residents’ attitudes toward self-directed learning by video in the context of their surgical training.

Sample size

Given that the pool of possible participants was surgical residents at the University of Montreal, the study relied on a convenience sample. There were 27 eligible candidates, 5 of whom were excluded (due to scheduling or away-rotation reasons); thus, 22 residents participated in the study.

Randomization

After making their baseline anastomoses, participants were randomly allocated to either the control group or the intervention group. This was done using a 1:1 ratio and block randomization, stratified by the residents’ post-graduate year. Block size varied according to the number of residents per year who consented to participate. A sealed envelope method was used for the randomization process. The two surgeons who scored the videos were blinded to the randomization.
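The allocation procedure described above can be sketched in code. This is a minimal illustration only: the study used sealed envelopes rather than software, and the resident IDs and stratum sizes below are hypothetical.

```python
import random

def stratified_block_randomization(residents_by_pgy, seed=None):
    """Allocate residents 1:1 to 'control' or 'SDL-V', stratified by PGY.

    residents_by_pgy: dict mapping PGY level to a list of resident IDs.
    Within each stratum, one block the size of the stratum is shuffled,
    so the two arms stay balanced (within one) inside every PGY level.
    """
    rng = random.Random(seed)
    allocation = {}
    for pgy, residents in residents_by_pgy.items():
        n = len(residents)
        # Build a balanced block: half control, half intervention
        block = ["control"] * (n // 2) + ["SDL-V"] * (n - n // 2)
        rng.shuffle(block)
        for resident, arm in zip(residents, block):
            allocation[resident] = arm
    return allocation

# Hypothetical roster: four PGY-1 residents and two PGY-2 residents
alloc = stratified_block_randomization(
    {1: ["R01", "R02", "R03", "R04"], 2: ["R05", "R06"]}, seed=42
)
```

Because block size equals stratum size, the design guarantees near-equal arms within each post-graduate year, mirroring the stratification used in the study.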

Statistical analysis

The PGYs of the participants in the two groups were compared with the Chi-square test. Spearman’s correlation coefficient was used to analyze the association between the OSATS scores (total and various components) provided by the two evaluators. Cohen’s Kappa coefficient was used to determine the inter-rater reliability between the two evaluators in terms of pass/fail. The intervals between the two sets of anastomoses were compared between groups with the unpaired Student’s t-test. Repeated measures two-way ANOVA followed by post-hoc Sidak’s multiple comparisons test was performed to compare the OSATS scores (total and various categories) of the two sets of anastomoses within and between each group. A p < 0.05 was considered significant. Statistical analysis was done using PRISM 8.0 (GraphPad Software, La Jolla, CA). Since we used a convenience sample, we retrospectively calculated the size of the between-group difference in total OSATS scores that the experiment was powered to detect (statpages.info/postpowr.html). Unless stated otherwise, data are presented as mean (SD).
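Spearman’s coefficient, the rank-based measure used here for rater agreement, can be sketched in plain Python. This is illustrative only (the authors used PRISM), and the rater scores below are hypothetical; the midrank step handles tied scores the same way standard statistical packages do.

```python
def rank(xs):
    """Midrank assignment: tied values receive the average of their ranks."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank-transformed data."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

# Hypothetical total OSATS scores from two raters for six videos
rater1 = [23, 27, 30, 22, 25, 29]
rater2 = [24, 26, 31, 21, 27, 28]
rho = spearman(rater1, rater2)
```

Because rho is computed on ranks rather than raw scores, it captures whether the two raters ordered the residents similarly, which is the relevant question for inter-rater consistency.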

Results

There were 27 general surgery residents in our program. Two residents were unable to participate due to away rotations and 3 due to scheduling issues. Thus, 22 participants consented to the study, and 11 participants per group completed it (Fig. 1).

Fig. 1 Study flow chart

Baseline OSATS score and resident distribution per year were similar in both groups (Table 1).

Table 1 Baseline score, group distribution, and day interval between anastomoses. OSATS score and day interval presented as mean (SD). OSATS = objective structured assessment of technical skills. PGY = post-graduate year. SDL-V = self-directed learning by video

OSATS results

There was no difference between groups in the total OSATS score during the initial (control: 23.6 (4.5) vs. SDL-V: 23.9 (4.5), p = 0.9) or the second filmed anastomosis procedure (control: 27.1 (3.9) vs. SDL-V: 29.6 (3.4), p = 0.28), but both groups significantly improved their performances during the interval (mean difference between video 1 and 2, with 95% CI, in control: 3.5 [1.1; 5.9] and SDL-V: 5.8 [3.4; 8.2]) (Fig. 2). Neither the groups (2.3%) nor the interaction between groups and videos (1.6%) was a significant source of variation in the RM ANOVA analysis of the total OSATS scores. Regarding the various OSATS components, both groups improved significantly in four of the seven categories (respect for tissues, time and motion, flow of operation, and use of assistants), while only the SDL-V group also improved in the three others (instrument handling, knowledge of instruments, and knowledge of specific procedural steps) (Table 2). However, no difference was found between the groups for the first or the second filmed anastomosis procedure in any of the OSATS categories; in all categories, neither the groups nor the interaction between groups and videos was a significant source of variation in the RM ANOVA analysis.

Fig. 2 Mean difference in total OSATS score (presented as mean difference and 95% confidence interval) between the first and second video for the control and intervention groups

Table 2 Mean score improvement (with 95% confidence interval) and p-value, per group, per OSATS category. CI = confidence interval. OSATS = objective structured assessment of technical skills

Retrospective power analysis

Having found a non-significant difference of 2.6 points (p = 0.28) in total OSATS scores between the groups for the second video, a retrospective sample size analysis showed that 36 participants would have been necessary to deem a difference of such magnitude significant. With the 11 participants per group that were enrolled, power was sufficient only to detect a mean difference of 4.61.
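The logic of this calculation can be sketched with a normal-approximation formula for the minimal detectable difference of a two-sample t-test. This is an assumption-laden illustration, not a reproduction of the statpages.info tool: the pooled SD is taken from the two groups’ second-video scores (3.9 and 3.4), and the normal approximation understates the value slightly relative to an exact t-based calculation with such small groups, which is consistent with the somewhat larger reported figure of 4.61.

```python
from math import sqrt

z_alpha = 1.96   # two-sided alpha = 0.05
z_beta = 0.84    # power = 0.80
n_per_group = 11

# Pooled SD from the two groups' second-anastomosis scores (assumption)
sd_pooled = sqrt((3.9 ** 2 + 3.4 ** 2) / 2)

# Smallest between-group mean difference detectable with 80% power
mdd = (z_alpha + z_beta) * sd_pooled * sqrt(2 / n_per_group)
```

With n = 11 per group this yields a minimal detectable difference of roughly 4.4 points, so the observed 2.6-point difference falls well below what the study could resolve.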

Inter-rater reliability

Two attending surgeons at our institution independently evaluated each filmed anastomosis and were blinded to the study group. There was a statistically significant correlation between the two evaluators’ overall scores (the tally of each OSATS category), r = 0.58 (95% CI 0.34–0.75, p < 0.0001). OSATS scores for each category were also significantly correlated, with the single exception of “knowledge of instruments” (Table 3). For the pass/fail results, however, the evaluators agreed on 68% of the videos. Since the proportion of agreements expected by chance was 64%, the resulting Cohen’s Kappa coefficient was 0.105 (95% CI: −0.20 to 0.41), and the agreement was considered “poor.” The percentage of participants deemed to have passed or failed per filmed anastomosis, per group, for each evaluator is detailed in Table 4.
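Cohen’s kappa simply rescales observed agreement by the agreement expected from each rater’s marginal pass rates. A small sketch (illustrative; the study computed kappa in PRISM from the raw counts) shows the formula, and plugging in the rounded proportions reported above lands close to the reported 0.105:

```python
def cohens_kappa(both_pass, a_only, b_only, both_fail):
    """Cohen's kappa for two raters making pass/fail judgements.

    Arguments are cell counts of the 2x2 agreement table:
    both passed, A-pass/B-fail, A-fail/B-pass, both failed.
    """
    n = both_pass + a_only + b_only + both_fail
    p_o = (both_pass + both_fail) / n          # observed agreement
    a_pass = (both_pass + a_only) / n          # rater A's pass rate
    b_pass = (both_pass + b_only) / n          # rater B's pass rate
    p_e = a_pass * b_pass + (1 - a_pass) * (1 - b_pass)  # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Rounded agreement proportions from the text: p_o = 0.68, p_e = 0.64
p_o, p_e = 0.68, 0.64
kappa_text = (p_o - p_e) / (1 - p_e)   # about 0.11
```

The small gap between this back-of-envelope 0.11 and the reported 0.105 reflects rounding of the percentages; both sit in the conventional “poor agreement” range despite the raters agreeing on most videos, because most of that agreement is expected by chance alone.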

Table 3 Correlation between evaluators on OSATS score. OSATS = objective structured assessment of technical skills
Table 4 Percentage of participants (n in parenthesis) deemed to have passed, per first and second filmed anastomosis, for each rater

Survey

In the control group, 2 candidates reported watching a video on intestinal anastomosis and performing one during a real-case procedure. Every candidate in the control group wished they had had access to the video.

In the intervention group, all candidates watched the video 1 to 2 times between the two anastomoses. This self-teach didactic activity was considered enjoyable and useful by all.

Every candidate, regardless of group, reported wishing they had access to similar videos on various techniques throughout their residency.

Discussion

General surgery residency is a 5-year training program. During this time, the residents’ role is divided between being a healthcare provider and a learner. In North America, residents can work between 70 and 100 h per week, depending on the specialty [17, 18]. Surgical residency is notorious for its long hours. Furthermore, the knowledge acquired in a surgical program must be both cognitive and technical. Thus, with limited time, the learning experience requires optimization. The traditional Halstedian method of teacher-apprentice can no longer provide residents with sufficient operative experience and is often simply not feasible [19,20,21].

Our focus was to investigate a practical, focused learning method that was amenable to the residents’ schedule. E-learning encompasses a wide array of learning techniques, from online reading to interactive multimedia platforms. Its use to optimize medical pedagogy is under investigation as it has several advantages (accessible, can contain extensive material, interactive, independent of instructor availability) [22, 23]. Studies have tried to assess its use as a stand-alone method and in conjunction with other learning techniques.

In this study, we designed our own e-learning technique. We hypothesized that a short video, tailored to a single technique and voiced over by an expert in the field, could provide a useful tool for residents. This method would give residents flexibility in their learning; the video could quickly and easily be viewed at any time, even on the ubiquitous smartphone. The material was made by a reliable source, and the narration provided key points. The residents universally appreciated the experience, and all the candidates reported wanting more similar videos.

The intervention group had a statistically significant improvement in all seven individual OSATS categories between their first and second filmed anastomosis, whereas the control group improved in only 4 of the 7 categories. Perhaps the intervention is a useful tool to hone the three remaining skills (instrument handling, knowledge of instruments, and knowledge of specific procedural steps). This could be evaluated more closely in a follow-up study.

However, despite a higher final OSATS score in the intervention group, there was no statistically significant difference between those who had access to the video and those who did not. Since both groups improved, it is plausible that simply performing two anastomoses a few weeks apart was sufficient to improve the skill. The control group’s improvement could also be due to mental imagery or mental rehearsal: despite not having access to the expert video, they may still have mentally rehearsed the skill [24]. On the survey, two candidates in the control group reported seeing a video of the procedure, which could also explain their improvement. Diffusion of our expert video from the intervention group to the control group seems highly unlikely, though it cannot be entirely ruled out.

There are some limitations to our study. It is possible that the lack of significance is due to the small number of participants: there were 27 residents in our program, and 22 were randomized. The retrospective power analysis indicated that a minimum of 18 participants per group would have been needed to show a difference. Also, subtle technical differences may have been lost in the quality of the filming, which was done with a stationary camera. The procedural task may have been too simple to show a significant difference. Finally, the intervention itself may simply have no impact on the acquisition of technical surgical skills.

In terms of inter-rater reliability, studies have previously shown a correlation coefficient for OSATS scores close to 0.8 [16]. In our study, the correlation coefficient was moderate, at 0.58. This difference from the literature could possibly be due to our study’s limitations. In several pedagogic studies, despite good correlation for scaled items, holistic judgements of whether a candidate passes or fails tend to show poor agreement between evaluators [25, 26]. Therefore, our poor kappa score for the pass/fail results may not be that surprising.

Given the positive response to this intervention, there may be a place for similar learning techniques in the surgical curriculum. With patient consent, real interventions (such as appendectomies, cholecystectomies, bowel resections) could be filmed, edited and narrated. A video library could be created for residents. This could be coupled with current technical workshops and lectures that several surgical programs usually offer.

Conclusion

In our study, despite a global positive appreciation of the experience, self-directed learning by video did not prove to be a significant learning tool. It is conceivable that the number of participants was insufficient. With the widespread use of technology, there is certainly a role for video or e-learning in the surgical curriculum [27], and further research in this subject is pertinent.

Availability of data and materials

The datasets generated and analysed during the current study are restricted to protect participant anonymity. Information may be available from the corresponding author on reasonable request.

Abbreviations

SDL-V:

self-directed learning by video

FLS:

Fundamentals of Laparoscopic Surgery

OSATS:

Objective Structured Assessment of Technical Skills

PGY:

post-graduate year

References

  1. Peignot S. Centre for Psychological and Educational Consultation (in French). http://www.ccpeweb.ca/definition-application-psychomotricite/ (2016).
  2. Fundamentals of Laparoscopic Surgery. FLS Program Description. https://www.flsprogram.org/ (2020).
  3. Miller S, Shipper E, Hasty B, Bereknyei Merrell S. Introductory surgical skills course: technical training and preparation for the surgical environment. MedEdPORTAL. 2018;14:10775.
  4. Stefanidis D, Scerbo M, Montero P, Acker CE. Simulator training to automaticity leads to improved skill transfer compared with traditional proficiency-based training: a randomized controlled trial. Ann Surg. 2012;255:30–7.
  5. Steinemann S, Berg B, Skinner A, DiTulio A. In situ, multidisciplinary, simulation-based teamwork training improves early trauma care. J Surg Educ. 2011;68:472–7.
  6. Soucisse ML, Boulva K, Sideris L, Drolet P. Video-coaching as an efficient teaching method – a randomized controlled trial. J Surg Educ. 2017;74:365–71.
  7. Hu YY, Mazer LM, Yule SJ, Arriaga AF. Complementing operating room teaching with video-based coaching. JAMA Surg. 2017;152:318–25.
  8. Zumbrunn S, Tadlock J, Roberts ED. Encouraging self-regulated learning in the classroom: a review of the literature. Metropolitan Educational Research Consortium, Virginia Commonwealth University; 2011.
  9. Nesbitt C, Phillips A, Searle RF, Stansby G. Randomized trial to assess the effect of supervised and unsupervised video feedback on teaching practical skills. J Surg Educ. 2015;72:697–703.
  10. Phillips A, Matthan J, Bookless L, Whitehead IJ. Individualised expert feedback is not essential for improving basic clinical skills performance in novice learners: a randomized trial. J Surg Educ. 2016;74:612–20.
  11. Frenk J, Chen L, Bhutta ZA, Cohen J. Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet. 2010;376:1923–58.
  12. Ali-Masri H, Hassan S, Fosse E, Zimmo KM. Impact of electronic and blended learning programs for manual perineal support on incidence of obstetric anal sphincter injuries: a prospective interventional study. BMC Med Educ. 2018;18:258.
  13. Azer N, Shi X, de Gara C, Karmali S. “iBIM” – internet-based interactive modules: an easy and interesting learning tool for general surgery residents. Can J Surg. 2014;57:E31–5.
  14. Schmidt M, Kowaleski KF, Trent SM, Benner L. Self-directed training with e-learning using the first-person perspective for laparoscopic suturing and knot tying: a randomized controlled trial: learning from the surgeon’s real perspective. Surg Endosc. 2020;34:869–79.
  15. Maertens H, Madani A, Landry T, Vermassen F. Systematic review of e-learning for surgical training. Br J Surg. 2016;103:1428–37.
  16. Van Hove PD, Tuijthof GJM, Verdaasdonk EGG, Stassen LPS. Objective assessment of technical surgical skills. Br J Surg. 2010;97:972–87.
  17. Pattani R, Wu P, Dhalla IA. Resident duty hours in Canada: past, present and future. CMAJ. 2014;186:761–5.
  18. Mendelsohn D, Despot I, Gooderham PA, Singhal A. Impact of work hours and sleep on well-being and burnout for physicians-in-training: the resident activity tracker evaluation study. Med Educ. 2019;53:306–15.
  19. Sealy W. Halsted is dead: time for change in graduate surgical education. Curr Surg. 1999;56:34–9.
  20. Krajewski A, Filippa D, Staff I, Singh R. Implementation of an intern boot camp curriculum to address clinical competencies under the new Accreditation Council for Graduate Medical Education supervision requirements and duty hour restrictions. JAMA Surg. 2013;148:727–32.
  21. Schell S, Flynn TC. Web based minimally invasive surgery training: competency assessment in PGY1-2 surgical residents. Curr Surg. 2004;61:120–4.
  22. Ruiz J, Mintzer MJ, Leipzig RM. The impact of e-learning in medical education. Acad Med. 2006;81:207–12.
  23. Taveira-Gomes T, Ferreira P, Taveira-Gomes I, Severo M. What are we looking for in computer-based learning interventions in medical education? A systematic review. JMIR. 2016;18:e204.
  24. Wallace L, Raison N, Ghumman F, Moran A. Cognitive training: how can it be adapted for surgical education. Surgeon. 2017;15:231–9.
  25. Morgan PJ, Cleave-Hogg D, Guest CB. A comparison of global ratings and checklist scores from an undergraduate assessment using an anesthesia simulator. Acad Med. 2001;76:1053–5.
  26. Alammari M, Nawar ES. Inter-rater and intra-raters’ variability in evaluating complete dentures insertion procedure in senior undergraduates’ prosthodontics clinics. Electron Physician. 2018;10:7287–92.
  27. Augestad KM, Butt K, Ignjatovic D, Keller DS. Video-based coaching in surgical education: a systematic review and meta-analysis. Surg Endosc. 2019;34:521–35.


Acknowledgements

Kerianne Boulva, MD: participated in idea generation and protocol design for the project.

Josée Tessier: coordinator of the “Unité de Formation Chirurgicale”. She permitted the authors to use the laboratory, and its resources, to conduct the project. She also provided the specimens used for the anastomosis.

Mathieu Favreau: from the Audio-Visual Department (Hôpital Maisonneuve Rosemont), he provided the camera used for the project and edited the participants’ videos, as mentioned in the methods’ section.

Funding

The authors declare that there was no funding for this study.

Author information


Contributions

GC’s role was project design, ethics approval, participant recruitment, data collection, manuscript writing. MS’s role was project design, manuscript revision. PDubé’s role was project design, participant recruitment, manuscript revision. JST’s role was data collection, manuscript revision. PDrolet’s role was project design, data collection, statistical analysis, manuscript revision. LS’s role was supervising author, project design, data collection, manuscript revision. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Geneviève Chartrand.

Ethics declarations

Ethics approval and consent to participate

This project was approved by the “Comité d’Évaluation Scientifique en Santé Physique” (Scientific Evaluation Committee) affiliated with Hôpital Maisonneuve Rosemont. Informed consent was obtained from all the participants. All methods were carried out in accordance with relevant guidelines and regulations.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Chartrand, G., Soucisse, M., Dubé, P. et al. Self-directed learning by video as a means to improve technical skills in surgery residents: a randomized controlled trial. BMC Med Educ 21, 91 (2021). https://doi.org/10.1186/s12909-021-02524-y


Keywords

  • Surgery
  • Video
  • Residency
  • Pedagogy
  • Self-directed learning