Methods of teaching evidence-based practice: a systematic review

Abstract

Background

The aim of this review was to identify the effectiveness of different teaching modalities on student evidence-based practice (EBP) competency.

Methods

Electronic searches were conducted in MEDLINE, the Cochrane Central Register of Controlled Trials, PsycINFO, CINAHL, ERIC, A+ Education and AEI through to November 2021. We included randomised controlled trials comparing EBP teaching modes on EBP knowledge, skills, attitudes or behaviour in undergraduate and postgraduate health professions education. Risk of bias was determined using the Cochrane risk of bias tool.

Results

Twenty-one studies were included in the review. Overall, no single teaching modality was identified as superior to others in significantly increasing learner competency in EBP. Findings for learner knowledge, skills, attitudes and behaviour were conflicting, with studies reporting either no change or a moderate increase in EBP behavioural outcomes when one intervention was directly compared with another.

Conclusion

Current evidence highlights the lack of a single teaching modality that is superior to others regarding learner competency in EBP, regardless of health professions discipline or graduate status. The poor quality of included studies and the heterogeneity of interventions and outcome measures limited the conclusions that could be drawn. Further research should focus on the development of high-quality studies and the use of psychometrically validated tools to further explore the impact of different EBP teaching modalities.

Background

Evidence-based practice (EBP) is essential for the delivery of quality healthcare [1]. It is a process that allows patients, health professionals, researchers and/or policy makers to make informed health decisions in a given context based on an integration of the best available evidence, with clinical expertise and patient values and preferences [2, 3]. Most commonly this involves five steps: Ask, Acquire, Appraise, Apply and Assess [1, 2, 4]. Competency in EBP is expected by many accreditation bodies, requiring health practitioners to be able to demonstrate skills across the five domains including asking a defined question, literature searching, critical appraisal, integrating evidence into clinical practice and self-reflection [2]. These domains intersect with key EBP competencies of knowledge, skills, attitudes and behaviours – each being critical to the successful implementation of the five steps in clinical practice [2]. However, there still seems to be a gap between the desired and actual practice [5].

Many of the identified barriers to the use and implementation of EBP could be overcome by education [6]. These include inadequate skills and a lack of knowledge, particularly pertaining to the research skills required to acquire and appraise studies. In addition to education around these core skills, it appears that more practice of, and exposure to, EBP could also overcome barriers, particularly those relating to lack of awareness and negative attitudes [6]. Many practitioners misunderstand EBP as simply the ability to keep up to date with research. With an average of over 2,000 citations added to MEDLINE each day [7], the ability to effectively and efficiently search for and identify relevant, high-quality evidence is a critical skill.

A study of undergraduate health students' perceptions of the meaning of EBP revealed a very limited understanding of EBP processes and principles [4]. This is despite the fact that over the last two decades EBP has been integrated into core health curricula [2]. The most common teaching methods in undergraduate programs include research courses and workshops, collaboration with clinical practice, IT technology, assignments, participation in research projects, journal clubs and embedded librarians [8]. A number of systematic reviews have been published evaluating the effectiveness of interventions focused on teaching EBP. Some have addressed only one aspect of EBP, such as literature searching [9], whilst others have focused on specific cohorts such as medical trainees [10], medical students [11], undergraduate health students [12], postgraduate teaching [13], or nursing programs [14].

More recently, Bala et al. [15] performed an overview of systematic reviews examining the effects of different teaching approaches for EBP at undergraduate and postgraduate levels. The overview identified 22 systematic reviews, the most recent of which was published in 2019. It found that knowledge improved when interventions were compared to no intervention or to pre-test scores [15], across a diverse range of teaching modalities and populations. Similarly, there were positive changes in behaviour, with EBP skills also improving in certain populations. However, of the included systematic reviews, only three were judged to be of high quality, one of moderate quality, one of low quality, and the remaining 17 of critically low quality. Bala et al. [15] reported that the most frequent reasons for categorisation as low quality were the lack of a comprehensive search strategy and/or an adequate risk of bias assessment tool.

As the principles of EBP remain the same irrespective of the health profession, the aim of this systematic review was to identify the current evidence base on the effectiveness of different teaching modalities on undergraduate and postgraduate learner competency in EBP across all fields of medicine, allied health and the health sciences. This review also provides a high-quality update of our 2014 systematic review, which focussed on EBP training in medical trainees. It expands the population of interest to all health professions trainees at both undergraduate and postgraduate levels, and encompasses all areas of EBP competency, including knowledge, skills, attitudes and behaviours, as well as self-efficacy.

Methods

Cochrane methodology was used to ensure the robustness of the systematic approach of this review, following the Cochrane Handbook [16] and reporting in accordance with the PRISMA 2020 statement [17], as outlined in the steps below.

Step 1: Defining the research question and inclusion criteria

A conceptual framework for data synthesis [18, 19], which shares similarities with the EBP PICOTS framework [20], was utilised to determine the inclusion criteria and eligibility of studies for this systematic review (Table 1).

Table 1 Conceptual framework for data synthesis

Step 2: Searching the literature and the identification of studies

Electronic searches were conducted across the following databases: MEDLINE, the Cochrane Central Register of Controlled Trials, PsycINFO, CINAHL, ERIC, A+ Education and AEI (Informit). No language or date restrictions were imposed. The search was last updated in November 2021. The full search strategy is available in Supplementary file 1. Citations of all articles returned from the searches of the respective electronic databases were uploaded for review using Covidence [21]. All citations were reviewed independently by two authors (BH and DI) for possible inclusion in the systematic review based on the title and abstract. Full-text review of these articles, as well as of those where inclusion/exclusion could not be determined from the title and/or abstract alone, was conducted by the same two authors (BH and DI). Articles that met the selection criteria after final review of the full text were included in this systematic review. Any discrepancies in author judgement regarding article selection were resolved by the third author (BD).
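
As a point of illustration, the sketch below shows how the concept blocks of such a search strategy are typically combined: synonyms within a concept are OR-ed together, and the concept blocks are then AND-ed. The terms shown are hypothetical placeholders, not the authors' actual strategy (that is provided in Supplementary file 1).

# Illustrative sketch only: hypothetical terms combined in the way a typical
# EBP-teaching search strategy is structured; not the authors' actual strategy.
ebp = ['"evidence-based practice"', '"evidence-based medicine"', '"evidence-based health care"']
teaching = ['teach*', 'educat*', 'curriculum', '"journal club"', 'workshop']
rct = ['"randomized controlled trial"', 'randomised', 'randomized']

def or_block(terms):
    # OR together the synonyms within one concept block
    return "(" + " OR ".join(terms) + ")"

# AND across concept blocks: topic AND intervention AND study design
query = " AND ".join(or_block(block) for block in (ebp, teaching, rct))
print(query)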

Step 3: Data collection, extraction and management

A data extraction form was piloted before the commencement of data collection and extraction. Two authors (BH and BD) independently extracted data from each included study. Information on the following domains was recorded: study citation, country, setting, study design, study period, inclusion/exclusion criteria, number and type of participants, methodology (including teaching intervention and comparison), outcome measures and time points, study funding source and conflicts of interest. Any discrepancies in author judgement during data extraction were resolved by the third author (DI) before a single consolidated data extraction form was created.

Step 4: Assessment of risk of bias in included studies

The methodological quality of included studies was assessed using the Cochrane risk of bias tool [22]. Two authors (BH and BD) independently assessed each included study across four domains: (1) random sequence generation (selection bias), (2) allocation concealment (selection bias), (3) blinding of outcome assessment (detection bias) and (4) incomplete outcome data (attrition bias). Risk of bias for each study domain was rated as 'high', 'unclear' or 'low'. Overall risk of bias for the evidence base was similarly assessed. Any discrepancies in author judgement were resolved by the third author (DI).

Step 5: Data synthesis and analysis

Due to the heterogeneity of the studies included in this review, a formal meta-analysis was not deemed appropriate. Studies varied in their interventions, comparisons, outcomes measured (and the tools used to measure those outcomes), as well as the timing of outcome measurement. Studies also differed in the type of EBP content delivered, with some focussing on single aspects of EBP, whilst others taught all EBP steps as part of the educational intervention. A descriptive analysis was performed on all included studies, with a focus on differences in knowledge, skills, behaviour and attitudes between educational interventions and on the methodological quality of the evidence.

Results

Description of studies

A total of 1,355 citations were identified from the search, of which 71 were examined at full text. Twenty-one studies met the inclusion criteria and were included in the review, as shown in the PRISMA flowchart (Fig. 1) [17]. Of the 21 included studies, 14 were conducted with undergraduate medical students, one with undergraduate osteopathic medical students, one with graduate physician assistant students, one with undergraduate nursing students and one with graduate family nurse practitioner students. Three studies implemented an interdisciplinary approach to teaching, with a combination of medical, occupational therapy, physiotherapy, nutrition, pharmacy and dental students. The majority of studies were conducted in the USA, with the remaining studies conducted across a variety of countries including Australia, Canada, Hong Kong, Indonesia, Japan, Lebanon, Malaysia, Mexico, Norway, Portugal, Taiwan and the United Kingdom. The characteristics of included studies (including information on methodology, participants, interventions, outcomes and findings for each study) are detailed in Table 2.

Fig. 1 PRISMA flowchart [17]

Table 2 Characteristics of included studies

Methodological quality

The risk of bias for each included study is illustrated in Fig. 2. Four studies had a low risk of bias [24, 26, 39, 46], ten studies [23, 29, 31, 34, 35, 37, 44, 47, 51, 53] had a high risk of bias and seven studies [41,42,43, 45, 49, 50, 52] had an unclear risk of bias, as there was insufficient detail in the study methodology to judge study quality. Table 3 provides further details of the evidence supporting the risk of bias judgements for all included studies. Risk of bias varied across methodological domains, with the overall risk of bias for the evidence base judged as 'unclear' (Fig. 3).

Fig. 2 Risk of bias summary. Review authors' judgements about each risk of bias item for each included study

Table 3 Risk of bias overview
Fig. 3 Risk of bias graph. Review authors' judgements about each risk of bias item presented as percentages across all included studies

Study population and EBP interventions

Studies differed in the EBP competencies delivered as part of their respective training programs. Leung et al. [46] delivered an introduction to the principles of EBP to undergraduate medical students, and Cardoso et al. [26] delivered an EBP education program to undergraduate nursing students. The study by Krueger [45] focussed on teaching critical appraisal skills to undergraduate osteopathic medical students.

Four studies focussed on teaching searching skills (including constructing a clinical question). Of these, Eldredge et al. [34] and Ilic et al. [37] trained undergraduate medical students, whilst Johnson et al. [41] trained graduate family nurse practitioner students and Kloda et al. [43] trained undergraduate occupational therapy and physiotherapy students.

Two studies focused on teaching both the searching and the appraisal of clinical evidence. The study by Badgett et al. [23] consisted of two quasi-RCTs with undergraduate medical students, whilst the study by Long et al. [47] involved pharmacy and nutrition students.

A total of 12 studies delivered teaching programs on the four steps of EBP (asking a clinical question, searching the literature, critically appraising the evidence, and integrating evidence into the clinical setting). Bradley et al. [24], Davis et al. [31], Hadvani et al. [35], Ilic et al. [39], Johnston et al. [42], Sanchez-Mendiola et al. [50] and Widyahening et al. [53] delivered their programs to undergraduate medical students. Cheng et al. [29] and Schilling et al. [51] delivered their programs to undergraduate medical students as an integrated clinical rotation. Stack et al. [52] delivered their program to graduate physician assistant students, whilst Nango and Tanaka [49] and Koufogiannakis et al. [44] delivered their programs as part of interdisciplinary programs.

None of the studies included long-term follow-up of EBP competencies; in all cases, assessment was delivered immediately post-intervention.

EBP competency

Knowledge

Twelve studies in total examined the impact of teaching modes on learner knowledge [24, 26, 29, 31, 35, 42, 44, 45, 49,50,51, 53]. Five studies determined learner knowledge post-intervention via a non-validated tool or survey [24, 44, 45, 49, 51], three utilised the Berlin or Fresno tool in isolation [26] or in combination [31, 53], and another two utilised the Knowledge, Attitude and Behaviours (KAB) questionnaire [29, 42]. Other methods used to determine impacts on learner knowledge included the Knowledge, Attitudes, Access, and Confidence Evaluation (KACE) [35] and the Taylor et al. [25] questionnaire [50]. Six of the included studies identified no statistically significant difference in learner knowledge between teaching interventions [24, 31, 35, 42, 49, 53]. Teaching modalities investigated across those six studies included comparisons between directed and self-directed learning; computer-based versus lecture-based teaching; self-directed multimedia versus didactic teaching; PBL versus non-PBL structure; multidisciplinary versus homogenous disciplines; and near-peer tutored versus staff tutored sessions. Two of the included studies identified differences in knowledge scores between teaching interventions. Cheng et al. [29] compared structured case conferencing to lecture-based teaching; learners in the structured case conferences had significantly higher knowledge scores at follow-up (MD = 2.2, 95% CI 0.52–3.87). Koufogiannakis et al. [44] identified significantly higher learner knowledge in those who attended EBP-related PBL sessions with a librarian compared to sessions without a librarian. Four studies compared the teaching of an EBM course to no teaching [26, 45, 50, 51]. Unsurprisingly, learners who attended an EBM course had significantly higher knowledge scores than those allocated to the control group.
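
For readers interpreting the reported effect above, the interval follows the standard large-sample construction for a mean difference; the primary study's standard error is not reported here, so the value below is back-calculated from the interval purely as a consistency check:

\[
\mathrm{MD} = \bar{x}_{\text{intervention}} - \bar{x}_{\text{control}}, \qquad 95\%\ \mathrm{CI} = \mathrm{MD} \pm 1.96 \times \mathrm{SE}(\mathrm{MD}).
\]

From the reported interval, \(\mathrm{SE} \approx (3.87 - 0.52)/(2 \times 1.96) \approx 0.85\), and \(2.2 \pm 1.96 \times 0.85\) recovers approximately \((0.53,\ 3.87)\), consistent with the values reported by Cheng et al. [29].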

Skills

Twelve studies in total examined the impact of teaching modes on learner EBP skills [23, 24, 26, 34, 35, 37, 39, 43, 47, 51,52,53]. Impacts on learner EBP skills were assessed after the intervention via the Fresno tool in isolation [26, 35, 43, 52] or in combination with the EBP questionnaire [37], non-validated methods [23, 24, 34, 51], the Berlin tool [39], the Research Readiness Self-Assessment (RRSA) [47] or the EBP confidence (EPIC) scale [53]. Eight of the studies concluded no statistically significant difference in learner EBP skills between teaching interventions [23, 24, 34, 35, 37, 39, 43, 53]. Teaching modalities investigated across those eight studies included directed versus self-directed learning; blended versus didactic learning; self-directed multimedia versus didactic teaching; a specific workshop on searching versus no workshop; near-peer tutored versus staff tutored sessions; and EBP training with and without peer assessment. The studies by Stack et al. [52], Schilling et al. [51], Cardoso et al. [26] and Long et al. [47] examined the effectiveness of an EBP teaching intervention, or curriculum, against usual practice. Stack et al. [52] compared students undertaking a curriculum with EBM teaching embedded to students who undertook a curriculum without EBM content. Students undertaking the EBM-based curriculum demonstrated higher EBM-related skills, and also recorded higher self-efficacy with respect to EBM skills. The study by Long et al. [47] examined the use of a web-based tool to support student EBM searching and critical appraisal skills. Students reported significantly higher self-efficacy scores when using the EBM-related technology. The study by Schilling et al. [51] reported higher EBP skills in students attending an online clerkship in EBP, compared to students who did not. Cardoso et al. [26] reported greater improvements in EBP skills for those who participated in the EBP education program.

Attitudes

Ten studies in total examined the impact of teaching modes on learner EBP attitudes [24, 31, 35, 39, 41, 42, 44, 50, 53, 55]. The main method for determining impact on attitudes post-intervention was the questionnaire of Taylor et al. [25]. Other methods included the KAB questionnaire [29, 42], the KACE [35], the Assessing Competency in Evidence based medicine (ACE) tool [39], and non-validated means. Eight of the included studies identified no significant differences in learner EBP attitudes between teaching interventions [24, 31, 35, 41, 42, 53, 55]. Teaching modalities investigated across those eight studies included directed versus self-directed learning; structured case conferences versus lectures; computer-based sessions versus lectures; self-directed multimedia versus didactic teaching; near-peer tutoring versus staff tutoring; PBL versus usual teaching; librarian-assisted PBL versus non-librarian-assisted PBL; and web-based teaching versus usual teaching. The study by Ilic et al. [39] examined blended learning versus didactic teaching of EBM. No overall difference in learner attitudes was identified, although several significant differences on sub-questions of the tool used to assess attitudes were observed. Unsurprisingly, the study by Sanchez-Mendiola et al. [50] observed significantly more positive learner attitudes towards EBP when comparing the implementation of an EBP course to no EBP teaching.

Behaviour

Five studies in total examined the impact of teaching modes on learner EBP behaviour [29, 39, 42, 46, 52]. Three studies determined impact on behaviour post-intervention via the KAB questionnaire [29, 42, 46], one via the ACE tool [39] and one via the Patient Encounter Clinical Application (PECA) scale [52]. Two of the included studies identified no impact of teaching modes on EBP behaviour (PBL versus usual teaching, and an EBM curriculum versus a curriculum without EBM integration) [42, 52]. The study by Ilic et al. [39] identified increases in EBP behaviour sub-scores in learners who received blended learning versus those who received a didactic approach. Cheng et al. [29] reported increases in EBP behaviour in learners exposed to case conference style teaching of EBP, compared to those who received lecture-based sessions. Unsurprisingly, students who received any form of EBP teaching reported higher EBP behavioural scores than students who were not exposed to any form of EBP teaching [46].

Discussion

The findings of this systematic review build on the emerging evidence base exploring the effectiveness of different teaching strategies on the competency of EBP learners [9,10,11,12,13,14,15]. Results from this review update and extend the findings of our 2014 systematic review, which identified a small body of moderate-quality evidence on the effectiveness of different training modalities in medical trainees [10]. Although the current study expanded the search to include allied health and health sciences trainees, very few additional studies across these health professions have been performed. As in our 2014 review, the current findings highlight the variability in methodological quality and in the use of psychometrically validated tools to assess learner competency in EBP [10]. These results align with the most recent overview of systematic reviews, which concluded that the current evidence on the topic is limited by poor quality and by heterogeneity of interventions and outcome measures [15].

In a bid for EBP education to be more 'evidence-based', a 2018 consensus statement was developed detailing the minimum core competencies in EBP that health professionals should meet, with the aim of improving and standardising education in the discipline [2]. For such competencies to be translated into practice, a variety of robust teaching implementation and assessment strategies and tools must be available. A systematic review of evaluation tools in 2006 identified 104 assessment instruments, of which only two were psychometrically evaluated [56]. The CREATE framework was developed in 2011 with the objective of creating a common taxonomy for assessment tools covering all steps of competency in EBP (asking, acquiring, appraising, applying and assessing) [57]. A 2020 extension of the earlier 2006 systematic review identified six tools of reasonable validity evaluating some, but not all, aspects of EBP [58].

Whilst this systematic review included 21 studies, heterogeneity between the studies in how outcomes were measured precluded any meaningful meta-analysis. Chalmers and Glasziou [59] have highlighted the impact of research waste in the medical literature. Future research in the EBP education field must therefore avoid waste by assessing agreed-upon outcome measures with robust, psychometrically validated tools. Such assessment tools should be competency-focussed, rather than discipline-focussed, with the emphasis on evaluating specific EBP domains as recommended by the CREATE framework [40, 57]. Use of anything else only adds to the growing pile of research waste.

The findings from this systematic review suggest that there is insufficient evidence to promote one teaching modality over another in terms of its effectiveness on EBP learner outcomes. What is common across the current evidence is the need for multi-faceted, interactive, authentic learning experiences and assessment. Teaching utilities such as journal clubs have the potential to incorporate strong andragogic principles, for example through the use of PBL principles, coupled with a teaching modality that is commonly used in practice as a method of professional education. Further research is required to better understand where such authentic teaching and learning utilities are best placed along the novice-to-expert continuum. For example, should novice EBP competencies be scaffolded through structured teaching modalities such as PBL, whilst those with a higher level of competency engage in more 'authentic' utilities such as journal clubs?

An important aspect of education not included in any study to date is the impact of cost and value on the teaching and learning experience [60]. The Prato Statement, published in 2017, highlights the goal of incorporating economic analyses into health professions education research in order to create an evidence base that maximises value, from both an educational and an economic perspective [60]. Whilst findings from our review demonstrate relative equivalence between teaching modalities with respect to measured EBP competencies, the availability of economic data could well provide the evidence needed to demonstrate the 'value' of one teaching modality over another.

Van der Vleuten's assessment utility formula incorporates cost as a key quality characteristic of assessment [61]. Cost and value are important ingredients that educators should consider when evaluating the effectiveness of teaching strategies, from both pragmatic and academic perspectives. The number of studies incorporating some form of economic evaluation is growing within health professions education; however, the quality of their reporting is poor [62]. The use of reporting guidelines to incorporate well-constructed economic evaluations as part of assessment outcomes, particularly in the growing evidence base on EBP teaching, would provide sufficient evidence to conduct sensitivity analyses and offer a different lens through which to interpret evidence, particularly when it appears inconclusive at face value.

Many of the studies included in this systematic review were assessed as having either a high or an unclear risk of bias. This potential for methodological bias introduces a degree of uncertainty into the evidence base when interpreting the overall results of the review. A further limitation was the heterogeneity between studies with respect to the outcomes measured and the tools used to measure these end-points. This variance in outcome measures prevented the conduct of a meta-analysis, which would have brought some level of quantifiable assessment of study outcomes. Consequently, it was not possible to conduct a funnel plot analysis of studies and assess the potential impact of publication bias on this systematic review. Similarly, the included studies provided little information about the educational interventions that would allow reproduction of the educational method, thereby adding to the 'educational' heterogeneity of the studies. All included studies assessed EBP competency immediately after the intervention. The lack of long-term follow-up is a significant evidence gap, as critical questions regarding the need for, and frequency of, continued professional development in EBP remain unanswered, particularly regarding the impact that time, resources and environment may have on self-efficacy, behaviours and attitudes toward implementing EBP principles in practice.

The majority of studies to date have focussed on medical trainees. Further research is required, particularly in the development of high-quality methodological studies to explore the impact of different teaching modalities across the broad spectrum of health professions disciplines. Such studies should focus on assessing key EBP competencies across the domains of knowledge, skills, attitudes and behaviours using robust, psychometrically validated outcome assessment tools.

Conclusions

The current evidence suggests limited differences in learner competency in EBP across different teaching modalities. Future studies should focus on methodologically rigorous designs, with a specific focus on measuring core EBP competencies using validated tools across disciplines within the health professions. Similarly, future studies should explore the use of emerging teaching strategies and their effectiveness in teaching EBP across different stages of training. The COVID-19 pandemic has required many educational programs to pivot to online delivery, with many adopting hybrid online/face-to-face engagement as pandemic restrictions ease. Future work is needed to identify how successfully the teaching of EBP can be translated into these emerging teaching modalities. There is also a need for long-term follow-up of learner competency in EBP as learners move along the novice-to-expert continuum from students to clinicians practicing EBP in the clinical environment.

Availability of data and materials

All data generated during this study are included in this published article.

References

1. Dawes M, Summerskill W, Glasziou P, Cartabellotta A, Martin J, Hopayian K, Porzsolt F, Burls A, Osborne J. Sicily statement on evidence-based practice. BMC Med Educ. 2005;5(1):1. https://doi.org/10.1186/1472-6920-5-1.

2. Albarqouni L, Hoffmann T, Straus S, Olsen N, Young T, Ilic D, Shaneyfelt T, Haynes R, Guyatt G, Glasziou P. Core competencies in evidence-based practice for health professionals: consensus statement based on a systematic review and Delphi survey. JAMA Netw Open. 2018;1(2):e180281.

3. Rolloff M. A constructivist model for teaching evidence-based practice. Nurs Educ Perspect. 2010;31(5):290–3.

4. Murphy KA, Guisard Y, Curtin M, Biles J, Thomas C, Parnell T. Evidence-based practice: what do undergraduate health students think it means? Focus on Health Professional Education: A Multi-Professional Journal. 2019;20(3):12–29. https://doi.org/10.11157/fohpe.v20i3.319.

5. Saunders H, Gallagher-Ford L, Kvist T, Vehvilainen-Julkunen K. Practicing healthcare professionals' evidence-based practice competencies: an overview of systematic reviews. Worldviews Evid Based Nurs. 2019;16(3):176–85.

6. Sadeghi-Bazargani H, Tabrizi JS, Azami-Aghdash S. Barriers to evidence-based medicine: a systematic review. J Eval Clin Pract. 2014;20(6):793–802.

7. MEDLINE citation counts by year of publication (as of January 2020). https://www.nlm.nih.gov/bsd/medline_cit_counts_yr_pub.html

8. Larsen C, Terkelsen A, Carlsen A, Kristensen H. Methods for teaching evidence-based practice: a scoping review. BMC Med Educ. 2019;19:33. https://doi.org/10.1186/s12909-019-1681-0.

9. Hirt J, Nordhausen T, Meichlinger J, Braun V, Zeller A, Meyer G. Educational interventions to improve literature searching skills in the health sciences: a scoping review. J Med Libr Assoc. 2020;108(4):534–46.

10. Ilic D, Maloney S. Methods of teaching medical trainees evidence-based medicine: a systematic review. Med Educ. 2014;48(2):124–35.

11. Ahmadi SF, Baradaran HR, Ahmadi E. Effectiveness of teaching evidence-based medicine to undergraduate medical students: a BEME systematic review. Med Teach. 2015;37(1):21–30.

12. Kyriakoulis K, Patelarou A, Laliotis A, Wan AC, Matalliotakis M, Tsiou C, Patelarou E. Educational strategies for teaching evidence-based practice to undergraduate health students: systematic review. J Educ Eval Health Prof. 2016;13:34. https://doi.org/10.3352/jeehp.2016.13.34.

13. Coomarasamy A, Khan KS. What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. BMJ. 2004;329(7473):1017–9.

14. Hickman LD, DiGiacomo M, Phillips J, Rao A, Newton PJ, Jackson D, Ferguson C. Improving evidence based practice in postgraduate nursing programs: a systematic review: bridging the evidence practice gap (BRIDGE project). Nurse Educ Today. 2018;63:69–75.

15. Bala MM, Poklepović Peričić T, Zajac J, Rohwer A, Klugarova J, Välimäki M, Lantta T, Pingani L, Klugar M, Clarke M, et al. What are the effects of teaching Evidence-Based Health Care (EBHC) at different levels of health professions education? An updated overview of systematic reviews. PLoS ONE. 2021;16(7):e0254191.

16. Higgins J, Thomas J, Chandler J, Cumpston M, Li T, Page MJ, Welch VA, editors. Cochrane Handbook for Systematic Reviews of Interventions, version 6.4 (updated February 2022). Cochrane; 2022. Available from: www.training.cochrane.org/handbook.

17. Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, Shamseer L, Tetzlaff JM, Akl EA, Brennan SE, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372:n71.

18. Reed D, Price EG, Windish DM, Wright SM, Gozu A, Hsu EB, Beach MC, Kern D, Bass EB. Challenges in systematic reviews of educational intervention studies. Ann Intern Med. 2005;142(12 Pt 2):1080–9.

19. Young T, Rohwer A, Volmink J, Clarke M. What are the effects of teaching evidence-based health care (EBHC)? Overview of systematic reviews. PLoS ONE. 2014;9(1):e86706.

20. Guyatt G, Rennie D, Meade M, Cook D. Users' guides to the medical literature: a manual for evidence-based clinical practice. 3rd ed. McGraw Hill; 2015.

21. Covidence systematic review software. Melbourne: Veritas Health Innovation. Available at: https://www.covidence.org.

22. Higgins J, Altman D, Gotzsche P, Juni P, Moher D, Oxman A, Savovic J, Schulz K, Weeks L, Sterne J. The Cochrane Collaboration's tool for assessing risk of bias in randomised trials. BMJ. 2011;343:d5928.

23. Badgett RG, Paukert JL, Levy LS. Teaching clinical informatics to third-year medical students: negative results from two controlled trials. BMC Med Educ. 2001;1:3. https://doi.org/10.1186/1472-6920-1-3.

24. Bradley P, Oterholt C, Herrin J, Nordheim L, Bjorndal A. Comparison of directed and self-directed learning in evidence-based medicine: a randomised controlled trial. Med Educ. 2005;39(10):1027–35.

25. Taylor R, Reeves B, Mears R, Keast J, Binns S, Ewings P, Khan K. Development and validation of a questionnaire to evaluate the effectiveness of evidence-based practice teaching. Med Educ. 2001;35(6):544–7.

26. Cardoso D, Couto F, Cardoso AF, Bobrowicz-Campos E, Santos L, Rodrigues R, Coutinho V, Pinto D, Ramis M-A, Rodrigues MA, et al. The effectiveness of an Evidence-Based Practice (EBP) educational program on undergraduate nursing students' EBP knowledge and skills: a cluster randomized control trial. Int J Environ Res Public Health. 2021;18(1):293. https://doi.org/10.3390/ijerph18010293.

27. Cardoso D, Rodrigues MA, Apóstolo J. Evidence-based practice educational program: a Portuguese experience with undergraduate nursing students. JBI Evid Implement. 2019;17:S72–4.

28. Cardoso D, Couto F, Cardoso AF, Louçano C, Rodrigues M, Pereira R, Parola V, Coelho A, Ferraz L, Pinto D, et al. Fresno test to measure evidence-based practice knowledge and skills for Portuguese undergraduate nursing students: a translation and adaptation study. Nurse Educ Today. 2021;97:104671.

29. Cheng HM, Guo FR, Hsu TF, Chuang SY, Yen HT, Lee FY, Yang YY, Chen TL, Lee WS, Chuang CL, et al. Two strategies to intensify evidence-based medicine education of undergraduate students: a randomised controlled trial. Ann Acad Med Singap. 2012;41(1):4–11.

30. Johnston JM, Leung GM, Fielding R, Tin KYK, Ho L-M. The development and validation of a knowledge, attitude and behaviour questionnaire to assess undergraduate evidence-based practice teaching and learning. Med Educ. 2003;37(11):992–1000.

31. Davis J, Crabb S, Rogers E, Zamora J, Khan K. Computer-based teaching is as good as face to face lecture-based teaching of evidence based medicine: a randomized controlled trial. Med Teach. 2008;30(3):302–7.

32. Fritsche L, Greenhalgh T, Falck-Ytter Y, Neumayer HH, Kunz R. Do short courses in evidence based medicine improve knowledge and skills? Validation of Berlin questionnaire and before and after study of courses in evidence based medicine. BMJ. 2002;325(7376):1338.

33. Ramos KD, Schafer S, Tracz SM. Validation of the Fresno test of competence in evidence based medicine. BMJ. 2003;326(7384):319.

34. Eldredge JD, Bear DG, Wayne SJ, Perea PP. Student peer assessment in evidence-based medicine (EBM) searching skills training: an experiment. J Med Libr Assoc. 2013;101(4):244–51.

35. Hadvani T, Dutta A, Choy E, Kumar S, Molleda C, Parikh V, Lopez MA, Lui K, Ban K, Wallace SS. Effectiveness of modalities to teach evidence based medicine to pediatric clerkship students: a randomized controlled trial. Acad Pediatr. 2021;21(2):375–83.

36. Hendricson WD, Rugh JD, Hatch JP, Stark DL, Deahl T, Wallmann ER. Validation of an instrument to assess evidence-based practice knowledge, attitudes, access, and confidence in the dental environment. J Dent Educ. 2011;75(2):131–44.

37. Ilic D, Tepper K, Misso M. Teaching evidence-based medicine literature searching skills to medical students during the clinical years: a randomized controlled trial. J Med Libr Assoc. 2012;100(3):190–6.

38. Upton D, Upton P. Development of an evidence-based practice questionnaire for nurses. J Adv Nurs. 2006;53(4):454–8.

39. Ilic D, Nordin RB, Glasziou P, Tilson JK, Villanueva E. A randomised controlled trial of a blended learning education intervention for teaching evidence-based medicine. BMC Med Educ. 2015;15:39. https://doi.org/10.1186/s12909-015-0321-6.

40. Ilic D, Bin Nordin R, Glasziou P, Tilson J, Villanueva E. Development and validation of the ACE tool: assessing medical trainees' competency in evidence based medicine. BMC Med Educ. 2014;14:114. https://doi.org/10.1186/1472-6920-14-114.

41. Johnson HL, Fontelo P, Olsen CH, Jones KD 2nd, Gimbel RW. Family nurse practitioner student perception of journal abstract usefulness in clinical decision making: a randomized controlled trial. J Am Assoc Nurse Pract. 2013;25(11):597–603.

42. Johnston JM, Schooling CM, Leung GM. A randomised-controlled trial of two educational modes for undergraduate evidence-based medicine learning in Asia. BMC Med Educ. 2009;9:63. https://doi.org/10.1186/1472-6920-9-63.

43. Kloda LA, Boruff JT, Cavalcante AS. A comparison of patient, intervention, comparison, outcome (PICO) to a new, alternative clinical question framework for search skills, search results, and self-efficacy: a randomized controlled trial. J Med Libr Assoc. 2020;108(2):185–94.

44. Koufogiannakis D, Buckingham J, Alibhai A, Rayner D. Impact of librarians in first-year medical and dental student problem-based learning (PBL) groups: a controlled study. Health Info Libr J. 2005;22(3):189–95.

45. Krueger PM. Teaching critical appraisal: a pilot randomized controlled outcomes trial in undergraduate osteopathic medical education. J Am Osteopath Assoc. 2006;106(11):658–62.

46. Leung GM, Johnston JM, Tin KY, Wong IO, Ho LM, Lam WW, Lam TH. Randomised controlled trial of clinical decision support tools to improve learning of evidence based medicine in medical students. BMJ. 2003;327(7423):1090.

47. Long JD, Gannaway P, Ford C, Doumit R, Zeeni N, Sukkarieh-Haraty O, Milane A, Byers B, Harrison L, Hatch D, et al. Effectiveness of a technology-based intervention to teach evidence-based practice: the EBR tool. Worldviews Evid Based Nurs. 2016;13(1):59–65.

48. Ivanitskaya LV, Hanisko KA, Garrison JA, Janson SJ, Vibbert D. Developing health information literacy: a needs analysis from the perspective of preprofessional health students. J Med Libr Assoc. 2012;100(4):277–83.

49. Nango E, Tanaka Y. Problem-based learning in a multidisciplinary group enhances clinical decision making by medical students: a randomized controlled trial. J Med Dent Sci. 2010;57(1):109–18.

50. Sanchez-Mendiola M, Kieffer-Escobar LF, Marin-Beltran S, Downing SM, Schwartz A. Teaching of evidence-based medicine to medical students in Mexico: a randomized controlled trial. BMC Med Educ. 2012;12:107. https://doi.org/10.1186/1472-6920-12-107.

51. Schilling K, Wiecha J, Polineni D, Khalil S. An interactive web-based curriculum on evidence-based medicine: design and effectiveness. Fam Med. 2006;38(2):126–32.

52. Stack MA, DeLellis NO, Boeve W, Satonik RC. Effects of teaching evidence-based medicine on physician assistant students' critical appraisal, self-efficacy, and clinical application: a randomized controlled trial. J Physician Assist Educ. 2020;31(3):159–65.

53. Widyahening IS, Findyartini A, Ranakusuma RW, Dewiasty E, Harimurti K. Evaluation of the role of near-peer teaching in critical appraisal skills learning: a randomized crossover trial. Int J Med Educ. 2019;10:9–15.

54. Salbach NM, Jaglal SB. Creation and validation of the evidence-based practice confidence scale for health care professionals. J Eval Clin Pract. 2011;17(4):794–800.

55. Chen KS, Monrouxe L, Lu YH, Jenq CC, Chang YJ, Chang YC, Chai PY. Academic outcomes of flipped classroom learning: a meta-analysis. Med Educ. 2018;52(9):910–24.

56. Shaneyfelt T, Baum KD, Bell D, Feldstein D, Houston TK, Kaatz S, Whelan C, Green M. Instruments for evaluating education in evidence-based practice: a systematic review. JAMA. 2006;296(9):1116–27.

57. Tilson J, Kaplan S, Harris J, Hutchinson A, Ilic D, Niederman R, Potomkova J, Zwolsman S. Sicily statement on classification and development of evidence-based practice learning assessment tools. BMC Med Educ. 2011;11:78. https://doi.org/10.1186/1472-6920-11-78.

58. Kumaravel B, Hearn J, Jahangiri L, Pollard R, Stocker C, Nunan D. A systematic review and taxonomy of tools for evaluating evidence-based medicine teaching in medical education. Syst Rev. 2020;9:91. https://doi.org/10.1186/s13643-020-01311-y.

59. Chalmers I, Glasziou P. Avoidable waste in the production and reporting of research evidence. Lancet. 2009;374(9683):86–9.

60. Maloney S, Reeves S, Rivers G, Ilic D, Foo J, Walsh K. The Prato Statement on cost and value in professional and interprofessional education. J Interprof Care. 2017;31(1):1–4.

61. Van der Vleuten C. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ. 1996;1(1):41–67.

62. Foo J, Cook D, Walsh K, Golub R, Elhassan Abdalla M, Ilic D, Maloney S. Cost evaluations in health professions education: a systematic review of methods and reporting quality. Med Educ. 2019;53(12):1196–208.

Acknowledgements

Not applicable.

Funding

This review was not registered, nor was funding received.

Author information

Contributions

All authors contributed to the design of the systematic review including the development of the PICO question. BH, BD and DI constructed the search strategy, BH performed the search. BH and DI reviewed citations for inclusion in the review based on abstract and full-text review. BH and BD performed the data extraction. BH and BD assessed the risk of bias for included studies. BH and DI interpreted the results. All authors contributed to the writing and critical revision of the manuscript. All authors have approved the submitted version of the manuscript.

Corresponding author

Correspondence to Dragan Ilic.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

Two of the included papers in the review are authored by DI. This review builds upon a previously published systematic review by DI. Authors BH, BD and DI teach evidence-based practice in undergraduate and/or postgraduate health professions education.

Cite this article

Howard, B., Diug, B. & Ilic, D. Methods of teaching evidence-based practice: a systematic review. BMC Med Educ 22, 742 (2022). https://doi.org/10.1186/s12909-022-03812-x
