- Research article
Theory-based strategies for teaching evidence-based practice to undergraduate health students: a systematic review
BMC Medical Education volume 19, Article number: 267 (2019)
Undergraduate students across health professions are required to be capable users of evidence in their clinical practice after graduation. Gaining the essential knowledge and clinical behaviors for evidence-based practice can be enhanced by theory-based strategies. Limited evidence exists on the effect of underpinning undergraduate EBP curricula with a theoretical framework to support EBP competence. A systematic review was conducted to determine the effectiveness of EBP teaching strategies for undergraduate students, with specific focus on efficacy of theory-based strategies.
This review critically appraised and synthesized evidence on the effectiveness of EBP theory-based teaching strategies specifically for undergraduate health students on long- or short-term change in multiple outcomes, including but not limited to, EBP knowledge and attitudes. PubMed, CINAHL, Scopus, ProQuest Health, ERIC, The Campbell Collaboration, PsycINFO were searched for published studies and The New York Academy of Medicine, ProQuest Dissertations and Mednar were searched for unpublished studies. Two independent reviewers assessed studies using the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument.
Twenty-eight studies reporting EBP teaching strategies were initially selected for review, with methodological quality ranging from low to high. Studies varied in course duration, timing of delivery, population and course content. Only five included papers reported alignment with, and detail of, one or more theoretical frameworks. Theories reported included Social Cognitive Theory (one study), Rogers’ Diffusion of Innovations Theory (two studies) and Cognitive Apprenticeship Theory (one study). Cognitive Flexibility Theory and Cognitive Load Theory were discussed in two separate papers by the same authors. All but one study measured EBP knowledge. Mixed results were reported on EBP knowledge, attitudes and skills across the five studies.
EBP programs for undergraduate health students require consideration of multiple domains, including clinical behaviors, attitudes and cognitive learning processes. Interventions grounded in theory were found to have a small but positive effect on EBP attitudes. The most effective theory for developing and supporting EBP capability could not be determined by this review; therefore, additional rigorous research is required.
Evidence-based practice (EBP) education is a recommended component of undergraduate health degree courses [1,2,3] aiming to provide students with a fundamental understanding and level of EBP capability upon graduation [4, 5]. The importance of effectively teaching EBP to health students to support requirements for professional licensing and/or registration is also emphasized in the literature [5,6,7,8]. EBP educational research to date has historically focused on teaching EBP skills and knowledge to undergraduates, with lesser focus on EBP capability and/or the long-term effect of learnt skills. More specifically, despite recommendations to base EBP learning curricula on all the steps of the EBP process, many undergraduate programs focus on teaching for a level of competence in literature searching and appraisal skills, with less consideration of implementing and evaluating evidence in practice. Programs that do address all components of the EBP process are challenging as they require students to integrate the steps of the process with the conceptual model of EBP, namely the combination of best research evidence with clinical expertise and patient preference in order to provide optimal patient care [11, 12]. Other difficulties identified in regard to EBP curricula include the timing of delivery of EBP interventions [7, 13], how to support student engagement with learning EBP [7, 14], the level of clinical integration required for best learning outcomes, and the most appropriate theoretical framework for underpinning EBP interventions to support and develop EBP behaviors [10, 15].
Several systematic reviews have been conducted on the effectiveness of strategies for teaching the EBP process to postgraduate students and/or clinicians [16,17,18,19,20,21]. Young, Rohwer, Volmink, and Clarke synthesized 15 published systematic reviews and one unpublished systematic review, from 1993 to 2013, on EBP teaching strategies for a mixture of undergraduate and postgraduate student and health professional populations from medicine, nursing and allied health fields. Each included review evaluated single and/or multi-faceted educational interventions aimed at improving various EBP outcomes including, but not limited to, knowledge, critical appraisal skills, attitudes and EBP behaviors. Recommendations suggested teaching strategies should account for individual student factors, such as learning style and capability, as well as external organizational factors, such as the setting of the learning activity and delivery format. The review suggested a combination of methods (e.g. journal clubs, small group discussions, incorporating clinical scenarios, lectures) had the greatest effect on improving critical appraisal skills, EBP behaviors and knowledge.
A recent systematic review by Kyriakoulis et al. suggests that while multi-faceted interventions may support undergraduate students learning about EBP, current evidence is insufficient to confidently determine which strategy is most effective. The review, which included 20 papers reporting the use of EBP educational interventions in medicine, nursing, dentistry, pharmacy and allied health fields, suggested that multifaceted strategies, including technology and/or simulation techniques, could influence undergraduate skills, knowledge and attitudes towards EBP. Results indicated that the teaching strategies primarily focused on teaching information literacy skills (including critical appraisal), with few studies focusing on developing EBP implementation skills. Additionally, difficulty in engaging students in learning about EBP was identified. Measures to address strategies for EBP engagement are crucial in academic and clinical environments to support students in translating EBP competence to professional practice after graduation.
The challenge of implementing evidence in practice across all health professions has led to recommendations for the use of psychological and/or behavioral theory as an underpinning framework for implementation research and knowledge translation interventions [15, 22,23,24,25,26]. Theoretical constructs provide guidance for examining and understanding a concept in a manner that is generalizable, through aligning with prior work on how ideas can be organized and/or represented, as well as regarding the domains or dimensions of the concept being investigated [27, 28]. Such theoretically based interventions support extension beyond consideration of ‘what works best’ to address more in-depth understandings of why, how or when interventions may or may not be successful [29, 30]. The use of theory is recommended for complex interventions where behavior change is required [24, 29], or when trying to predict behavior change [31,32,33]. More specifically regarding EBP, some research exists incorporating Social Cognitive Theory (SCT) into interventions for promoting health professionals’ adoption of EBP, both in the clinical setting [28, 31, 34,35,36] and from an educational perspective. Evidence also exists to support the predictive power of such theories [32, 33]. Eccles and colleagues suggest intention can be an acceptable measure of subsequent behavior in health professionals, when supported by an appropriate theoretical framework. Undergraduate students’ intention to use EBP is influenced by their level of confidence and/or capability with the behaviors prior to graduation [4, 13, 38, 39], which is where theory-based programs may be effective. The question this systematic review addressed, therefore, was: “What is the effectiveness of theory-based strategies aimed at teaching the EBP process to undergraduate health students?”
Modifications to the original protocol
The original protocol for this review was published on the Joanna Briggs Institute database as well as on the PROSPERO register (CRD42015019032). Initially, the review aimed to identify the overall effectiveness of EBP teaching strategies for undergraduate health students; however, prior to completion of our original review, another systematic review was published on this topic. Considering the findings of that review, as well as other recent literature specifically on undergraduate EBP education [10, 13, 39], a pragmatic decision was made to look critically at the selected studies and focus on those that reported underpinning their interventions in theory. Considering the potential impact theoretical constructs can have on behavior change [27, 31], as well as the association between student capability and intention to use EBP after graduation, investigating any effect these types of interventions may have on students’ EBP skills, knowledge and other specified outcomes could identify strategies that further support EBP capability.
Inclusion and exclusion criteria
For this review, an undergraduate student was defined as one who is completing their first formal university degree training for their particular discipline; however, it is acknowledged that there are some differences globally in teaching courses, nomenclature and durations for different health disciplines, which may limit the synthesis of results. Included studies identified some or all of the five steps of the EBP process as outlined by Sackett et al. Experimental or comparative studies were considered for inclusion if they reported on any pedagogical and/or psychological theory as part of their intervention. As per our original protocol, outcomes of interest included EBP behavior, knowledge, skills, attitudes, self-efficacy (or self-confidence), beliefs, values and EBP use or future use.
Databases searched include: PubMed, CINAHL, Scopus, ProQuest Health (including ProQuest Health and Medical Complete, ProQuest Nursing and Allied Health), ERIC, the Campbell Collaboration, PsycINFO and Science Direct. Unpublished studies were searched within The New York Academy of Medicine, ProQuest Dissertations and Mednar. Due to limited resources for translation, only studies published in English were sought. The initial search strategy, undertaken in July 2015 was updated in December 2016. Relevant published systematic reviews were hand searched [6, 7, 10, 20, 41] and any individual study that met inclusion criteria was retrieved. Published research arising from included dissertations was also sought. The initial search strategy for PubMed is included as (Additional file 1).
Two reviewers independently verified papers for inclusion, and two independent reviewers assessed selected studies for methodological validity prior to final inclusion in the review, using the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument (JBI-MAStARI) for randomized controlled trials or one-group quasi-experimental studies, depending on study design. The instruments address risk of bias in specific aspects of the study methods, such as randomization, blinding, sampling and reporting. Any disagreements that arose between reviewers were resolved through consensus or with a third reviewer. Papers reporting educational interventions are known to be of varying and frequently low quality; therefore, a minimum cut-off score of 3/10 was agreed upon for inclusion. However, all papers that based their teaching strategy in theory were included for analysis in this review.
Data extraction and synthesis
Two phases of data extraction were undertaken. Firstly, specific details were extracted of the intervention, geographical location, population, study design, methods and outcomes of significance to the review question and specific objectives, including details of the underpinning theory. Secondly, data extraction of the actual results of interventions, including statistical data was conducted. Heterogeneity in interventions, outcomes and outcome measurement tools, across and within studies, prevented meta-analysis, therefore a narrative and tabular analysis is presented.
Description of studies and appraisal process
The initial search identified 2696 studies. After removing duplicates, non-English and out-of-date-range studies, the titles and abstracts of 2371 articles were examined. From these, 148 full-text studies were retrieved. Reasons for exclusion at this stage were that the articles did not fit the systematic review criteria; for example, they were not specific to undergraduate students or were not empirical research studies. Verification of these studies by two reviewers (MR, EBA, ACh, ACo or DL) identified 34 published studies reporting on interventions for teaching EBP to undergraduates. These papers were assessed for quality, with 28 being included initially. Reasons for the six studies being excluded at this stage included not addressing the outcomes of interest or insufficient statistical analysis. Following revision of the aim of the review (refer to the Methods section), further examination of appraised studies identified five papers that based their teaching intervention on theory. These five studies became the primary focus of this paper, with the aim of examining the effect of the theory-based teaching intervention on reported outcomes. A summary of the study details and components of the 23 non-theory-based studies is attached as Additional file 2. The full search and selection process is outlined in the PRISMA flowchart (Fig. 1). Risk of bias was identified across the five studies in areas regarding randomization, blinding and group allocation. Results of appraisal scores are presented in Additional file 3.
Two of the five included studies used quasi-experimental designs [45, 46], comparing their intervention to a control group who did not receive the intervention. Two other studies reported pre-/post-test results without a control group [47, 48], and the final study used a mixed-methods design. The mixed-methods study comprised three study arms with quantitative and qualitative designs for testing the intervention, but one of these arms comprised post-registration doctoral students and was therefore not included in the analysis for this review. Sample sizes ranged from 80 to 259, with a total of 933 participants. The studies included medicine, nursing and nutrition students across different academic years. Overall, the duration of the included interventions ranged from 10 sessions to 15 months, and the interventions comprised techniques including didactic lectures, small group discussions, facilitated workshops and problem-based learning activities. Greater detail of the EBP interventions can be seen in Table 1.
Findings of the review
Theories and intervention details presented in included studies
The intervention by Kim et al. was reported to be based upon two theories: Bandura’s self-efficacy construct from SCT [50, 51] and Rogers’ Theory of Diffusion of Innovations. Bandura’s theory was addressed in the multiple regression modeling component of their study, where students were asked to rate their confidence with making clinical decisions. Greater detail was presented regarding the second theory reported in this study, Rogers’ Theory of Diffusion of Innovations, which proposes that new ideas can be built over time and, through following a series of steps, be shared and adopted by others. One specific example identified in the study was the use of an interactive assignment, which aligned with Rogers’ stage of adopting an innovation through social collaboration.
Ashktorab et al. also grounded their intervention in Rogers’ Theory of Diffusion of Innovations and clearly reported each stage of the intervention according to the five stages of Rogers’ theory. For example, the knowledge acquisition phase was addressed through the provision of ten educational sessions with PowerPoint presentations and question-and-answer discussion sessions.
Long et al. used Cognitive Apprenticeship Theory (CAT) for their EBP teaching strategy. This theory posits that social interactions between the learner and the expert form a base for further cognitive development. Learning is accomplished through teaching techniques such as scaffolding, observation, modeling, mentoring, reflection and participation. Such techniques gradually support learners and encourage them to delve further into their learning experience. As part of the supplementary material for the paper, the authors included a hypothesized model of four elements of CAT (scaffolding, exploring, articulating and reflecting) and the strategies used to link the theory to the study intervention. For example, opportunities to practice skills were linked to the reflection component of the theory.
Liabsuetrakul et al. reported two studies, referring to Cognitive Load Theory (CLT) in one study and Cognitive Flexibility Theory (CFT) in the other. Although these are two different theories, both studies suggested that teaching techniques such as small group discussion, self-directed work and problem-based learning principles, along with the integration of clinical scenarios, supported the theoretical principles; however, attribution of individual techniques to specific elements of the proposed theories was not detailed.
Reported outcome measures included EBM/EBP behaviors, knowledge, skills, attitudes, self-efficacy (or self-confidence), beliefs, values, EBP use or future use. A more detailed tabular summary of the statistical results is presented in Table 2.
EBP knowledge was measured in three of the included studies [45, 46, 48]. Small to moderate significant increases in EBP knowledge scores were reported in two studies of nursing students, by Kim et al. (mean difference = 0.25; p = 0.001) and Ashktorab et al. (intervention group mean score 45.2, SD = 3.89; control group mean score 31, SD = 7.05; paired t-test, p < 0.0001). These scores were measured at completion of the intervention. Liabsuetrakul et al. measured knowledge scores in medical students one week after completion of their intervention, using an eight-item summative assessment. Significant improvements were noted from pre-test to post-test scores (p < 0.001).
Four of the five studies measured EBP attitudes [45,46,47,48], with significant improvements noted in three of these studies [45, 47, 48]. Two studies measured immediate short-term changes in attitudes following their interventions [45, 46]. Ashktorab et al. reported no significant difference in EBP attitudes between control and intervention groups at baseline, but a significant difference between groups after delivery of the EBP intervention to nursing students (p < 0.0001). Kim et al. reported no significant difference between groups for EBP attitudes (mean difference = 0.12; p = 0.398), with the authors suggesting that delivering their intervention over a longer time may influence results. Liabsuetrakul et al. measured attitudes over a longer duration in both studies [47, 48]. A fluctuation of effect was noted, with a significant increase at week one (p < 0.001) [47, 48], followed by a slight decrease in scores at weeks five and 13, but an overall significant increase in scores from baseline at 37 weeks following the intervention (p = 0.007). Such results suggest time could be a factor in developing and/or sustaining positive EBP attitudes throughout the undergraduate curriculum.
Long et al. measured ‘overall research skills’ using a web-based tool that assessed evidence searching and appraisal skills. This measurement was recorded via self-report on a Likert scale question developed from the Research Readiness Self-Assessment tool. Significant improvement from pre-test to post-test results was noted in nursing students using the tool (p = 0.001), as well as in the second arm of the study, an RCT comprising intervention and control groups of undergraduate students studying nutrition (p = 0.002). Liabsuetrakul et al. measured EBM skills in both studies [47, 48] through student self-reported answers on a Likert scale developed by the researchers. Fluctuations were again noted from baseline to different time points. Overall scores for EBM skills were significantly higher from baseline to week one (p < 0.001) [47, 48] and at 37 weeks post intervention (p = 0.003), after students were given more time to reflect and conduct some individual learning.
EBP use and EBP future use
Only one study measured outcomes of EBP use and EBP future use, using a validated tool developed by Johnston et al. The tool relies on student self-report but has high reliability and validity measures and has been tested in other studies of undergraduate students’ EBP [59,60,61]. A small but significant difference between intervention and control groups was reported for EBP use (mean difference = 0.26, p = 0.015); however, no significant difference between groups was reported for EBP future use (mean difference = 0.13, p = 0.255).
Other outcome measures
None of the included studies specifically measured outcomes of EBP self-efficacy, confidence or capability. Long et al. measured an overall outcome of student ‘ability to distinguish credibility of online sources’ by measuring student responses to questions built into their web-based intervention. A non-significant difference (p = 0.70) between pre-test and post-test results was reported from the nursing arm of the study. In the second arm of the study, the nutrition students did report a significant difference between intervention and control groups after using the technology (p = 0.39). It was unclear if there were significant differences at baseline within or across the groups.
This systematic review aimed to identify the effectiveness of theory-based interventions designed to improve undergraduate health students’ EBP. Effective learning of EBP requires consideration of cognitive, affective, behavioral and environmental elements, which is where theory-based interventions could be of value; however, this review has identified that no single theory is yet aligned with EBP teaching and learning. Due to heterogeneity in the theories reported, populations and interventions, it was not possible to confidently determine in this review which theory was most effective for improving student EBP knowledge, skills, attitudes or other domains. However, the review has identified some common elements influential to undergraduate EBP success in some domains, which require further exploration.
While each of the theories utilized in the studies had a different focus, some overlapping concepts were noted. Social and environmental influences were noted in studies that used small groups and strategies for sharing evidence [45,46,47]. Such methods have been aligned with constructivist pedagogy and problem-based learning strategies [62, 63]. Learning is a social process, and for undergraduate students, who are now more socially connected and technology aware, the power of social influence on successful learning must be considered. Such influences are also recognized, for example, in Bandura’s Social Learning Theory (a precursor to SCT) as affecting one’s self-efficacy to adopt certain behaviors. EBP requires a level of cognitive ability as well as adoption of learnt behaviors; therefore, learning programs that acknowledge and accommodate social influences in both clinical and academic environments may be powerful in supporting students’ successful accomplishment of EBP skills.
Mixed results regarding changes in EBP knowledge were reported in the included studies. Only one included study measured EBP knowledge via a summative assessment, with the other studies reporting short-term change in self-reported knowledge immediately following delivery of the EBP intervention. Measuring change in EBP knowledge has been a focal point of EBP interventions for many years, with emphasis on the first three steps of the EBP process [6, 10, 65]. Undergraduate students require fundamental knowledge of these steps; however, without strategies to improve students’ EBP attitudes and capability, it may be that over time students feel less encouraged to use EBP in their respective clinical environments. Monitoring changes over time, particularly on transition to professional practice, was beyond the scope of this review but is suggested for future research.
The impact of role modeling on EBP behavior was acknowledged to varying degrees in three studies [46, 48, 49], and even though each of the studies included in the review used a different theoretical framework, there is growing support for consideration of role modeling in EBP education due to its positive impact on EBP beliefs and subsequent EBP behavior [66,67,68]. While role models in both academic and clinical areas are important, facilitators who can specifically support students by demonstrating how EBP knowledge learnt in the academic setting can be used in clinical contexts have a critical role in EBP education across health disciplines [69,70,71]. Without seeing EBP in practice, it can be difficult for undergraduate students across all disciplines to assimilate the components being taught and their relevance to future work.
The studies identified that students need time for reflection in order to assimilate their knowledge into practice and to develop positive EBP attitudes. The two studies reporting no significant difference in EBP attitudes [45, 46] measured attitudes immediately after the intervention, while Liabsuetrakul et al. found improvement in EBP attitudes over time [47, 48]. Social psychology [72, 73] indicates that interventions for changing attitudes need to address affective, behavioral and cognitive components, and that such change is more likely to occur over the longer rather than the shorter term. EBP interventions targeting attitudes in the short term are thus less likely to find significant improvement in attitudes towards EBP, as students require time to assimilate knowledge and influences from clinical and academic environments. Teaching strategies incorporating regular feedback, opportunities to practice skills and consideration of repeated or continuous strategies have been suggested as ways to improve student engagement and facilitate sustained change.
Verbal persuasion (feedback), mastery of skills and vicarious experiences (role modeling) are three of the four sources proposed by Bandura to promote self-efficacy for a specific task [50, 51]. SCT also proposes that individuals with higher self-efficacy for a specific activity will be more motivated to perform that activity [46, 50, 51]. While there is insufficient evidence in this review to suggest SCT is the most effective theory for underpinning undergraduate EBP interventions, elements of the theory as discussed above have been reported in the literature [4, 66, 68] as well as in the included studies [46, 48, 49]. Further consideration of these elements within teaching strategies in EBP curricula is suggested for supporting student EBP self-efficacy and subsequent capability.
Synthesizing educational interventions presents many methodological challenges and, consequently, there are several limitations to this review. Our initial search was targeted to find EBP teaching strategies for undergraduate students and retrieved a large number of papers, which were carefully screened. It is unlikely but feasible that the decision to focus on the secondary aim of the review may have resulted in some specific theory-based papers being missed. Variation in international nomenclature for types of student and health professional categories is another limitation of the search process, as is the rapid expansion of studies being published in the field of EBP education. Although some repetition of reviews is acceptable for confirming results or uncovering different perspectives on a topic, following publication of recent reviews [7, 39] and advice from peer reviewers, we chose to focus on an aspect of the interventions which had not yet been addressed. We did not change the outcomes we were investigating; rather, we synthesized the theoretical components of EBP educational interventions that were reported in studies obtained from our initial search. Modifications from original protocols are not uncommon [76, 77]; however, we acknowledge the impact this may have on the certainty of the findings. The review does, however, present elements which can be explored further by EBP educators for supporting successful EBP learning and behavior adoption. A solid theoretical base provides a standardized platform for delivering an intervention, which can subsequently aid in maintaining intervention fidelity despite the need for any contextual adaptations.
EBP educational interventions for undergraduate health students are complex due to the cognitive and behavioral components necessary for success. Consequently, consideration of multiple domains, including clinical behaviors, attitudes and cognitive learning processes, is required. Despite the requirements for undergraduate students to be capable EBP users after they graduate, and the call for EBP education to be specific to the intended audience, the literature identifies limited theory-based evidence directed at undergraduate EBP education with a focus on preparing students to build capability and confidently use evidence in their professional practice.
Of the included studies, interventions grounded in theory were found to have a small but positive effect on EBP attitudes. Other common components were identified relating to the time needed for learning as well as role modeling. Although this review was not able to determine the overall effect of these factors on specific outcomes, due to heterogeneity in interventions, outcomes and measures within and across the studies, such components, and their subsequent influence on EBP capability, require further investigation. Further research scoping the literature on undergraduate EBP curricula and underpinning theory is suggested.
Availability of data and materials
As this manuscript is a systematic review, all data generated or analyzed during this study are included in this published article or within the supplementary files. Further detail is available from the corresponding author on reasonable request.
Glasziou P, Burls A, Gilbert R. Evidence based medicine and the medical curriculum. BMJ (Clinical research ed). 2008;337(7672):704–5.
McEvoy MP, Williams MT, Olds TS. Evidence based practice profiles: differences among allied health professions. BMC Med Educ. 2010;10(1):69.
Kohn LT, Corrigan JM, Donaldson MS. To Err Is Human: Building a Safer Health System, vol. 627. Washington (DC): National Academies Press; 2000.
Forsman H, Wallin L, Gustavsson P, Rudman A. Nursing students’ intentions to use research as a predictor of use one year post graduation: a prospective study. Int J Nurs Stud. 2012;49(9):1155–64.
Dawes M, Summerskill W, Glasziou P, Cartabellotta A, Martin J, Hopayian K, Porzsolt F, Burls A, Osborne J. Sicily statement on evidence-based practice. BMC Med Educ. 2005;5:1.
Young T, Rohwer A, Volmink J, Clarke M. What are the effects of teaching evidence-based health care (EBHC)? Overview of Systematic Reviews. PLOS One. 2014;9(1):e86706.
Kyriakoulis K, Patelarou A, Laliotis A, Wan AC, Matalliotakis M, Tsiou C, Patelarou E. Educational strategies for teaching evidence-based practice to undergraduate health students: systematic review. J Educ Eval Health Prof. 2016;13:1-10. https://doi.org/10.3352/jeehp.2016.13.34.
McEvoy MP, Crilly M, Young T, Farrelly J, Lewis LK. How comprehensively is evidence-based practice represented in Australian health professional accreditation documents? A systematic audit. Teach Learn Med. 2016;28(1):26–34.
Fraser SW, Greenhalgh T. Complexity science: coping with complexity: educating for capability. BMJ. 2001;323(7316):799–803.
Aglen B. Pedagogical strategies to teach bachelor students evidence-based practice: a systematic review. Nurse Educ Today. 2016;36:255–63.
Sackett D, Straus S, Richardson W, Rosenberg W, Haynes R. Evidence-based medicine: how to practice and teach EBM. Edinburgh, Scotland: Churchill Livingstone; 2000.
Sackett DL, Rosenberg W, Gray J, Haynes RB, Richardson WS. Evidence based medicine: what it is and what it isn't. BMJ. 1996;312:71-2.
Scurlock-Evans L, Upton P, Rouse J, Upton D. To embed or not to embed? A longitudinal study exploring the impact of curriculum design on the evidence-based practice profiles of UK pre-registration nursing students. Nurse Educ Today. 2017;58:12–8.
Johnson N, List-Ivankovic J, Eboh WO, Ireland J, Adams D, Mowatt E, Martindale S. Research and evidence based practice: using a blended approach to teaching and learning in undergraduate nurse education. Nurse Educ Pract. 2010;10(1):43–7.
Thomas A, Saroyan A, Dauphinee WD. Evidence-based practice: a review of theoretical assumptions and effectiveness of teaching and assessment interventions in health professions. Adv Health Sci Educ. 2011;16(2):253–76.
Coomarasamy A, Khan K. What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. BMJ. 2004;329(7473):1017.
Coomarasamy A, Taylor R, Khan K. A systematic review of postgraduate teaching in evidence-based medicine and critical appraisal. Med Teach. 2003;25(1):77–81.
Flores-Mateo G, Argimon J. Evidence based practice in postgraduate healthcare education: a systematic review. BMC Health Serv Res. 2007;7(1):119.
Horsley T, Hyde C, Santesso N, Parkes J, Milne R, Stewart R. Teaching critical appraisal skills in healthcare settings (review). Cochrane Database Syst Rev. 2011;(11). Art. No. CD001270. https://doi.org/10.1002/14651858.CD001270.pub2.
Ilic D, Maloney S. Methods of teaching medical trainees evidence-based medicine: a systematic review. Med Educ. 2014;48(2):124–35.
Norman G, Shannon S. Effectiveness of instruction in critical appraisal (evidence-based medicine) skills: a critical appraisal. CMAJ. 1998;158(2):177–81.
Greenhalgh T, Howick J, Maskrey N. Evidence based medicine: a movement in crisis? BMJ. 2014;348:g3725. https://doi.org/10.1136/bmj.g3725.
Davies P, Walker AE, Grimshaw JM. A systematic review of the use of theory in the design of guideline dissemination and implementation strategies and interpretation of the results of rigorous evaluations. Implement Sci. 2010;5(1):14.
Michie S, Johnston M, Abraham C, Lawton R, Parker D, Walker A. Making psychological theory useful for implementing evidence based practice: a consensus approach. Qual Saf Health Care. 2005;14(1):26–33.
Grol RP, Bosch MC, Hulscher ME, Eccles MP, Wensing M. Planning and studying improvement in patient care: the use of theoretical perspectives. Milbank Q. 2007;85(1):93–138.
Thomas A, Menon A, Boruff J, Rodriguez AM, Ahmed S. Applications of social constructivist learning theories in knowledge translation for healthcare professionals: a scoping review. Implement Sci. 2014;9(1):54.
Improved Clinical Effectiveness through Behavioural Research Group. Designing theoretically-informed implementation interventions. Implement Sci. 2006;1(1):4.
Eccles MP, Grimshaw JM, Johnston M, Steen N, Pitts NB, Thomas R, Glidewell E, Maclennan G, Bonetti D, Walker A. Applying psychological theories to evidence-based clinical practice: identifying factors predictive of managing upper respiratory tract infections without antibiotics. Implement Sci. 2007;2(1):26.
Davis R, Campbell R, Hildon Z, Hobbs L, Michie S. Theories of behaviour and behaviour change across the social and behavioural sciences: a scoping review. Health Psychol Rev. 2015;9(3):323–44.
Hawe P, Shiell A, Riley T. Theorising interventions as events in systems. Am J Community Psychol. 2009;43(3–4):267–76.
Eccles MP, Hrisos S, Francis J, Kaner EF, Dickinson HO, Beyer F, Johnston M. Do self-reported intentions predict clinicians' behaviour: a systematic review. Implement Sci. 2006;1(1):28.
Bandura A. The explanatory and predictive scope of self-efficacy theory. J Soc Clin Psychol. 1986;4(3):359–73.
Godin G, Bélanger-Gravel A, Eccles M, Grimshaw J. Healthcare professionals' intentions and behaviours: a systematic review of studies based on social cognitive theories. Implement Sci. 2008;3(36):1–12.
Grimshaw JM, Eccles MP, Steen N, Johnston M, Pitts NB, Glidewell L, Maclennan G, Thomas R, Bonetti D, Walker A. Applying psychological theories to evidence-based clinical practice: identifying factors predictive of lumbar spine x-ray for low back pain in UK primary care practice. Implement Sci. 2011;6(1):55.
Munro S, Lewin S, Swart T, Volmink J. A review of health behaviour theories: how useful are these for developing interventions to promote long-term medication adherence for TB and HIV/AIDS? BMC Public Health. 2007;7(1):104.
Wilkinson SA, Hinchliffe F, Hough J, Chang AM. Baseline evidence-based practice use, knowledge, and attitudes of allied health professionals: a survey to inform staff training and organisational change. J Allied Health. 2012;41(4):177–84.
Tilson J, Kaplan SL, Harris JL, Hutchinson A, Ilic D, Niederman R, Potomkova J, Zwolsman SE. Sicily statement on classification and development of evidence-based practice learning assessment tools. BMC Med Educ. 2011;11(1):78.
Brown CE, Kim SC, Stichler JF, Fields W. Predictors of knowledge, attitudes, use and future use of evidence-based practice among baccalaureate nursing students at two universities. Nurse Educ Today. 2010;30(6):521–7.
Ramis M-A, Chang A, Nissen L. Undergraduate health students’ intention to use evidence-based practice after graduation: a systematic review of predictive modeling studies. Worldviews Evid-Based Nurs. 2017:1–9.
Ramis M-A, Chang A, Nissen L. Strategies for teaching evidence-based practice to undergraduate health students: a systematic review protocol. JBI Database System Rev Implement Rep. 2015;13(2):12-25.
Ahmadi N, McKenzie ME, MacLean A, Brown CJ, Mastracci T, McLeod RS. Teaching evidence based medicine to surgery residents: is journal club the best format? A systematic review of the literature. J Surg Educ. 2012;69(1):91–100.
Tufanaru C, Munn Z, Aromataris E, Campbell J, Hopp L. Chapter 3: Systematic reviews of effectiveness. In: Aromataris E, Munn Z, editors. Joanna Briggs Institute Reviewer's Manual. Adelaide: The Joanna Briggs Institute; 2017. Available from https://reviewersmanual.joannabriggs.org/.
Cook DA, Beckman TJ, Bordage G. Quality of reporting of experimental studies in medical education: a systematic review. Med Educ. 2007;41(8):737–45.
Moher D, Liberati A, Tetzlaff J, Altman DG. The PRISMA group: preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6(6):e1000097.
Ashktorab T, Pashaepour S, Rassouli M, Alavi Majd H. The effectiveness of evidence-based practice education in nursing students based on Roger's diffusion of innovation model. Middle East J Sci Res. 2014;16(5):684–91.
Kim SC, Brown CE, Fields W, Stichler JF. Evidence-based practice-focused interactive teaching strategy: a controlled study. J Adv Nurs. 2009;65(6):1218–27.
Liabsuetrakul T, Suntharasaj T, Tangtrakulwanich B, Uakritdathikarn T, Pornsawat P. Longitudinal analysis of integrating evidence-based medicine into a medical student curriculum. Fam Med. 2009;41(8):585–8.
Liabsuetrakul T, Sirirak T, Boonyapipat S, Pornsawat P. Effect of continuous education for evidence-based medicine practice on knowledge, attitudes and skills of medical students. J Eval Clin Pract. 2013;19(4):607–11.
Long JD, Gannaway P, Ford C, Doumit R, Zeeni N, Sukkarieh-Haraty O, Milane A, Byers B, Harrison L, Hatch D. Effectiveness of a technology-based intervention to teach evidence-based practice: the EBR tool. Worldviews Evid-Based Nurs. 2016;13(1):59–65.
Bandura A. Self-efficacy: toward a unifying theory of behavioral change. Psychol Rev. 1977;84(2):191–215.
Bandura A. Self-efficacy: the exercise of control. New York: W.H. Freeman and Company; 1997.
Rogers EM. Diffusion of innovations. 4th ed. New York: Simon and Schuster; 2010.
Dennen VP, Burner KJ. The cognitive apprenticeship model in educational practice. In: Spector LM, Merrill MD, van Merrienboer J, Driscoll MP, editors. Handbook of research on educational communications and technology. 3rd ed. New York: Taylor & Francis; 2008. p. 425–39.
Sweller J. Cognitive load theory, learning difficulty, and instructional design. Learn Instr. 1994;4(4):295–312.
Patel VL, Yoskowitz NA, Arocha JF. Towards effective evaluation and reform in medical education: a cognitive and learning sciences perspective. Adv Health Sci Educ. 2009;14(5):791–812.
Rubin A, Parrish DE. Development and validation of the evidence-based practice process assessment scale: preliminary findings. Res Soc Work Pract. 2010;20(6):629–40.
Johnston JM, Leung GM, Fielding R, Tin KYK, Ho L. The development and validation of a knowledge, attitude and behaviour questionnaire to assess undergraduate evidence-based practice teaching and learning. Med Educ. 2003;37(11):992–1000.
Ivanitskaya LV, Hanisko KA, Garrison JA, Janson SJ, Vibbert D. Developing health information literacy: a needs analysis from the perspective of preprofessional health students. J Med Libr Assoc. 2012;100(4):277.
Cheng HM, Guo FR, Hsu TF, Chuang SY, Yen HT, Lee FY, Yang YY, Chen TL, Lee WS, Chuang CL. Two strategies to intensify evidence-based medicine education of undergraduate students: a randomised controlled trial. Ann Acad Med Singap. 2012;41(1):4–11.
Ma X, Xu B, Liu Q, Zhang Y, Xiong H, Li Y. Effectiveness of evidence-based medicine training for undergraduate students at a Chinese Military Medical University: a self-controlled trial. BMC Med Educ. 2014;14(133):133.
Widyahening IS, van der Heijden GJ, Moy FM, van der Graaf Y, Sastroasmoro S, Bulgiba A. Direct short-term effects of EBP teaching: change in knowledge, not in attitude; a cross-cultural comparison among students from European and Asian medical schools. Med Educ Online. 2012;17.
Alt D. Assessing the contribution of a constructivist learning environment to academic self-efficacy in higher education. Learn Environ Res. 2015;18(1):47–67.
Kilgour JM, Grundy L, Monrouxe LV. A rapid review of the factors affecting healthcare students' satisfaction with small-group, active learning methods. Teach Learn Med. 2016;28(1):15–25.
Lim D, Hou X-Y, Tippett V. Teaching epidemiology to undergraduate paramedics. Australas Epidemiol. 2016;23(1):24–6.
Phillips AC, Lewis LK, McEvoy MP, Galipeau J, Glasziou P, Hammick M, Moher D, Tilson JK, Williams MT. A systematic review of how studies describe educational interventions for evidence-based practice: stage 1 of the development of a reporting guideline. BMC Med Educ. 2014;14(1):152.
Spek B, Wieringa-de Waard M, Lucas C, Dijk N. Teaching evidence-based practice (EBP) to speech-language therapy students: are students competent and confident EBP users? Int J Lang Commun Disord. 2013;48(4):444–52.
Florin J, Ehrenberg A, Wallin L, Gustavsson P. Educational support for research utilization and capability beliefs regarding evidence-based practice skills: a national survey of senior nursing students. J Adv Nurs. 2012;68(4):888–97.
Gloudemans H, Schalk R, Reynaert W, Braeken J. The development and validation of a five-factor model of sources of self-efficacy in clinical nursing education. J Nurs Educ Pract. 2012;3(3):80.
Bozzolan M, Simoni G, Balboni M, Fiorini F, Bombardi S, Bertin N, Da Roit M. Undergraduate physiotherapy students' competencies, attitudes and perceptions after integrated educational pathways in evidence-based practice: a mixed methods study. Physiother Theory Pract. 2014;30(8):557–71.
Melnyk BM. The evidence-based practice Mentor: a promising strategy for implementing and sustaining EBP in healthcare systems. Worldviews Evid-Based Nurs. 2007;4(3):123–5.
Olsen NR, Lygren H, Espehaug B, Nortvedt MW, Bradley P, Bjordal JM. Evidence-based practice exposure and physiotherapy students' behaviour during clinical placements: a survey. Physiother Res Int. 2014;19(4):238–47.
Breckler SJ. Empirical validation of affect, behavior, and cognition as distinct components of attitude. J Pers Soc Psychol. 1984;47(6):1191.
Eagly AH, Chaiken S. The psychology of attitudes. Fort Worth: Harcourt Brace Jovanovich College Publishers; 1993.
Reed D, Price E, Windish D, Wright S, Gozu A, Hsu E, Beach M, Kern D, Bass E. Challenges in systematic reviews of educational intervention studies. Ann Intern Med. 2005;142:1080–9.
Petticrew M. Time to rethink the systematic review catechism? Moving from ‘what works’ to ‘what happens’. Syst Rev. 2015;4(1):36.
Kirkham JJ, Altman DG, Williamson PR. Bias due to changes in specified outcomes during the systematic review process. PLoS One. 2010;5(3):e9810.
Silagy CA, Middleton P, Hopewell S. Publishing protocols of systematic reviews: comparing what was done to what was planned. JAMA. 2002;287(21):2831.
Acknowledgements
Thank you to Peter Sondergeld, QUT Health Librarian, for advice on the search strategy for the review. Thank you also to Elia Barajas Alonso for assistance with critical appraisal.
Ethics approval and consent to participate
Not applicable for this systematic review.
Funding
No funding was directly attributable to this review.
Consent for publication
Not applicable.
Competing interests
The authors declare that they have no competing interests.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.