Adopting a blended learning approach to teaching evidence based medicine: a mixed methods study
© Ilic et al.; licensee BioMed Central Ltd. 2013
Received: 19 September 2013
Accepted: 12 December 2013
Published: 17 December 2013
Evidence Based Medicine (EBM) is a core unit delivered across many medical schools, yet few studies have investigated the most effective method of teaching an EBM course to medical students. The objective of this study was to identify whether a blended-learning approach to teaching EBM is more effective than a didactic approach at increasing medical student competency in EBM.
A mixed-methods study was conducted consisting of a controlled trial and focus groups with second year graduate medical students. Students received the EBM course delivered using either a didactic approach (DID) to learning EBM or a blended-learning approach (BL). Student competency in EBM was assessed using the Berlin tool and a criterion-based assessment task, with student perceptions on the interventions assessed qualitatively.
A total of 61 students (85.9%) participated in the study. Competency in EBM did not differ between the groups when assessed using the Berlin tool (p = 0.29). Students using the BL approach performed significantly better in one of the criterion-based assessment tasks (p = 0.01) and reported significantly higher self-perceived competence in critical appraisal skills. Qualitative analysis identified that students had a preference for the EBM course to be delivered using the BL approach.
Implementing a blended-learning approach to EBM teaching promotes greater student appreciation of EBM principles within the clinical setting. Integrating a variety of teaching modalities and approaches can increase student self-confidence and assist in bridging the gap between the theory and practice of EBM.
Asking a clinical question that is constructed using the PICO (patient, intervention, comparison, outcome) framework;
Acquiring the evidence via a systematic and efficient search of the literature;
Appraising the evidence through the application of critical appraisal techniques;
Applying the evidence to the clinical scenario; and,
Assessing the EBM process as it relates to the clinical context.
Each step within the EBM process requires a different level of knowledge and skill (i.e. competence) from the user. A high level of competency in EBM can only be achieved when the user is able to effectively undertake all five steps, which incorporate adequate levels of knowledge, skills and behavioural elements. Achieving competency in the principles of EBM equips the user for lifelong learning within the clinical setting.
Learning is influenced by a variety of factors including the student, teacher, course/curriculum and educational environment. In creating a supportive educational environment, educators must consider the different learning styles preferred by students, including visual, auditory, kinaesthetic, procedural, or a combination of these. Continuing medical education has traditionally been facilitated through didactic lectures [7, 8]. Recent educational research has shifted the focus to self-directed and adult educational pedagogies, delivered through a variety of modalities (lectures, interactive workshops, practice-based interventions, problem-based learning and simulation through eLearning), for optimal educational outcomes [7–12].
Limited research has been conducted into the best method of teaching EBM. A 2004 systematic review identified two randomised controlled trials (RCTs) and seven non-RCTs that examined the impact of post-graduate teaching in EBM. The authors concluded that standalone teaching improved student knowledge, but not skills, attitudes or behaviour in EBM. Conversely, integrating the teaching of EBM with clinical activities resulted in improvements across all four outcomes.
Few rigorous studies have explored methods of teaching EBM to medical students. A 2005 RCT compared computer-assisted self-directed learning with workshops in EBM among undergraduate medical students, and found no difference in student knowledge, skills or attitudes toward EBM across the two interventions. Conversely, a 2010 study with medical undergraduates assessed the integration of online learning of EBM skills with clerkships during the third year of study. Using a before-and-after methodology, it identified that student competency in EBM was significantly improved over the duration of the course.
A 2008 RCT explored the impact of teaching EBM using a computer-based approach compared to traditional didactic lectures with medical undergraduate students. The study demonstrated equivalent EBM knowledge and attitude scores between students who received the computer-based intervention and students receiving the course via didactic lectures. These findings were also reflected in an earlier study with medical post-graduates exploring the same delivery modes. More recently, an RCT demonstrated that teaching EBM via a case conference resulted in significantly higher knowledge and personal application of EBM-related content in final year medical students, compared to those receiving the same information in a didactic format.
Over the last decade many medical schools have reduced the amount of didactic teaching and implemented a problem-based learning (PBL) approach to teaching clinical skills to medical students. Within this context, a PBL approach utilises authentic clinical queries, from which students use their existing knowledge to explore and construct new knowledge, skills, attitudes and behaviours. Implementing a PBL, or case-based, approach within a medical curriculum provides an opportunity for students to contextualise their learning within the clinical environment. A 2009 RCT examined the effectiveness of delivering an EBM course to medical students using a PBL approach compared to usual teaching methods (lecture plus tutorial). This RCT identified that the PBL approach was less effective than usual teaching at improving student knowledge in EBM, but was more effective at increasing positive attitudes toward EBM.
An extension of the PBL approach to teaching EBM is blended-learning. Utilising a PBL approach to teaching EBM attempts to add an element of ‘clinical realism’ to the case. In blended-learning, digital technology and other ‘non-traditional’ teaching methods are integrated to add greater flexibility to the teaching curriculum, but also to account for the differing learning styles exhibited by students [21, 22]. Relatively few studies have empirically examined the effectiveness of blended-learning in medicine, with all studies focusing on its impact in a clinical discipline. The published studies commonly report increased student satisfaction with the content, better use of time in class, increased knowledge and the promotion of self-directed learning [23–25].
Currently there is a lack of consensus within the medical literature as to the most effective method of teaching medical students the principles of EBM. The overall aim of this study was to identify whether a blended-learning approach to teaching EBM was more effective than a didactic-learning approach at increasing medical student competency in EBM. Student perceptions regarding the strengths and limitations of each mode of delivery were also sought.
A mixed methods approach consisting of a controlled trial and focus groups was adopted for this study.
Study design and setting
Table 1: Overview of the EBM course content
Key EBM content covered
1. Introduction to EBM
• Rationale for EBM in medicine
• How to construct a clinical question
2. Searching the medical literature
• Overview of relevant medical and healthcare databases
• How to construct a search strategy
3. Biostatistics
• Overview of biostatistical concepts including:
▪ Categorical versus numerical data
▪ Use of appropriate statistical analysis
4. Critical appraisal of studies of therapy (part 1)
• Introduction to RCTs
• Measures of effect (relative risk, number needed to treat, absolute risk)
• P-values and confidence intervals
• Critical appraisal techniques for studies of therapy
5. Critical appraisal of studies of therapy (part 2)
• Continuation of session 4.
6. Critical appraisal of studies of harm (part 1)
• Introduction to cohort studies
• Measures of effect (Odds ratios and number needed to harm)
• Critical appraisal techniques for studies of harm (specific to cohort studies)
7. Critical appraisal of studies of harm (part 2)
• Introduction to case–control studies
• Measures of effect (Odds ratios and number needed to harm)
• Critical appraisal techniques for studies of harm (specific to case–control studies)
8. Critical appraisal of studies of diagnosis
• Overview of concepts specific to diagnosis including:
▪ Sensitivity and specificity
▪ Positive and negative predictive values
▪ Positive and negative likelihood ratios
• Critical appraisal techniques for studies of diagnosis
9. Critical appraisal of studies of prognosis
• Overview of concepts specific to prognosis including:
▪ Longitudinal study designs (including time series)
▪ Use of survival curves and hazard ratios
• Critical appraisal techniques for studies of prognosis
10. Critical appraisal of systematic reviews
• Introduction to how systematic reviews are constructed
• Overview of how to interpret meta-analysis including:
▪ Forest plots
▪ Sensitivity analysis
▪ Significance of heterogeneity
• Critical appraisal techniques for systematic reviews
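The appraisal sessions above revolve around a small set of calculations on a 2×2 table. As a rough aide-memoire only, the following sketch computes the measures of effect named in sessions 4–8; all counts are invented for illustration and do not come from the study.

```python
# Measures of effect from a 2x2 table, as covered in the appraisal sessions.
# Counts are hypothetical and for illustration only.
#                    outcome   no outcome
# exposed/treated       a          b
# control               c          d

def measures(a, b, c, d):
    """Therapy/harm measures: relative risk, absolute risk reduction,
    number needed to treat (or harm), and odds ratio."""
    eer = a / (a + b)               # event rate, exposed/treated group
    cer = c / (c + d)               # event rate, control group
    rr = eer / cer                  # relative risk
    arr = abs(cer - eer)            # absolute risk reduction (or increase)
    nnt = 1 / arr                   # number needed to treat (or harm)
    odds_ratio = (a * d) / (b * c)  # odds ratio
    return rr, arr, nnt, odds_ratio

def diagnostic_measures(tp, fp, fn, tn):
    """Diagnosis measures: sensitivity, specificity, predictive values
    and likelihood ratios, from true/false positives and negatives."""
    sens = tp / (tp + fn)           # sensitivity
    spec = tn / (tn + fp)           # specificity
    ppv = tp / (tp + fp)            # positive predictive value
    npv = tn / (tn + fn)            # negative predictive value
    lr_pos = sens / (1 - spec)      # positive likelihood ratio
    lr_neg = (1 - sens) / spec      # negative likelihood ratio
    return sens, spec, ppv, npv, lr_pos, lr_neg

# Hypothetical trial: 10/100 events on treatment vs 20/100 on control.
print(measures(10, 90, 20, 80))
# Hypothetical diagnostic test: 90 TP, 20 FP, 10 FN, 80 TN.
print(diagnostic_measures(90, 20, 10, 80))
```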
Second year graduate medical students were recruited from four teaching hospitals associated with the course (Traralgon, Warragul, Sale and Peninsula). In order to meet eligibility, participants were required to be a second year Monash graduate MBBS student at the time of the study. Students who were unwilling to participate in the study, or did not wish to provide consent, were excluded from the recruitment process.
Blended-learning (BL) EBM delivery
Students allocated to the blended-learning model received a one-day ‘block’ workshop covering all of the EBM concepts delivered in the existing tutorial-based version of the EBM course. This ‘block’ workshop used the equivalent of two tutorial sessions. Students were directed to additional EBM content, accessible through the Monash University library website, to support self-directed learning. The remaining eight tutorials designated for EBM teaching were used for students to present their patient-based EBM scenarios and generate discussion with the tutor, who in the BL approach acted as a facilitator rather than a tutor, promoting discussion within the group and peer-to-peer learning. Peer-to-peer learning was facilitated through a quasi-journal club delivery method. At the beginning of each tutorial session, the facilitator divided students into small groups, with each student given a specific health topic, or intervention/exposure, to investigate. Students were then required to identify a patient during their clinical rotation for whom the scenario was applicable. Students were required to take a detailed medical history from the patient, adopt the principles of EBM, and identify and critically appraise an article on the topic that could be applied to the patient. At the following tutorial session, students presented their patient and the related EBM content as part of a patient-based presentation.
Didactic (DID) EBM delivery
Students allocated to this group received the EBM course delivered via a didactic-learning approach, which is the existing mode of delivery for the course. In this version of the EBM course, students attend 10 two-hour tutorial sessions. An outline of the EBM course is presented in Table 1. All of the sessions begin with the tutor providing a short presentation on the relevant EBM concept for the session. This is followed by students completing small group tasks and participating in large group discussions, based on the teaching materials, with the tutors leading the discussion. Tutors in the DID group led the tutorial with structured activities and were therefore classified as ‘tutors’, rather than promoting peer learning and facilitating discussion within the group as per the ‘facilitators’ in the BL group.
The primary outcome measured in this study was competency in EBM, measured using the previously validated Berlin tool. The Berlin tool consists of 15 multiple choice questions, which assess knowledge and skills in EBM; the maximum score is 15. During the EBM course, students complete two criterion-based course assessment tasks, which assess student competency across the first four steps of the EBM continuum. Both tasks require students to (i) identify an appropriate clinical scenario, (ii) construct a clinical question based on the scenario, (iii) identify an appropriate study from the literature to answer the question, (iv) critically appraise the article, and (v) apply the findings to their clinical scenario. The first assignment is based on a ‘therapy’ scenario, whilst the second is based on a ‘harm’ scenario. Both assessment tasks are criterion-based, with a final score calculated out of 100%. Both assignments were graded by EBM tutors participating in this study using a previously developed rubric; no psychometric testing of the marking rubric was performed. Students also completed a questionnaire that assessed their self-perceived competence across the various EBM skills and their attitudes toward the course. All questions were measured on a five-point Likert scale (1 = strongly disagree, 5 = strongly agree). The questionnaire was developed specifically for this study and did not undergo psychometric testing. All outcomes were assessed at the conclusion of the EBM program during the second year of the graduate program.
Qualitative data collection
A phenomenological approach to collecting qualitative data, through the use of focus groups, was adopted to identify student perceptions of the delivery of the EBM course using the existing DID versus the BL approach. Collecting qualitative data through focus groups provides a collective perspective on the topic of interest and facilitates quick access to a larger sample than in-depth interviews. Focus groups were conducted with students across all four clinical sites at the conclusion of their respective EBM courses. All students were recruited via convenience sampling through a bulk email sent to each clinical site at the conclusion of the EBM teaching program. Students interested in participating were required to contact the clinical site administrator, who then organised a suitable time and date. Each focus group consisted of six to eight students per clinical site. All students volunteered to participate and were not paid for their contribution. The two focus groups at the Peninsula clinical site were moderated by the same facilitator (DI), an experienced facilitator who is also the coordinator of the EBM course at Monash University. A semi-structured interview guide was developed from a review of the literature before the commencement of the focus groups. The use of this guide ensured that all discussion points were consistent across the focus groups. The remaining three focus groups, at the Traralgon, Warragul and Sale sites, were facilitated by an independent researcher using the same discussion points as the Peninsula focus groups.
Quantitative data were assessed for normality before analysis. Differences in EBM competency, based on the Berlin tool and the assessment tasks, were assessed using the two-tailed, non-parametric Mann–Whitney U test, as were differences in student self-perceived competency in EBM and attitudes toward EBM. All focus groups were audio-taped with a digital recorder, downloaded onto a computer and transcribed verbatim by an administrator within the Department of Epidemiology & Preventive Medicine. All transcripts were de-identified to preserve the anonymity of participants, and were analysed independently by two investigators (DI and MM) using the principles of thematic analysis, with the assistance of the NVivo program. Themes were identified by coding features of the data, collating the codes into relevant themes, and then finalising the specifics of each theme. Both investigators independently coded and categorised emerging themes from the data before reaching a consensus on the overall themes.
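As an illustration of the group comparison described above, the following is a minimal pure-Python sketch of the Mann–Whitney U statistic. The scores are hypothetical, not the study's data; in practice the test, including its p-value, would be run in a statistics package (e.g. scipy.stats.mannwhitneyu).

```python
# Illustrative Mann-Whitney U statistic for comparing two independent groups.
# Scores below are invented for demonstration; they are NOT the study data.

def mann_whitney_u(group_a, group_b):
    """Return the Mann-Whitney U statistic (the smaller of U_a and U_b),
    using average ranks for tied values."""
    pooled = sorted(group_a + group_b)
    # Map each distinct value to its average 1-based rank in the pooled sample.
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    n_a, n_b = len(group_a), len(group_b)
    rank_sum_a = sum(ranks[x] for x in group_a)
    u_a = rank_sum_a - n_a * (n_a + 1) / 2
    u_b = n_a * n_b - u_a
    return min(u_a, u_b)

# Hypothetical Berlin tool scores (max 15) for two small groups.
bl = [9, 11, 8, 12, 10]
did = [8, 10, 9, 7, 9]
print(mann_whitney_u(bl, did))  # → 6.0
```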
Ethics approval for this study was received by the Monash University Standing Committee on Ethics in Research Involving Humans.
Assessment of student competency in EBM using the Berlin tool and a criterion-based course assessment task
BL (n = 34)
DID (n = 27)
Berlin tool (mean score (95% CI))
BL (n = 36)
DID (n = 35)
Assessment Task 1 (mean percentage (95% CI))
Assessment Task 2 (mean percentage (95% CI))
Student self-perceptions about EBM competency and attitudes about EBM
BL (n = 27) (mean score (95% CI))
DID (n = 34) (mean score (95% CI))
1. I can confidently construct an answerable question using the PICO framework
2. I can conduct an effective literature search using MEDLINE
3. I understand how biases (selection, performance, attrition, detection) may affect the validity of a study
4. I can confidently calculate and interpret different measures of effect (i.e. RR, RRR, ARR, NNT)
5. I can confidently critically appraise studies of ‘therapy’ and apply the findings to a clinical context
6. I can confidently critically appraise studies of ‘harm’ and apply the findings to a clinical context
7. I can confidently critically appraise studies of ‘diagnosis’ and apply the findings to a clinical context
8. I can confidently critically appraise studies of ‘prognosis’ and apply the findings to a clinical context
9. I can interpret a systematic review and apply the findings to a clinical context
10. This unit enabled me to achieve its learning objectives
11. I found the unit to be intellectually stimulating
12. Overall I was satisfied with the quality of this unit
13. I have used my EBM skills when studying this year
14. The workload for each EBM session was reasonable
15. I believe that I will use my EBM skills during my clinical career
16. I believe that practicing evidence based medicine is critical in being a good clinician
Using a blended-learning approach
“I like that we got to base the research on a real patient. We got to dissect the different aspects as they relate to EBM and come up with a relevant topic that we could discuss.”
“The presentations each week that the (facilitator) provided were pretty good as well. It means that you’re doing your own research, and when you do that you kind of consolidate what you’re learning.”
“The (patient) case-based learning presentations are very interactive - you’re applying it as you learn it (on the ward).”
“Something I found really frustrating was that you’d get this topic and you’d go home and look it up, and you’d start looking into it and you’d realise that it didn’t make sense, because we didn’t know anything about it.”
“I think the sessions are good, but it would also be good to use the time to go through certain concepts.”
“I didn’t mind having it all on the one go at the start: It was good, it allows us to consolidate all the content in a day, and then hopefully apply it throughout the year.”
“I think we were doing refresher tutes throughout the year as well. Because sometimes I feel I’m out of touch with certain concepts and equations.”
Role of the tutor/facilitator
“I think it would be good to have clinicians because they have been through the process of searching journal articles to look up for the latest treatments and all that. So they make the teaching more relevant to us, in a sense.”
“When you’re presenting the case and you’re trying to form a question, you really want someone that knows exactly what goes on.”
“It would be useful for them (the facilitators) to help us read up on what might be appropriate on the topic, then whoever is presenting for the week would know that they have to go and find a patient who has a thyroid issue or whatever it may be…”
Using the didactic approach
“The presentations, I thought, were taught better than they were last year. They actually made sense. Last year was all jumbled up, but I thought this year was a bit more structured, with what we were supposed to get out of it.”
“The way in which they (the tutors) were delivering the material was boring. It didn’t seem to me like here were two people who had sat down and thought ‘how can we best deliver this material?’ It was like, ‘well, here are the slides, we’ll read through them and deliver the material.’ I think if they’d used the two hours – like, if every week we’d had that two hours used more effectively, we would have been really, really strong in this subject, and I don’t think we are.”
“Look, we go back to ‘back to base’, where we get clumped up into a bigger group. I would be more than happy to study this subject in a bigger group, with one expert in EBM teaching. I’d love it. It would be better over this eight people in a broken up group with somebody who can’t teach it.”
Role of the tutor
“You have lawyers come and try and teach us law, that’s not appropriate. But if you have lawyers who know their topic very well and understand that they’re giving it to medical students, it’s still really useful. The same with this (EBM).”
Common themes across both groups
Use of a dedicated library session
“The librarian actually taught us how to use stuff we needed to know… that, sort of practical ‘how do you go about doing it?’ type of stuff.” (DID group)
“They’re (the assessment tasks) pretty comprehensive; you’ve got to cover a lot. It’s good to know the ins and outs of assessing articles and knowing whether they’re good or not.” (BL group)
“The assignments tested what we were supposed to be taught very well.” (DID group)
The use of EBM as clinicians
“I think it’s an essential part of being a clinician. It’s kind of what separates us from quacks – to be able to critically appraise evidence, and also to use those tools to further medicine as well.” (DID group)
“As we’re specialising, and trying to keep up to date with all the different things, that’s when we’ll use it the most – to see if this new information is valid or not.” (BL group)
This study generates novel findings on the impact of adopting a blended-learning approach to teaching EBM to graduate-entry medical students. Our findings demonstrated no difference in EBM competency between students who received a traditional didactic, tutorial-based implementation of an EBM course and those who received a blended-learning approach. However, students preferred the blended-learning approach to learning EBM, as it was perceived to offer a greater opportunity to integrate the theoretical concepts of EBM with the practical situations of clinical practice.
Findings from this study concur with those of a systematic review which concluded that standalone teaching may improve knowledge, but not attitudes, skills or behaviour in EBM, in postgraduate students. Similarly, this study provides further evidence that utilising a PBL approach to EBM may improve student attitudes and behaviour toward adopting the principles of EBM in clinical practice. Students exposed to the BL approach found the EBM unit more intellectually stimulating, were able to translate their EBM skills to other components of their study, and appreciated the link between theory and practice.
If an evidence-based approach to medicine is to be practiced by clinicians, then these future clinicians need to be taught how to use EBM as students during their clinical years. Providing busy clinicians with evidence, or with the tools to effectively search for, identify, evaluate and implement it, increases the extent to which evidence is sought and incorporated into medical decision making. Integrating EBM teaching alongside bedside teaching and other PBL and blended-learning approaches provides students with an opportunity to improve competence in both their EBM and clinical skills – a nexus that is essential if EBM is to be applied in the clinical setting.
EBM has been criticised as ‘cookbook’ medicine and something that can only be practiced by those in ivory towers. The principles of EBM rely on the integration of evidence, clinical expertise and patient values – all of which will differ across clinical scenarios. Studies have also demonstrated that clinicians who practice their EBM skills in their limited downtime can incorporate evidence and practice EBM in ‘real-time’. The proportion of clinicians incorporating and practicing EBM in their daily clinical workload varies considerably [34, 35]. Barriers to successful implementation for practicing clinicians may include a lack of time, resources, patient-related factors or the influence of peers. Providing medical students with knowledge and skills in EBM increases their ability to implement those skills in the clinical setting. It remains uncertain whether the above-mentioned barriers prevent the transfer of EBM skills into clinical practice.
This study demonstrates that adopting a blended-learning approach to teaching and learning EBM provides a framework that integrates with the existing steps of the EBM process. The blended-learning approach is clinically focused, with the problem-based aspect encouraging learners to rely on their existing EBM knowledge whilst implementing their EBM skills to identify, evaluate and implement evidence relevant to the clinical scenario. This approach demonstrates to medical students at an early clinical phase of their education that EBM is not ‘cookbook’ medicine, but a lifelong tool that can be applied in the clinical environment [38, 39].
The principles of EBM place the RCT as the ‘gold standard’ of study design, since many methodological issues, including selection, performance, attrition and detection biases, can be controlled. This study was not an RCT but a pragmatic trial, since it was not possible to randomise and blind individual students to the intervention. The use of a mixed methods approach, integrating quantitative and qualitative data, further contextualises and triangulates the results of this study. This study has demonstrated the effectiveness of adopting a blended-learning approach to teaching EBM. The blended-learning approach was successfully implemented in a small teaching hospital; the feasibility of implementing it in a large teaching hospital remains uncertain. Student numbers will dictate how many facilitators are required, of whom few seem to have both the clinical and EBM expertise so often desired by students.
DI is the coordinator of the EBM program, but also facilitated the focus group discussions. This dual role may have influenced the manner in which students expressed their perceptions of the BL and DID learning styles. During the recruitment and conduct of the focus groups, it was strongly reiterated that participants could openly express any views on the EBM course, which would seem to be reflected in the responses provided. EBM competency was assessed using the Berlin tool, which has been previously validated and psychometrically tested for this purpose. Neither the assessment tasks nor the self-reported perception questionnaire has been psychometrically validated.
The findings from this study suggest that a blended-learning approach to teaching EBM promotes greater student appreciation and increase in self-confidence in using the EBM principles within the clinical setting. This direct application to the clinical environment provides an opportunity to bridge the gap between theory and practice. Future research is required to investigate whether similar findings are apparent in undergraduate-based medical students and the feasibility of implementing such a program among a large student cohort.
DI is an Associate Professor in Evidence Based Clinical Practice at the School of Public Health & Preventive Medicine, Monash University.
WH is the Foundation Head of Medicine at Curtin University.
PF is the Director of Undergraduate Clinical Education and Clinical Training at Peninsula Health and Adjunct Associate Clinical Professor Monash University.
MM is the Head of the Evidence Synthesis Program at the School of Public Health & Preventive Medicine, Monash University.
EV is an Associate Professor in Public Health and the Director of Research at the Gippsland Medical School, Monash University.
The authors wish to thank all students who kindly participated in the study.
- Finkel M, Brown H, Gerber L, Supino P: Teaching evidence-based medicine to medical students. Med Teach. 2003, 25: 202-204. 10.1080/0142159031000092634.View ArticleGoogle Scholar
- Straus S, Glasziou P, Richardson W, Haynes R: Evidence-based medicine. How to practice and teach it. 2011, Toronto: Churchill Livingstone ElsevierGoogle Scholar
- Ilic D: Assessing competency in evidence based practice: strengths and limitations of current tools in practice. BMC Med Educ. 2009, 9: 53-10.1186/1472-6920-9-53.View ArticleGoogle Scholar
- Holmboe E, Hawkins R: Methods for evaluating the clinical competence of residents in internal medicine: a review. Ann Intern Med. 1998, 129: 42-48. 10.7326/0003-4819-129-1-199807010-00011.View ArticleGoogle Scholar
- Hutchinson L: ABC of learning and teaching. Educational environment. BMJ. 2003, 326: 810-812. 10.1136/bmj.326.7393.810.View ArticleGoogle Scholar
- Brown N: What makes a good educator? The relevance of meta programmes. Assessment and Evaluation in Higher Education. 2004, 29: 515-553. 10.1080/0260293042000197618.View ArticleGoogle Scholar
- Davis D, O’Brien M, Freemantle N, Wolf F, Mazmaniam P, Taylor-Vaisey A: Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes?. JAMA. 1999, 282: 867-874. 10.1001/jama.282.9.867.View ArticleGoogle Scholar
- Grimshaw J, Eccles M: Is evidence-based implementation of evidence-based care possible?. MJA. 1810, 2004: S50-S51.Google Scholar
- Koh G, Khoo H, Wong M, Koh D: The effects of problem-based learning during medical school on physician competency: a systematic review. CMAJ. 2008, 178: 34-41. 10.1503/cmaj.070565.View ArticleGoogle Scholar
- Ruiz J, Mintzer M, Leipzig R: The impact of e-learning in medical education. Acad Med. 2006, 81: 207-212. 10.1097/00001888-200603000-00002.View ArticleGoogle Scholar
- Varkey P, Karlapudi S, Rose S, Nelson R, Warner M: A systems approach for implementing practice-based learning and improvement and systems-based practice in graduate medical education. Acad Med. 2009, 84: 335-339. 10.1097/ACM.0b013e31819731fb.View ArticleGoogle Scholar
- McGaghie W, Issenberg S, Petrusa E, Scalese J: A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010, 44: 50-63. 10.1111/j.1365-2923.2009.03547.x.View ArticleGoogle Scholar
- Coomarasamy A, Khan K: What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. BMJ. 2004, 329: 1017-10.1136/bmj.329.7473.1017.View ArticleGoogle Scholar
- Bradley P, Oterholt C, Herrin J, Nordheim L, Bjorndal A: Comparison of directed and self-directed learning in evidence-based medicine: a randomised controlled trial. Med Educ. 2005, 39: 1027-1035. 10.1111/j.1365-2929.2005.02268.x.View ArticleGoogle Scholar
- Aronoff S, Evans B, Fleece D, Lyons P, Kaplan L, Rojas R: Integrating evidence based medicine into undergraduate medical education: combining online instruction with clinical clerkships. Teach Learn Med. 2010, 22: 219-223. 10.1080/10401334.2010.488460.
- Davis D, Crabb S, Rogers E, Zamora J, Khan K: Computer-based teaching is as good as face to face lecture-based teaching of evidence based medicine: a randomized controlled trial. Med Teach. 2008, 30: 302-307. 10.1080/01421590701784349.
- Davis J, Chryssafidou E, Zamora J, Davies D, Khan K, Coomarasamy A: Computer-based teaching is as good as face to face lecture-based teaching of evidence based medicine: a randomised controlled trial. BMC Med Educ. 2007, 7: 23. 10.1186/1472-6920-7-23.
- Cheng H, Guo F, Hsu T, Chuang S, Yen H, Lee F, Yang Y, Chen T, Lee W, Chuang C, et al: Two strategies to intensify evidence-based medicine education of undergraduate students: a randomised controlled trial. Annals of the Academy of Medicine, Singapore. 2012, 41: 4-11.
- Dochy F, Segers M, Van Den Bossche P, Struyven K: Students’ perceptions of a problem-based learning environment. Learn Environ Res. 2005, 8: 41-66. 10.1007/s10984-005-7948-x.
- Johnston J, Schooling M, Leung G: A randomised-controlled trial of two educational modes for undergraduate evidence-based medicine learning in Asia. BMC Med Educ. 2009, 9: 63. 10.1186/1472-6920-9-63.
- Ruiz J, Mintzer M, Issenberg S: Learning objects in medical education. Med Teach. 2006, 28: 599-605. 10.1080/01421590601039893.
- Osguthorpe R, Graham C: Blended learning environments: definitions and directions. Quarterly Review of Distance Education. 2003, 4: 227-233.
- Lehmann R, Bosse H, Simon A, Nikendei C, Huwendiek S: An innovative blended learning approach using virtual patients as preparation for skills laboratory training: perceptions of students and tutors. BMC Med Educ. 2013, 13: 23. 10.1186/1472-6920-13-23.
- Grasl M, Pokieser P, Gleiss A, Brandstaetter J, Sigmund T, Erovic B, Fischer M: A new blended learning concept for medical students in otolaryngology. Arch Otolaryngol Head Neck Surg. 2012, 138: 358-366. 10.1001/archoto.2012.145.
- Woltering V, Herrler A, Spitzer K, Spreckelsen C: Blended learning positively affects students’ satisfaction and the role of the tutor in the problem-based learning process: results of a mixed-method evaluation. Adv Health Sci Educ. 2009, 14: 725-738. 10.1007/s10459-009-9154-6.
- Johnson B, Onwuegbuzie A: Mixed methods research: a research paradigm whose time has come. Educ Res. 2004, 33: 14-26.
- Lincoln M, McAllister L: Peer learning in clinical education. Med Teach. 1993, 15: 17-25. 10.3109/01421599309029007.
- Harris J, Kearley K, Heneghan C, Meats E, Roberts N, Perera R, Kearley-Shiers K: Are journal clubs effective in supporting evidence-based decision making? A systematic review. BEME Guide No. 16. Med Teach. 2011, 33: 9-23. 10.3109/0142159X.2011.530321.
- Fritsche L, Greenhalgh T, Falck-Ytter Y, Neumayer H, Kunz R: Do short courses in evidence based medicine improve knowledge and skills? Validation of Berlin questionnaire and before and after study of courses in evidence based medicine. BMJ. 2002, 325: 1338-1341. 10.1136/bmj.325.7376.1338.
- Braun V, Clarke V: Using thematic analysis in psychology. Qual Res Psychol. 2006, 3: 77-101. 10.1191/1478088706qp063oa.
- Liamputtong P: Research methods in health. Foundations for evidence-based practice. 2010, South Melbourne: Oxford University Press
- Sackett D, Straus S: Finding and applying evidence during clinical rounds. JAMA. 1998, 280: 1336-1338.
- Sackett D, Rosenberg W, Gray M, Haynes R, Richardson W: Evidence based medicine: what it is and what it isn’t. BMJ. 1996, 312: 71. 10.1136/bmj.312.7023.71.
- McAlister F, Graham I, Karr G, Laupacis A: Evidence-Based Medicine and the practicing clinician. J Gen Intern Med. 1999, 14: 236-242. 10.1046/j.1525-1497.1999.00323.x.
- Upton D, Upton P: Knowledge and use of evidence-based practice of GPs and hospital doctors. J Eval Clin Pract. 2006, 12: 376-384. 10.1111/j.1365-2753.2006.00602.x.
- Zwolsman S, te Pas E, Hooft L, Wieringa-de Waard M, van Dijk N: Barriers to GPs’ use of evidence-based medicine: a systematic review. Br J Gen Pract. 2012, 62: e511-e521. 10.3399/bjgp12X652382.
- Sastre E, Denny J, McCoy J, McCoy A, Spickard A: Teaching evidence-based medicine: impact on students’ literature use and inpatient clinical documentation. Med Teach. 2011, 33: e306-e312. 10.3109/0142159X.2011.565827.
- Rengerink K, Thangaratinam S, Barnfield G, Suter K, Horvath A, Walczak J, Welminksa A, Weinbrenner S, Meyerrose B, Arvanitis T, et al: How can we teach EBM in clinical practice? An analysis of barriers to implementation of on-the-job EBM teaching and learning. Med Teach. 2011, 33: e125-e130. 10.3109/0142159X.2011.542520.
- Thangaratinam S, Barnfield G, Weinbrenner S, Meyerrose B, Arvanitis T, Horvath A, Zanrei G, Kunz R, Suter K, Walczak J, et al: Teaching trainers to incorporate evidence-based medicine (EBM) teaching in clinical practice: the EU-EBM project. BMC Med Educ. 2009, 9: 59. 10.1186/1472-6920-9-59.
- The pre-publication history for this paper can be accessed here: http://www.biomedcentral.com/1472-6920/13/169/prepub
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.