A hierarchy of effective teaching and learning to acquire competence in evidence-based medicine
© Khan and Coomarasamy; licensee BioMed Central Ltd. 2006
Received: 12 May 2006
Accepted: 15 December 2006
Published: 15 December 2006
A variety of methods exists for teaching and learning evidence-based medicine (EBM). However, there is much debate about the effectiveness of various EBM teaching and learning activities, resulting in a lack of consensus as to what methods constitute the best educational practice. There is a need for a clear hierarchy of educational activities to effectively impart and acquire competence in EBM skills. This paper develops such a hierarchy based on current empirical and theoretical evidence.
EBM requires that health care decisions be based on the best available valid and relevant evidence. To achieve this, teachers delivering EBM curricula need to inculcate amongst learners the skills to gain, assess, apply, integrate and communicate new knowledge in clinical decision-making. Empirical and theoretical evidence suggests that there is a hierarchy of teaching and learning activities in terms of their educational effectiveness: Level 1, interactive and clinically integrated activities; Level 2(a), interactive but classroom based activities; Level 2(b), didactic but clinically integrated activities; and Level 3, didactic, classroom or standalone teaching.
All health care professionals need to understand and implement the principles of EBM to improve care of their patients. Interactive and clinically integrated teaching and learning activities provide the basis for the best educational practice in this field.
Evidence-based medicine (EBM) is often taught and learnt through attendance at courses, conferences, workshops, journal clubs, educational meetings, surveillance of medical literature and guidelines, and investments in textbooks. Many of these activities fall under broader concepts such as Continuing Professional Development (CPD), Knowledge Translation (KT) and Continuing Medical Education (CME). The process of EBM includes formulating structured queries from specific clinical problems, searching for and acquiring relevant literature, appraising it for quality and, if appropriate, applying the findings while taking into account the patient's own preferences and values. EBM aims to incorporate more holistic perspectives, enlisting effective implementation strategies using the influence of CME. The trigger for enlisting in EBM-related educational activities is usually not the identification of an information need to resolve a particular clinical problem from a specific patient encounter, but rather a general wish to update oneself.
The different approaches to teaching and learning EBM are likely to be associated with varying levels of effectiveness in improving outcomes such as knowledge, skills, attitudes and clinicians' behaviour. How can we develop a hierarchy of effective teaching and learning for imparting and acquiring competence in EBM? We considered the evidence on interventions for changing clinician behaviour, the educational effectiveness of CME (CPD), and effective learning of EBM to produce a meaningful rank order of practical teaching and learning activities that are likely to be successful. Where empirical evidence on the effectiveness of different strategies is absent, we have relied on theoretical considerations or existing consensus to arrive at our conclusions.
We sought literature on the following structured question:
Interventions and comparisons: Various methods of teaching and learning EBM and improving professional behaviour and performance.
Outcomes: Knowledge, skills, professional attitudes and behaviours, and health outcomes.
Study designs: Randomised controlled trials (RCT), non-randomised controlled studies, before-and-after studies, theoretical or consensus articles.
We searched the Cochrane Library, Cochrane Effective Practice and Organisation of Care (EPOC) Group database , MEDLINE, EMBASE, ERIC, and Best Evidence Medical Education (BEME) database to identify relevant articles using "evidence based medicine", "continuing medical education", "continuing professional development" and their word variants as search terms. We consulted with experts directly as well as via an Internet discussion forum  and sifted through our personal files on EBM and medical education. The searches were updated in 2006.
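The combination of search terms and their word variants described above can be illustrated with a small sketch. The query syntax below is a simplification (actual syntax differs between MEDLINE, EMBASE and the other databases), and the word variants listed are only the ones named in the text; the function and variable names are our own.

```python
# Sketch of combining search concepts and their word variants into a
# boolean query string of the general kind used in bibliographic
# databases. Hypothetical helper; real database syntax varies.
concepts = {
    "EBM": ["evidence based medicine", "evidence-based medicine"],
    "CME": ["continuing medical education"],
    "CPD": ["continuing professional development"],
}

def build_query(concept_variants):
    # OR together the variants of each concept, then OR the concept
    # groups, since a record matching any one concept is relevant here.
    groups = ['(' + " OR ".join(f'"{v}"' for v in vs) + ')'
              for vs in concept_variants.values()]
    return " OR ".join(groups)

print(build_query(concepts))
```

Running this prints a single query string that can be pasted into a database search box, with each concept's variants grouped in parentheses.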
Searching for papers on empirical evidence was guided by selection criteria based on the structured question above. We reviewed existing systematic reviews of primary literature on the effectiveness of CME, e-learning and EBM teaching activities. Searching for papers concerning educational theory and principles in electronic databases was not as straightforward because of inconsistent indexing and the absence of a specific keyword for the relevant publication types. Several of these concepts have been developed in areas other than CME and EBM. Conducting a broad literature search to capture every single potentially relevant paper would have been impractical, if not inappropriate, especially since precise estimation is not the objective of such a review. Once a set of relevant concepts had been elucidated, there was no additional value in reviewing more papers explaining the same concept. This is known as theoretical saturation, a principle that guided our search and selection of papers on theories and principles. We discuss and summarise our findings and deliberations below. After considering empirical evidence, educational theory and related educational principles, we develop a hierarchy for teaching and learning of EBM.
Outcomes and impact of evidence-based medicine (EBM) teaching and learning
Reaction: Learners' views on the learning experience during educational activity (e.g. post-teaching questionnaire).
Modification of attitudes/perceptions: Change in attitudes or perceptions between participant groups as a result of educational activity (e.g. difference between pre- and post-teaching questionnaires on attitudes).
Modification of knowledge or skills: Change in knowledge or skills between participant groups as a result of educational activity (e.g. difference in performance between pre- and post-tests).
Change in behaviour: Transfer of learning to the workplace or willingness of learners to apply new knowledge or skills (e.g. difference between pre- and post-questionnaires on attitudes).
Change in organisational practice: Wider changes in the organisation/delivery of care, attributable to educational activity (e.g. updating of guidelines or care pathways).
Benefit to patients/clients: Improvement in the health and wellbeing of patients as a result of educational activity (e.g. audit of outcomes in practice).
Interactive vs didactic educational activities
EBM teaching and learning at postgraduate level takes place in the CME context. A traditional CME event can be didactic, interactive or mixed, and can be either a single or a sequenced event. Didactic sessions consist of lectures, but may include question-and-answer periods. Interactive sessions are those that involve some form of interaction amongst the participants, which may take the format of small-group work, role-play, case discussions, or the opportunity to practise skills. Mixed sessions include both didactic and interactive elements. A Cochrane review, which updated previous systematic reviews [15, 17], evaluated educational activities such as lectures, workshops and courses. There were 32 studies (35 comparisons) that evaluated educational meetings of various types against no intervention, and of these, 24 studies (26 comparisons) reported significant improvement in professional practice (in at least one major outcome measure). Eight studies reported on patient health outcomes, and three of these found a significant improvement with educational meetings compared to no intervention. Overall, the evidence suggested that educational interventions can improve both professional practice and health outcomes. However, there was substantial heterogeneity in the types of educational interventions and their effects, necessitating an analysis beyond the overall results and raising the question: what particular types of educational activities produce the benefit noted in some of the studies in the systematic reviews, and what features are associated with no or limited improvements in outcomes?
Eight studies, consisting of seven RCTs and one non-randomised study, evaluated interactive workshops compared to either no intervention (six studies) or other formats. In seven of the eight studies, interactive workshops resulted in significant improvements in practice in at least one major outcome measure. One study reported patient outcomes and found that interactive sessions resulted in a significant reduction in asthma symptoms among paediatric patients. On the other hand, seven RCTs evaluated a presentation or a lecture targeted at specific behaviours. In six of the seven studies, there were no significant differences between the trial arms. One study reported a statistically significant, but small, effect in one of four skin cancer screening behaviours. A direct comparison between interactive workshops and didactic presentations reported no differences between the two groups. Although the available evidence did not allow an examination of what makes some interactive workshops more effective than others, the evidence is clear: interactive workshops can improve educational and patient outcomes, whilst didactic teaching alone is unlikely to result in improvements. Therefore, educational activities with an interactive format should rank high in any hierarchy of effective EBM teaching.
Educational activities based on e-Learning
E-learning technologies are increasingly being used to develop interactive curricula on key aspects of EBM [18, 19]. The Internet provides an important forum for EBM teaching and learning and its role is likely to expand in the future. It provides an efficient and increasingly interactive delivery system that can handle complex and layered information. Furthermore, it is not limited by time or geography and can be integrated into practice with easy availability of information and communication technology within clinical practice areas. Moreover, the learner sets the pace and the depth of learning. Self-assessment and feedback, as well as interactivity and networking with other participants, are all possible with e-learning. What is the evidence for the effects of e-learning? We found a systematic review that identified 16 RCTs evaluating the effectiveness of Internet-based education in medical students or practising healthcare professionals. Six studies showed a positive change in participants' knowledge, and three showed a change in practice in comparison to traditional formats. There were no data on health outcomes. These results show that e-learning can be effective. However, as the evidence relates to a rapidly changing technology and there is extensive heterogeneity in teaching methods, delivery systems, assessment methods [21, 22] and other features of the existing studies, it is not possible to establish which elements contribute to an effective e-learning strategy in EBM.
Integrated vs stand alone educational activities
We carried out a systematic review of the existing literature on the effectiveness of teaching EBM to postgraduates, to evaluate whether the incorporation of teaching into clinical practice had any impact on outcomes. The review included randomised and non-randomised controlled studies as well as before-and-after comparison studies, although greater weight was given to randomised evidence in the inferences. There were 23 studies, of which three were randomised trials, seven were non-randomised controlled studies and 13 were before-and-after comparison studies. We compared classroom teaching (didactic, interactive or mixed) with clinically integrated teaching. Eighteen studies (including two randomised trials) evaluated a standalone teaching method, whilst five studies (including one randomised trial) evaluated a clinically integrated teaching method. Owing to poor reporting and substantial heterogeneity in populations, teaching methods, outcome definitions, assessment tools (most of them unvalidated) and methodological quality, we carried out a qualitative data synthesis in the form of what is often described as 'vote-counting'. Synthesis was conducted within broad subgroups of teaching methods and educational outcomes, stratified by study methodology. This approach of minimising bias in 'vote-counting' by incorporating quality has previously been used to synthesise heterogeneous results. Standalone teaching improved knowledge, but not skills, attitudes or behaviour. Clinically integrated teaching, on the other hand, improved knowledge, skills, attitudes and behaviour. None of the studies evaluated patient outcomes. EBM teaching integrated into clinical practice was, therefore, found to be superior to classroom teaching in improving educational outcomes, including positive changes in the attitudes and behaviours of clinicians.
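The 'vote-counting' synthesis described above amounts to tallying, within subgroups of teaching method and study design, how many studies reported improvement in each educational outcome. A minimal sketch, using entirely hypothetical study records (the function and field names are our own, not those of the review):

```python
# Sketch of a 'vote-counting' qualitative synthesis: count, within
# subgroups of (teaching method, study design, outcome), how many
# studies reported a positive result. Study records are hypothetical.
from collections import defaultdict

studies = [
    # (teaching_method, design, outcome, improved)
    ("standalone", "RCT", "knowledge", True),
    ("standalone", "before-after", "skills", False),
    ("integrated", "RCT", "knowledge", True),
    ("integrated", "non-randomised", "behaviour", True),
]

def vote_count(records):
    """Tally positive votes and totals within each subgroup."""
    tally = defaultdict(lambda: [0, 0])  # key -> [positive, total]
    for method, design, outcome, improved in records:
        key = (method, design, outcome)
        tally[key][1] += 1
        if improved:
            tally[key][0] += 1
    return dict(tally)

for (method, design, outcome), (pos, total) in vote_count(studies).items():
    print(f"{method}/{design}/{outcome}: {pos}/{total} positive")
```

Stratifying the tallies by study design, as the review does, keeps randomised and non-randomised evidence separate so that weaker designs do not dilute the inference.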
In a rapidly changing world, health care practitioners need to approach their profession with a view to lifelong learning. They need to identify their educational needs and develop a strategy to meet those needs in a realistic and effective manner. This is likely to be achieved best with an a priori outline of a personal learning plan, which can serve as a vehicle for guiding educational activities over a specified time period, after which progress can be assessed and a follow-up learning plan formulated. In generating a learning plan, health care practitioners should identify their own needs and set learning objectives accordingly. The personal learning plan is based on the principles of adult learning [30–33], whereby practitioners take responsibility for their own learning through a systematic programme of acquisition, renewal, upgrading and completion of the knowledge and skills outlined in their professional development objectives. The objectives should be based on what they and their organisation need to learn. If individual and organisational learning needs are congruent, the learning plan is more likely to be supported. There should be a practical, flexible and achievable strategy for learning, with a realistic hope of meeting the learning objectives within the limits of the resources available to the individual and their organisation. The choices should be driven by the elements of effective CME outlined above.
Ideally, the plan for learning should be entirely voluntary and driven by the personal motivation of the learner. Individuals should choose the subject matter they wish to learn and the manner in which they will go about learning it, and they should learn independently without the need for close supervision. The ability to fulfil learning objectives will depend on critical reflection on experiences throughout the planned learning period. The plan should therefore delineate learning outputs and outcomes, to assess whether learning objectives have been met. The experience should help learners critically analyse their performance against standard criteria, using both formative and summative assessments. In this manner a self-directed individual can be nurtured in the course of a CME cycle. Such an approach, when applied diligently, can help with personal development, revalidation and so on, but above all it has a real chance of making a difference to our patients' outcomes.
Below we list elements of CME that we consider would enhance the value for the learner, and ultimately for their patients. Some of these elements are based on empirical evidence; others are based on educational theory and practice.
Various other effective approaches exist, such as educational outreach for prescribing, reminders, opinion leaders, as well as audit and feedback [37, 38]. Moreover, not surprisingly, multifaceted interventions targeting different barriers have been shown to be more likely to be effective than single interventions [37, 38]. These, when combined with effective teaching and learning, are likely to bring about desired outcomes for EBM.
Practitioners should view CME concerning teaching and learning of EBM in a holistic manner. CPD encompasses traditional CME as well as the acquisition of other skills such as administration, management, teaching, and communication, and embodies elements such as self-directed, patient-centred and individualised learning as well as continuing appraisal. Work-based learning and KT are elements of in-house CPD that focus on knowledge and skill application in the workplace. KT is the process whereby information is transferred to clinicians and applied in practice, a process that requires understanding of complex interactions between various individuals (medical and non-medical) and organisations. It focuses on health outcomes and changing behaviour. EBM, which can change physicians' practice and thus holds the potential to change health outcomes, can be a critical tool in KT. Whilst CME or CPD is primarily located in teaching settings, KT activities are centred in practice settings and place a greater focus on group or team learning. KT is an element of CPD that is receiving wider attention in the field of CME.
A hierarchy of effective EBM teaching and learning
A hierarchy of evidence-based medicine (EBM) teaching and learning
Level 1: Interactive and clinically integrated teaching and learning activities
Level 2 (a): Interactive, classroom-based teaching and learning activities
Level 2 (b): Didactic, but clinically integrated teaching and learning activities
Level 3: Didactic, classroom-based or standalone teaching and learning activities
Level 1: Interactive and clinically integrated teaching and learning activities
Substantial empirical evidence exists to support interactive over didactic teaching. Clinically integrated EBM teaching is more effective than classroom teaching because it interrelates and unifies clinical subjects with clinical epidemiology, creating a meaningful whole. Educational theory and the evidence on changing physicians' behaviour are also consistent with clinically integrated teaching being superior to classroom-based teaching and learning. Interactive and clinically integrated learning is therefore the ideal that EBM practitioners should aim for, as it probably represents the most effective way of learning. It is reflective of practice, it allows identification of gaps between current and desired levels of competence, it identifies solutions that are practically testable, and it allows re-evaluation with the opportunity for further reflection and a continuum of learning. Interactivity encourages deeper learning, which is important for understanding, manipulating and transferring learnt material into practice.
Level 2: (a) Interactive, classroom based teaching, or (b) didactic, but clinically integrated teaching
Many modern teaching activities fall into the former category: although teaching is located in a classroom, efforts are made to make the sessions interactive with small-group work, role-plays and case discussions. Activity is the key to effective training, and this is the defining feature of interactive learning. On the other hand, teaching can be didactic but clinically integrated; an example would be a one-way discourse by a clinical teacher to students on a ward round (classical bedside teaching). Basing the discourse on a patient problem is likely to help demonstrate the relevance and application of EBM knowledge, and such teaching can easily be turned into an interactive format with the likelihood of greater educational benefit. With live video linking and the ability to interact with large groups at remote sites, it is now possible to convert classical teaching methods for wider application. E-health is an emerging field, and teaching and learning medicine via videoteleconferencing will no doubt develop in the future [40, 41].
Level 3: Didactic, and classroom or standalone teaching
Many traditional teaching activities fall into this category, and they are unlikely to be effective in improving clinicians' performance or health outcomes for patients. Whilst they may have their own benefits (such as allowing networking between those with an interest in a particular topic), unless they at least contain elements of interactivity (for example, small-group work or case discussions), their worth is likely to be limited. This is because a lack of interactivity encourages superficial (rote and regurgitation) learning.
Limitations and barriers
The concepts behind the above hierarchy are gaining in popularity. However, many caveats need consideration. We believe that we have taken into account both empirical evidence and theory judiciously in generating this ranking of teaching and learning activities, although there may be concern that empirical evidence has been weighted more heavily than theoretical considerations. On balance, we think that the weight attached to theoretical evidence should be considered very carefully, as theory often does not materialise as expected when put into practice. We are therefore confident that our emphasis on empirical evidence, particularly evidence from systematic syntheses of the literature, is justified.
Decision-making about the choice of teaching and learning methods should be guided by a comparison of the effects and costs associated with the various educational strategies. Our aim was indeed to generate numerical measures of effect size; however, owing to poor reporting and substantial heterogeneity (in populations, teaching methods, outcome definitions, assessment tools and study quality, amongst other features), we were unable to provide any statistical summaries. Two very important issues on which there were no or scant data are the long-term effects of the teaching methods and the effect on clinical outcomes. Pragmatic decisions have to be made based on the information available, and in the absence of meta-analytic summaries, qualitative syntheses within broad subgroups of teaching methods and educational outcomes, stratified by study methodology, minimise biased inferences. When dominance can be demonstrated so obviously, a basic economic evaluation does not require statistically sophisticated analyses. For example, in a recent study, the average cost of providing a critical appraisal workshop was approximately £250 per person, yet no improvement was demonstrated in knowledge or attitudes. These findings challenge the policy of funding 'one-off' educational interventions aimed at enhancing EBM. Cost-effectiveness analyses are often simple and straightforward when the issue of dominance is not finely balanced. Is it reasonable to propose that standalone teaching should be abandoned, when an effort without desirable benefits incurs substantial costs? We think so.
There are many barriers to the feasibility of an interactive and integrated approach, and to its acceptability to both teachers and learners. Those who endeavour to embark on the higher levels of EBM teaching methods in our hierarchy will need to study these carefully as part of an implementation plan [44–47]. These methods may be considered similar to an innovation in many settings, requiring various phases for embedding them in practice. For example, within problem-based learning there has been discussion about the utility of introducing interactive approaches to learners who have yet to acquire a baseline level of knowledge. Interactive approaches are also seen as challenging across different cultures that have a tradition of didactic teaching. In these situations, methods lower down in our hierarchy may provide the prerequisite knowledge or encouragement to learners before they engage in interactive and clinically integrated teaching. It is important to remember that education concerning EBM can take one through the key initial stages of change before one is prepared to adopt EBM in practice. EBM is a strategy for just-in-time learning that is increasingly possible in a clinical environment. Teachers and learners should carefully examine their learning environment and circumstances when developing an implementation strategy for their chosen EBM-related educational activities.
As the stated aim of EBM is to benefit patient care, it becomes necessary that teachers and learners of EBM consciously find ways of integrating and incorporating teaching and learning into routine clinical practice. Where resources and facilities are available, such learning can form part of a real-time ward round or clinic, with the dual purposes of learning EBM skills and attempting to improve patient care with the best available evidence [44, 52]. If the provisions for real-time teaching are not available, then even traditional learning settings, such as a journal club [39, 46, 52–54], can be adapted to address real and current clinical problems, thus illustrating that the process is not a mere academic exercise but one that informs patient care. These teaching and learning methods have the potential to demonstrate how those receiving care can make decisions (wherever possible), informed by the knowledge of their care providers, within the context of available resources. Learning of EBM should therefore be moved from classrooms to clinical settings. Indeed, this approach should be generalisable to other clinical topics, not just EBM, and integration of teaching and learning into practice should be considered for all topics in health care.
Contributions by Mary Publicover, Derek Yates, Neelima Deshpande, Julie Hadley and James Davis in searching the literature and teaching EBM in our clinical setting are greatly valued. The input from peer reviewers is appreciated. EBM teaching and learning in our organisation has been funded/supported by many organisations including West Midlands Deanery, Evidence-supported Medicine Union, European Social Fund/Learning and Skills Council, and European Union Leonardo da Vinci programme. The funding bodies have played no role in study design; in the collection, analysis, and interpretation of data; in the writing of the manuscript; and in the decision to submit the manuscript for publication.
- Green ML: Evidence-based medicine training in graduate medical education: past, present and future. 2000Google Scholar
- al MP: Continuing Professional Development: Report of a Working Party (RCOG,UK). 2000, London, RCOG PressGoogle Scholar
- Davis D, Evans M, Jadad A, Perrier L, Rath D, Ryan D, Sibbald G, Straus S, Rappolt S, Wowk M, Zwarenstein M: The case for knowledge translation: shortening the journey from evidence to effect. BMJ 327(7405):33-5,. 2003Google Scholar
- Sackett DL, Rosenberg WM, Gray JA, Haynes RB, Richardson WS: Evidence based medicine: what it is and what it isn't. BMJ 312(7023):71-2,. 1996Google Scholar
- Norman GR: Examining the assumptions of evidence-based medicine. Journal of Evaluation in Clinical Practice. 1999, 5: 139-147. 10.1046/j.1365-2753.1999.00197.x.View ArticleGoogle Scholar
- Cochrane Effective Practice and Organisation of Care (EPOC) Group database. [www epoc uottawa ca]. 2006Google Scholar
- Evidence-based health discussion forum. [www jiscmail ac uk/lists/EVIDENCE-BASED-HEALTH html]. 2006Google Scholar
- Lilford RJ, Richardson A, Stevens A, Fitzpatrick R, Edwards S, Rock F, Hutton JL: Issues in methodological research: perspectives from researchers and commissioners. Health Technol Assess. 2001, 5: 1-57.View ArticleGoogle Scholar
- Belfield C, Thomas H, Bullock A, Eynon R, Wall D: Measuring effectiveness for best evidence medical education: a discussion. Medical Teacher. 2001, 23: 164-170. 10.1080/0142150020031084.View ArticleGoogle Scholar
- Miller GE: The Assessment of Clinical Skills Competence Performance. Academic Medicine. 1990, 65: S63-S67. 10.1097/00001888-199009000-00045.View ArticleGoogle Scholar
- Shaneyfelt T, Baum KD, Bell D, Feldstein D, Houston TK, Kaatz S, Whelan C, Green M: Instruments for evaluating education in evidence-based practice - A systematic review. Journal of the American Medical Association. 2006, 296: 1116-1127. 10.1001/jama.296.9.1116.View ArticleGoogle Scholar
- Khan KS, ter Riet G, Glanville J, Sowden AJ, Kleijnen J: Undertaking Systematic Reviews of Research on Effectiveness. CRD's Guidance for Carrying Out or Commissioning Reviews. CRD Report Number 4 (2nd edition). 2001, York, NHS Centre for Reviews and Dissemination, University of York, [http://www.york.ac.uk/inst/crd/report4.htm]2ndGoogle Scholar
- Khan KS, Kunz R, Kleijnen J, Antes G: Five steps to conducting a systematic review. J R Soc Med. 2003, 96: 118-121. 10.1258/jrsm.96.3.118.View ArticleGoogle Scholar
- Khan KS, Kunz R, Kleijnen J, Antes G: Systematic reviews to support evidence-based medicine: How to review and apply findings of systematic reviews. 2003, London, Royal Society of Medicine, [http://www.rsmpress.co.uk/bkkhan.htm]Google Scholar
- Davis D, O'Brien MA, Freemantle N, Wolf FM, Mazmanian P, Taylor-Vaisey A: Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes?. JAMA 282(9):867-74,. 1999Google Scholar
- Thomson O'Brien MA, Freemantle N, Oxman AD, Wolf F, Davis DA, Herrin J: Continuing education meetings and workshops: effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews (2):CD003030,. 2001Google Scholar
- Davis DA, Thomson MA, Oxman AD, Haynes RB: Changing physician performance. A systematic review of the effect of continuing medical education strategies. JAMA 274(9):700-5,. 1995Google Scholar
- Schilling K, Wiecha J, Polineni D, Khalil S: An interactive web-based curriculum on evidence-based medicine: Design and effectiveness. Family Medicine. 2006, 38: 126-132.Google Scholar
- Davis J, Chryssafidou E, Coomarasamy A, Davies D, Zamora J, Khan KS: Computer-based teaching is as good as face to face lecture-based teaching of evidence based medicine: A randomised controlled trial. BMC Medical Education. 2006Google Scholar
- Wutoh R, Boren SA, Balas EA: eLearning: a review of Internet-based continuing medical education. Journal of Continuing Education in the Health Professions 24(1):20-30,. 2004Google Scholar
- Davis MH, Harden RM: E is for everything - e-learning?. Medical Teacher. 2001, 23: 441-444. 10.1080/01421590120063349.View ArticleGoogle Scholar
- Harden RM: E-learning and all that jazz. Medical Teacher. 2002, 24: 225-226. 10.1080/01421590220120696.View ArticleGoogle Scholar
- Coomarasamy A, Khan KS: What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. BMJ 329(7473):1017,. 2004Google Scholar
- Terriet G, Kleijnen J, Knipschild P: Acupuncture and Chronic Pain - A Criteria-Based Metaanalysis. Journal of Clinical Epidemiology. 1990, 43: 1191-1199. 10.1016/0895-4356(90)90020-P.View ArticleGoogle Scholar
- Harden RM, Sowden S, Dunn WR: Some Educational-Strategies in Curriculum-Development - the Spices Model. Medical Education. 1984, 18: 283-297.Google Scholar
- Organisation WH: Innovative schools for health personnel. Offset Publication No 102. 1987Google Scholar
- Brown G, Atkins M: Studies of student learning. Effective Teaching in Higher Education. Edited by: Brown G, Atkins M. 1988, London, Routledge, 150-158.
- Gibbs G: The nature of quality of learning. Improving the Quality of Student Learning. Edited by: Gibbs G. 1992, Technical and Educational Services Ltd, 1-11.
- Harden RM, Laidlaw JM: Effective continuing education: the CRISIS criteria. Medical Education. 1992, 26: 408-422.
- Carroll RG: Implications of adult education theories for medical school faculty development programmes. Medical Teacher. 1993, 15: 163-170.
- Green ML, Ellis PJ: Impact of an evidence-based medicine curriculum based on adult learning theory. Journal of General Internal Medicine. 1997, 742-750. 10.1046/j.1525-1497.1997.07159.x.
- Knowles MS: The Adult Learner: A Neglected Species. 1978, Houston, Gulf, 2nd ed.
- Knowles MS: Andragogy in Action: Applying Modern Principles of Adult Learning. 1984, San Francisco, Jossey-Bass.
- Mennin SP, Kaufman A: The change process and medical education. Medical Teacher. 1989, 11: 9-16.
- Brigley S, Young Y, Littlejohns P, McEwen J: Continuing education for medical professionals: a reflective model. Postgraduate Medical Journal. 1997, 73: 23-26.
- Brigley S, Littlejohns P, Young Y, McEwen J: Continuing medical education: the question of evaluation. Medical Education. 1997, 31: 67-71.
- Grimshaw JM, Shirran L, Thomas R, Mowatt G, Fraser C, Bero L, Grilli R, Harvey E, Oxman A, O'Brien MA: Changing provider behavior: an overview of systematic reviews of interventions. Medical Care. 2001, 39 (8 Suppl 2): II2-45.
- Oxman AD, Thomson MA, Davis DA, Haynes RB: No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ. 1995, 153: 1423-1431.
- Khan KS, Bachmann LM, Steurer J: The medical journal club - a tool for knowledge refinement and transfer in healthcare. Knowledge Media in Healthcare: Opportunities and Challenges. Edited by: Grütter R. 2001, Hershey, Idea Group Publishing, 176-186.
- Harden RM: A new vision for distance learning and continuing medical education. Journal of Continuing Education in the Health Professions. 2005, 25: 43-51. 10.1002/chp.8.
- Harden RM, Hart IR: An international virtual medical school (IVIMEDS): the future for medical education?. Medical Teacher. 2002, 24: 261-267. 10.1080/01421590220141008.
- Taylor R, Reeves B, Ewings P, Taylor R: Critical appraisal skills training for health care professionals: a randomized controlled trial [ISRCTN46272378]. BMC Medical Education. 2004, 4: 30. 10.1186/1472-6920-4-30.
- Nixon J, Khan KS, Kleijnen J: Summarising economic evaluations in systematic reviews: a new approach. British Medical Journal. 2001, 322: 1596-1598. 10.1136/bmj.322.7302.1596.
- Deshpande N, Publicover M, Gee H, Khan KS: Incorporating the views of obstetric clinicians in implementing evidence-supported labour and delivery suite ward rounds: a case study. Health Info Libr J. 2003, 20: 86-94. 10.1046/j.1471-1842.2003.00422.x.
- Watkins C, Timm A, Gooberman-Hill R, Harvey I, Haines A, Donovan J: Factors affecting feasibility and acceptability of a practice-based educational intervention to support evidence-based prescribing: a qualitative study. Fam Pract. 2004, 21: 661-669. 10.1093/fampra/cmh614.
- Khan KS, Gee H: A new approach to teaching and learning in journal club. Medical Teacher. 1999, 21: 289-293. 10.1080/01421599979554.
- McCluskey A, Lovarini M: Providing education on evidence-based practice improved knowledge but did not change behaviour: a before and after study. BMC Medical Education. 2005, 5: 40. 10.1186/1472-6920-5-40.
- Rogers EM: Diffusion of Innovations. 2003, New York, Free Press, 5th ed.
- Slawson DC, Shaughnessy AF: Teaching evidence-based medicine: should we be teaching information management instead?. Acad Med. 2005, 80: 685-689. 10.1097/00001888-200507000-00014.
- Moulding NT, Silagy CA, Weller DP: A framework for effective management of change in clinical practice: dissemination and implementation of clinical practice guidelines. Qual Saf Health Care. 1999, 8: 177-183.
- Dawes M, Summerskill W, Glasziou P, Cartabellotta A, Martin J, Hopayian K, Porzsolt F, Burls A, Osborne J: Sicily statement on evidence-based practice. BMC Medical Education. 2005, 5: 1. 10.1186/1472-6920-5-1.
- Dwarakanath LS, Khan KS: Modernizing the journal club. Hospital Medicine. 2000, 61: 425-427.
- Afifi Y, Davis J, Khan KS, Publicover M, Gee H: The journal club: a modern model for better service and training. The Obstetrician and Gynaecologist. 2006, 8: 186-189. 10.1576/toag.188.8.131.52256.
- Khan KS, Dwarakanath LS, Pakkal M, Brace V, Awonuga A: Postgraduate journal club as a means of promoting evidence-based obstetrics and gynaecology. J Obstet Gynaecol. 1999, 19: 231-234. 10.1080/01443619964968.
- The pre-publication history for this paper can be accessed here: http://www.biomedcentral.com/1472-6920/6/59/prepub
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.