This article has Open Peer Review reports available.
Introducing evidence based medicine to the journal club, using a structured pre and post test: a cohort study
© Cramer and Mahoney; licensee BioMed Central Ltd. 2001
Received: 28 August 2001
Accepted: 6 November 2001
Published: 6 November 2001
Journal Club at a University-based residency program was restructured to introduce, reinforce and evaluate residents' understanding of the concepts of Evidence Based Medicine.
Over the course of a year, structured pre- and post-tests were developed for use during each Journal Club. Questions were derived from the articles being reviewed, and performance on the key concepts of Evidence Based Medicine was assessed. Study subjects were 35 PGY2 and PGY3 residents in a University-based Family Practice Program.
Performance on the pre-test demonstrated a significant improvement over the course of the year, from a median of 54.5% to 78.9% (F = 89.17, p < 0.001). The post-test results also exhibited a significant increase, from 63.6% to 81.6% (F = 85.84, p < 0.001).
Following organizational revision, the introduction of a pre-test/post-test instrument supported achievement of the learning objectives with a better understanding and utilization of the concepts of Evidence Based Medicine.
The Residency Review Committee guidelines for Family Medicine require teaching the skills necessary for the critical appraisal of the medical literature. Most residency training programs accomplish this in the setting of a monthly journal club. At our program, faculty involved with organizing these sessions noted an ongoing frustration with the attendance, preparation, participation and focus of the residents.
Within the framework of a needs assessment, residents were surveyed to identify issues directly related to the utility of the Journal Club. This opportunity to reorganize also allowed reexamination of the goals and objectives for the Journal Club in line with those suggested by others. A formal curriculum review with new goals and objectives was put in place, acknowledged by renaming the sessions "Evidence Based Medicine/Journal Club". Paramount was a focus on critical appraisal and clinical epidemiology from the Evidence-based perspective.
A syllabus of material on the approach to Evidence-based Medicine was chosen to provide the skills necessary to allow the residents to become comfortable with identifying valid articles that could potentially impact their practice of medicine. For this component of the objectives we combined the search for POEMs (Patient Oriented Evidence that Matters) from Slawson and Shaughnessy [3, 4] with Sackett's emphasis on clinical epidemiology and the rigorous assessment of validity. The importance of this knowledge base was reinforced by using the first didactic session of the academic year to provide a formal review of the materials. At this session, incoming residents began with a formal test of their skills. They were then presented with the syllabus of selected articles from the JAMA series on the Users' Guides to the Medical Literature, Slawson and Shaughnessy's work on becoming information masters [3, 4], web site references and access to relevant texts.
The balance of this article describes our approach to evaluating the impact of these changes.
Participants were family medicine residents in a University-based residency program. PGY1 data were collected but not included in this report, as these residents were not present at the initiation of the revised Journal Club in Module 10 of our 13-Module academic year.
In order to measure progress in meeting the objectives of the revised Journal Club, a written pre-test and post-test was chosen as the primary method of evaluating each Journal Club session. Each test consisted of 10 to 12 questions focused on the principles of Evidence Based Medicine and clinical epidemiology, as well as the key content of the articles. The items were prepared by the Director of Journal Club (JSC) and appeared on both the Pre- and Post-tests. Keeping in mind the subjective nature of question writing, it was felt that well-constructed questions could still demonstrate both reliability and validity. Reliability has been equated with the repeatability or stability of responses on repeated administration. Face validity was demonstrated through clear, unambiguous questions. Content validity was achieved by including questions incorporating the key concepts of Evidence Based Medicine and clinical epidemiology. This can be seen in a representative quiz presented initially to faculty and annually to incoming residents to assess their EBM skills (Appendix 1).
Analysis of results was performed using SPSS version 6.1 for Windows on a microcomputer. Where appropriate, ANOVA and two-tailed t-tests for paired samples were used.
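The original analysis was run in SPSS; for illustration, the paired-samples t-test used to compare each resident's pre- and post-test scores can be reproduced in a few lines of standard Python. The scores below are hypothetical values chosen for the example, not the study's data.

```python
import math
import statistics

# Hypothetical percent-correct scores for eight residents on one
# Journal Club session; illustrative values only, NOT the study data.
pre  = [54, 60, 58, 62, 55, 65, 59, 61]
post = [70, 75, 72, 80, 68, 78, 74, 77]

# A paired t-test works on the within-subject differences.
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
mean_diff = statistics.mean(diffs)             # average improvement
sd_diff = statistics.stdev(diffs)              # sample SD of the differences
t_stat = mean_diff / (sd_diff / math.sqrt(n))  # t statistic with n - 1 df

print(f"mean improvement = {mean_diff:.1f} points, t({n - 1}) = {t_stat:.2f}")
```

The resulting t statistic would be compared against the t distribution with n - 1 degrees of freedom to obtain the two-tailed p value reported in the tables.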
Pre-test and Post-test Results for the Median and Mean

Median: Pre-test over the year, p < 0.001; Post-test over the year, p < 0.001; Pre to Post, p ≈ 0.046
Mean:   Pre-test over the year, p < 0.001; Post-test over the year, p < 0.001; Pre to Post, p ≈ 0.046
The discussions during the monthly Journal Club began with an obvious focus derived from the content of the Pre-test. It was expected that the Pre-test would help to identify participants who had not read or understood the impact of the articles and help to focus the discussion. The Pre-test also allowed the department to begin to track, in a sequential fashion, objective measures of performance for individual residents. The Post-test reflected the educational effect of the group discussion.
Identifying residents with problems on the Pre-test, or a lack of improvement on the Post-test, was straightforward and allowed timely information to be forwarded to their academic advisors for remediation. As a bonus, attendance could be tracked via the turned-in quizzes, and this information was also forwarded to the resident's academic advisor.
When critical appraisal skills among resident physicians have been formally assessed, correct responses have varied between 33% and 42%, suggesting limited integration of these concepts. This was confirmed at our institution, with an overall average of 32% for incoming PGY1 residents over the preceding three years. Even after specific targeted interventions, correct responses have increased only to 67%. The attainment of a Pre-test median score of 79.0% at the end of the first year of our intervention was certainly comparable and significantly exceeded those results.
Studies have also indicated that there is no correlation between self-assessed competence and actual ability in these skills. The Pre-test provides direct feedback regarding comprehension and competence, giving us a quantitative assessment of each resident's facility with these critical concepts as it evolves over time.
There also appears to be little evidence that didactic Continuing Medical Education has any effect on performance [11, 12]. According to these authors, successful educational programs emphasize a pre-course assessment of needs, opportunities to practice relevant skills, and after-course activities that reinforce and facilitate change. Our Pre-test assesses needs, the discussion offers opportunities to practice skills, and our Post-test reinforces and facilitates change.
The introduction of a pre-test and post-test helped to provide a uniformity in focus for the Journal Club. More importantly this effort yielded improvements in objective measures of preparation and comprehension of the key concepts of Evidence-based Medicine. Despite the additional faculty workload, the ability to meet structured learning goals made the continuing effort worthwhile.
Appendix 1: Evidence Based Medicine / Journal Club Example Quiz. Word document.
- Valentini RP, Daniels SR: The Journal Club. Postgrad Med J. 1997, 73: 81-85.
- Haynes RB: Where's the Meat in Clinical Journals? ACP Journal Club. 1993, 119: A22-A23.
- Slawson DC, Shaughnessy AF, Bennett JH: Becoming a Medical Information Master: Feeling Good About Not Knowing Everything. JFP. 1994, 38: 505-513.
- Shaughnessy AF, Slawson DC, Bennett JH: Becoming an Information Master: A Guidebook to the Medical Information Jungle. JFP. 1994, 39: 489-499.
- Sackett DL: How to Read Clinical Journals: V: To Distinguish Useful from Useless or Even Harmful Therapy. CMAJ. 1981, 124: 1156-1162.
- Evidence Based Medicine Working Group: Users' Guides to the Medical Literature. JAMA. 1993, 270: 2093-2095 (I), 2598-2601 (II a); 1994, 271: 55-63 (II b), 703-707 (III), 1615-1619 (IV); 1994, 272: 234-237 (V), 1367-1371 (VI); 1995, 273: 1292-1295 (VII a), 1630-1632 (VII b); 1995, 274: 570-574 (VIII a), 1630-1632 (VIII b), 1800-1804 (IX); 1996, 275: 554-558 (X), 1435-1439 (XI); 1997, 277: 1232-1237 (XII), 1552-1557 (XIII a), 1802-1806 (XIII b).
- SPSS Version 6.1 for Windows. SPSS Inc., Chicago. 1993.
- Linzer M, Brown JT, Frazier CM, et al: Impact of a Medical Journal Club on House-Staff Reading Habits, Knowledge and Critical Appraisal Skills: A Randomized Control Trial. JAMA. 1988, 260: 2537-2541. doi:10.1001/jama.260.17.2537.
- Stern DT, Linzer M, O'Sullivan PS, Weld L: Evaluating Medical Residents' Literature-appraisal Skills. Acad Med. 1995, 70: 152-154.
- Emerson JS: Use of Statistical Analysis in the New England Journal of Medicine. NEJM. 1983, 309: 709-713.
- Davis DA: The Science and Practice of Continuing Medical Education: A Study in Dissonance. ACP Journal Club. 1993, 118: A-18.
- Linzer M, DeLong ER, Hupart KH: A Comparison of Two Formats for Teaching Critical Reading Skills in a Medical Journal Club. J Med Educ. 1987, 62: 690-692.
The pre-publication history for this paper can be accessed here: http://www.biomedcentral.com/1472-6920/1/6/prepub
This article is published under license to BioMed Central Ltd. This is an Open Access article: verbatim copying and redistribution of this article are permitted in all media for any purpose, provided this notice is preserved along with the article's original URL.