- Research article
- Open Access
- Open Peer Review
Evidence-based choices of physicians: a comparative analysis of physicians participating in Internet CME and non-participants
© Casebeer et al; licensee BioMed Central Ltd. 2010
- Received: 23 October 2009
- Accepted: 10 June 2010
- Published: 10 June 2010
The amount of medical education offered through the Internet continues to increase, providing unprecedented access for physicians nationwide. However, the process of evaluating these activities is ongoing. This study is a continuation of an earlier report that found online continuing medical education (CME) to be highly effective in increasing the likelihood that physicians make evidence-based clinical decisions.
To determine the effectiveness of 114 Internet CME activities, case vignette-based surveys were administered to U.S.-practicing physicians immediately following participation, and to a representative control group of non-participants. Survey responses were analyzed based on evidence presented in the content of CME activities. An effect size for each activity was calculated using Cohen's d to determine the amount of difference between the two groups in the likelihood of making evidence-based clinical decisions.
In a total sample of 17,142 U.S. physicians, with participants drawn from the more than 350,000 physicians who completed the 114 activities, the average effect size was 0.82. This corresponds to a 48% non-overlap between the two groups, indicating that physicians participating in the online activities were substantially more likely to make clinical choices based on evidence.
Physicians who participated in online CME activities continue to be more likely to make evidence-based clinical choices than non-participants in response to clinical case vignettes.
- Continuing Medical Education
- Case Vignette
- Average Effect Size
- Assessment Question
- Continuing Medical Education Activity
Continuing medical education (CME) activities provide opportunities for medical practitioners to keep up with new information affecting the delivery of medical care, and ongoing participation is required by most physician state licensing boards [1, 2]. Participation in CME or other medical education activities is also required by the licensing boards for other types of healthcare providers, such as physician assistants and nurse practitioners. CME providers sponsor a variety of activities, such as courses, regularly scheduled series, or enduring materials, defined as instructional materials that can be accessed at a time chosen by the participant.
The number of hours of Internet-based enduring materials provided by Accreditation Council for CME (ACCME)-accredited providers increased dramatically in recent years, from 16,802 hours in 2002 to 57,944 hours in 2008 [3, 4]. This more than three-fold increase was accompanied by an even larger increase in the number of participants choosing Internet-based enduring materials: the number of physician participants grew from 305,410 in 2002 to 4,365,014 in 2008, a more than ten-fold increase.
Given the increasing number of CME activities offered on the Internet, and the even larger growth of participation in these activities, assessing the effectiveness of Internet-based CME is crucial. Reviews of studies comparing results from online and traditional CME materials conclude that Internet-based CME is as effective as traditional CME delivery formats [5, 6]. A recent review analyzed pooled data from published comparisons of participation in Internet-based CME activities vs. traditional CME activities, and of participation in Internet-based CME activities vs. no CME participation. The review concluded that Internet-based CME improved participant knowledge, skills, and practice decisions, with results that were comparable to those obtained after participation in traditional CME activities.
The goal of this study was to assess the evidence-based decisions in response to clinical case vignettes by physicians participating in CME activities of varied formats and to compare those decisions with those of a similar group of physicians who did not participate in the CME activities. The CME activity formats included case-based, multimedia, and interactive text. We hypothesized that physicians who participated in each type of Internet CME activity would more frequently make evidence-based clinical choices in response to clinical case vignettes when compared to physicians who did not participate.
To assess the evidence-based choices of physicians who participate in Internet CME activities, a group of 114 stand-alone, CME-certified Internet activities posted during 2006-2008 was studied, including activities in case-based, multimedia, and interactive text formats. Activities were eligible for assessment if they met the following criteria: 1) designed for physicians, 2) posted between January 1, 2006 and December 31, 2008, 3) certified for CME credit (1 credit for the majority of activities), and 4) presented in an on-demand archived format. Of the 114 Internet CME activities assessed during the study period, 40 were interactive case-based activities, 64 were interactive text-based activities, and 10 were multimedia activities. Interactive case-based activities embed questions, usually involving patient cases, to which participants respond and receive immediate feedback. Interactive text-based activities mainly comprise conference coverage, special reports, and basic clinical updates. Multimedia activities are mainly live or roundtable presentations with video lectures.
The controlled trial compared the evidence-based clinical choices of 8,550 participant physicians with those of a demographically matched control group of 8,592 non-participant physicians. Following participation, physicians were asked to respond to a series of clinical case questions related to applying the CME content to practice. Physicians who participated in these activities were eligible for inclusion if they practiced in the U.S., represented the target audience for the activity, and completed assessment questions following participation. A random sample of participants meeting the eligibility criteria was drawn from the overall participant group for each activity. A demographically similar group of non-participant physicians, selected at random from the American Medical Association (AMA) Master File, was recruited to respond to the same clinical case questions. An average total sample size of between 100 and 200, with a minimum of 50 participants and 50 non-participants, was used for individual activities; a sample size of 50 per group is the minimum required for sufficient statistical power (p < 0.05). Participant and non-participant samples were matched on physician specialty, degree, years in practice, whether direct patient care was their primary responsibility, and the average number of patients seen per week with the disease of interest. Because Medscape members are likely to have participated in more than one activity, individuals may be counted multiple times in the total participant number of 8,550; however, for each activity, only one score per individual was recorded.
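The per-activity sampling rule described above can be sketched in Python. This is a minimal illustration, not the study's actual code: the function name, the target of 150 respondents (the midpoint of the 100-200 average total sample), and the input format are all assumptions for demonstration.

```python
import random

MIN_N = 50       # minimum per-group sample for sufficient statistical power (per the study)
TARGET_N = 150   # illustrative midpoint of the 100-200 average total sample described

def draw_sample(eligible_ids, target=TARGET_N, minimum=MIN_N):
    """Draw a random per-activity sample, keeping one record per individual."""
    unique = list(set(eligible_ids))  # one score per individual per activity
    if len(unique) < minimum:
        raise ValueError("too few eligible respondents for sufficient power")
    k = min(target, len(unique))
    return random.sample(unique, k)
```

When fewer eligible respondents than the target are available (but at least the minimum), the sketch simply takes everyone, mirroring the study's use of a minimum rather than a fixed sample size.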
A consistent assessment approach was developed and used across all 114 CME activities included in the study. This evaluation approach included: 1) using brief case descriptions to assess clinical practice choices, 2) presenting clinical choices in a multiple-choice format, 3) using a standard hypertext markup language (HTML) approach to presenting assessment questions, 4) applying this assessment approach to the specific content of each individual activity, and 5) collecting assessment data from CME participants in each individual clinical assessment. The case vignette studies were reviewed by the Western Institutional Review Board (WIRB; Olympia, WA) in 2004, prior to this study.
A standard assessment template consisting of two clinical vignettes and five to eight multiple-choice clinical questions was developed; evidence-based responses to the case vignettes were identified from the content and references written by the faculty member for each activity. Only content referenced to peer-reviewed publications or guidelines was considered eligible for the development of clinical case questions. Case vignettes and assessment questions were developed by clinical experts who were not involved in the design or content of the CME activity. Content validity of the case vignettes was established by review from physician medical editors of the online portal; editors represented the appropriate clinical area for each set of case vignettes. An example case and assessment questions are shown in the Appendix.
A statistical analysis software package (Statistical Package for the Social Sciences 17.0; SPSS; Chicago, IL) was used for data extraction, transformation, and statistical analyses. Participant and non-participant case vignette responses were scored according to their concordance with the evidence-informed content presented within each activity. Overall mean scores and pooled standard deviations were calculated for the participant and non-participant groups for each activity. These were used to calculate the educational effect size using Cohen's d (i.e., the difference in group means divided by the pooled standard deviation) in order to determine the average amount of difference between participants and non-participants. The effect size representing the difference between the two groups was also expressed as the percent of non-overlap between the participant and non-participant distributions.
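The effect-size calculation and its conversion to percent non-overlap can be sketched as follows. The formulas follow Cohen's standard definitions (Cohen's d with a pooled standard deviation, and Cohen's U1 for non-overlap); the function names and illustrative inputs are assumptions, not taken from the study.

```python
from math import erf, sqrt

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def cohens_d(mean_a, mean_b, sd_a, sd_b, n_a, n_b):
    """Cohen's d: difference in group means divided by the pooled standard deviation."""
    pooled_var = ((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2) / (n_a + n_b - 2)
    return (mean_a - mean_b) / sqrt(pooled_var)

def percent_nonoverlap(d):
    """Cohen's U1: percent of the two distributions that does not overlap."""
    p = phi(abs(d) / 2.0)
    return 100.0 * (2.0 * p - 1.0) / p

# The study's average effect size of 0.82 corresponds to roughly 48% non-overlap.
print(round(percent_nonoverlap(0.82)))  # → 48
```

This reproduces the correspondence reported in the results: an average d of 0.82 maps to a non-overlap of approximately 48% between the participant and non-participant score distributions.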
Demographics of physician Internet CME participants (N = 8,550) and non-participant control group (N = 8,592): years since graduation from medical school; gender, number (%); degree, number (%); and direct patient care as major professional activity, number (%).
Effect size of 114 Internet CME activities by format (average effect size and % non-overlap between participants and non-participants): all Internet CME activities (n = 114), interactive text-based (n = 64), interactive case-based (n = 40), and multimedia (n = 10).
Effect size of 114 Internet CME activities by therapeutic area (average effect size and % non-overlap between participants and non-participants).
From our analysis of a large sample of physician participants in 114 different Internet-based CME activities, combining the analyses presented here with those of a previous report, it is clear that these Internet-based CME activities were effective: responses of CME participants to questions about the case vignettes were more likely to reflect evidence-based clinical choices than the responses of matched non-participants. These findings are consistent with the recent meta-analysis by Cook et al., which concluded that Internet-based CME improved participant knowledge, skills, and practice decisions, with outcomes comparable to those obtained after participation in traditional CME activities. The data also support Wong's assertion, made in a letter to the editor in response to Cook's meta-analysis, that the question of whether this innovative educational method is efficacious has been put to rest.
These results continue to demonstrate the usefulness of interactive Internet CME activities for experienced clinicians, as participants in the study had an average of 20 years in practice. The ability to search the Internet at the moment a clinical question arises may also contribute to the effectiveness of Internet CME activities.
Several recent reports have developed consistent methods for comparing outcomes and applied those methods across multiple Internet-based CME activities. In one study, standardized tests administered before and after participation in courses offered at a Canadian CME web site revealed increases in participant knowledge, confidence, and self-reported change in practice patterns resulting from participation. Improvements were noted for the majority of the 10 courses assessed. In another recent study, standardized questionnaires including questions about case vignettes were administered to participants who had completed CME courses offered at a U.S. CME web site. The results reported here support that earlier comparison of CME participants with matched non-participants, which revealed that participants had a higher likelihood of making evidence-based clinical choices across 48 different courses.
The analysis reported here used responses to questions about clinical vignettes to measure the effectiveness of the CME activities, which is an indirect method of assessing a physician's practice patterns. However, clinical case vignettes have been shown to be valid tools for measuring the quality of clinical practice [14, 15]. A limitation of the study is that the questions were administered immediately after participation in the CME activity; thus these analyses did not assess whether the improvements in physician performance were maintained over time. We also did not assess the effects of participating in the CME activities on patient health outcomes. Strengths of the study include the large number of physician participants and the varied Internet CME formats assessed.
Another possible limitation of these analyses is the exclusion of non-physician healthcare providers. The rapid growth in the number of non-physician participants in Internet-based CME activities parallels that of physician participants. Furthermore, in 2008, more non-physician participants chose Internet-based enduring materials than any other ACCME-accredited CME category. Given the predicted shortage of physicians and the increasing costs of health care, the roles of non-physician healthcare providers, such as nurse practitioners and physician assistants, are likely to continue to expand, and effective CME will be important in ensuring continued quality medical care.
In summary, this study demonstrated that physicians who participated in varied formats of selected Internet CME activities were more likely, following participation, to make evidence-based clinical choices in response to case vignettes than non-participants. These data support the assertion that Internet CME activities are effective and offer a searchable, credible, available on-demand, high-impact source of CME for physicians.
What treatment options would you consider? (select only one)
▪ Dopamine agonist
▪ Deep brain stimulation
▪ Physical therapy
What would you do at this point? (select only one)
▪ Reconsider the diagnosis
▪ Increase her dose of levodopa
▪ Add another agent
▪ Refer to a neurologist
The authors gratefully acknowledge the contributions of Kate Loughney, PhD, to the background literature review for this article, and the assistance of Joi Tisdale in data collection. Financial support for the outcomes studies reported here was received from MedscapeCME and supporters: Abbott, Amylin, Bayer, Boehringer Ingelheim, Biogen, BMS, Centers for Disease Control, Cephalon, Eli Lilly, Forest, Genentech, GlaxoSmithKline, Janssen, Merck, Novartis, Ortho McNeil, Ortho McNeil Janssen, Otsuka America Pharmaceuticals, Pharmaceutical Research and Manufacturers Association (PhRMA), Pfizer, Sanofi-aventis, Schering Plough, Solvay Pharmaceuticals, Susan G Komen Foundation, and Wyeth.
- Moores LK, Dellert E, Baumann MH, Rosen MJ: American College of Chest Physicians Health and Science Policy Committee. Introduction: Effectiveness of continuing medical education: American College of Chest Physicians Evidence-Based Educational Guidelines. Chest. 2009, 135: 5S-7S. 10.1378/chest.08-2512.
- American Medical Association: Continuing Medical Education for Licensure Reregistration. State Medical Licensure Requirements and Statistics. 2010, [http://www.ama-assn.org/ama1/pub/upload/mm/40/table16.pdf]
- ACCME Annual Report Data. 2002, [http://www.accme.org]
- ACCME Annual Report Data. 2007, [http://www.accme.org]
- Wutoh R, Boren SA, Balas EA: eLearning: a review of Internet-based continuing medical education. J Contin Educ Health Prof. 2004, 24: 20-30. 10.1002/chp.1340240105.
- Fordis M, King JE, Ballantyne CM, Jones PH, Schneider KH, Spann SJ, Greenberg SB, Greisinger AJ: Comparison of the instructional efficacy of Internet-based CME with live interactive CME workshops: a randomized controlled trial. JAMA. 2005, 294: 1043-1051. 10.1001/jama.294.9.1043.
- Cook DA: The failure of e-learning research to inform education practice, and what we can do about it. Med Teach. 2009, 31: 158-162. 10.1080/01421590802691393.
- Cohen J: Statistical Power Analysis for the Behavioral Sciences. 2nd edition. 1988, Hillsdale, NJ: Lawrence Erlbaum Associates.
- Smart D: Physician characteristics and distribution in the U.S. 2009 Edition. 2009, American Medical Association, Chicago, Illinois.
- Casebeer L, Engler S, Bennett N, Irvine M, Sulkes D, DesLauriers M, Zhang S: A controlled trial of the effectiveness of internet continuing medical education. BMC Med. 2008, 6: 37. 10.1186/1741-7015-6-37.
- Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM: Internet-based learning in the health professions: a meta-analysis. JAMA. 2008, 300: 1181-1196. 10.1001/jama.300.10.1181.
- Wong G: Internet-Based education for health professionals. JAMA. 2008, 301: 598-599. 10.1001/jama.2009.69.
- Curran V, Lockyer J, Sargeant J, Fleet L: Evaluation of learning outcomes in Web-based continuing medical education. Acad Med. 2006, 81 (10 Suppl): S30-34. 10.1097/01.ACM.0000236509.32699.f5.
- Peabody JW, Liu A: A cross-national comparison of the quality of clinical care using vignettes. Health Policy Plan. 2007, 22: 294-302. 10.1093/heapol/czm020.
- Peabody JW, Luck J, Glassman P, Jain S, Hansen J, Spell M, Lee M: Measuring the quality of physician practice by using clinical vignettes: a prospective validation study. Ann Intern Med. 2004, 141: 771-780.
- Larson EH, Hart LG: Growth and Change in the Physician Assistant Workforce in the United States, 1967-2000. J Allied Health. 2007, 36: 121-130.
- The pre-publication history for this paper can be accessed here: http://www.biomedcentral.com/1472-6920/10/42/prepub
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.