To assess the evidence-based clinical choices of physicians who participate in Internet CME activities, 114 certified stand-alone Internet CME activities posted during 2006-2008 were studied, spanning three formats: case-based, multimedia, and interactive text. Activities were eligible for assessment if they met the following criteria: 1) designed for physicians, 2) posted between January 1, 2006 and December 31, 2008, 3) certified for CME credit (1 credit for the majority of activities), and 4) presented in an on-demand archived format. Of the 114 eligible activities, 40 were interactive case-based activities, 64 were interactive text-based activities, and 10 were multimedia activities. Interactive case-based activities embed questions, usually built around patient cases, to which participants respond and receive immediate feedback. Interactive text-based activities consist mainly of conference coverage, special reports, and basic clinical updates. Multimedia activities consist mainly of live or roundtable presentations with video lectures.
Participant Selection
The controlled trial compared the evidence-based clinical choices of 8,550 participant physicians with those of a demographically matched control group of 8,592 non-participant physicians. Following participation, physicians were asked to respond to a series of clinical case questions on applying the CME content to practice. Physicians who participated in these activities were eligible for inclusion in the study if they practiced in the U.S., represented the target audience for the activity, and completed the assessment questions following participation. For each activity, a random sample of participants meeting the eligibility criteria was drawn from the overall participant group. A demographically similar group of non-participant physicians, selected at random from the American Medical Association (AMA) Physician Masterfile, was recruited to respond to the same clinical case questions. Individual activities used samples averaging 100 to 200 total respondents, with a minimum of 50 participants and 50 non-participants; 50 respondents per group was considered the minimum required for adequate statistical power at a significance level of p < 0.05. Participant and non-participant samples were matched on the following characteristics: physician specialty, degree, years in practice, whether direct patient care was their primary responsibility, and the average number of patients seen per week with the disease of interest. Because Medscape members may have participated in more than one activity, some individuals are likely counted more than once in the total of 8,550 participants; within each activity, however, only one score per individual was recorded.
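The per-group floor of 50 can be related to detectable effect size with a standard power calculation. The study does not report how the floor was derived, so the following is a minimal sketch under assumed conventions (two-sided two-sample t-test, 80% power), showing that n = 50 per group corresponds to detecting a standardized difference of roughly d ≈ 0.57:

```python
# Minimal sketch: smallest standardized effect detectable with two groups
# of n = 50 at alpha = 0.05. The 80% power target and two-sided t-test
# are assumptions; the study reports only the n >= 50 minimum.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

detectable_d = analysis.solve_power(
    effect_size=None,       # the unknown being solved for (Cohen's d)
    nobs1=50,               # participants per group (study minimum)
    alpha=0.05,             # significance level reported in the study
    power=0.80,             # conventional power target (assumption)
    ratio=1.0,              # equal participant / non-participant groups
    alternative="two-sided",
)
print(f"Detectable effect size at n=50 per group: d = {detectable_d:.2f}")
```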
Assessment
A consistent assessment approach was developed and used across all 114 CME activities included in the study. This evaluation approach included: 1) using brief case descriptions to assess clinical practice choices, 2) presenting clinical choices in a multiple-choice format, 3) using a standard hypertext markup language (HTML) approach to presenting assessment questions, 4) applying this assessment approach to the specific content of each individual activity, and 5) collecting assessment data from CME participants for each individual clinical assessment. The case vignette studies were reviewed by the Western Institutional Review Board (WIRB; Olympia, WA) in 2004, prior to this study.
A standard assessment template consisting of two clinical vignettes and five to eight multiple-choice clinical questions was developed; evidence-based responses to the case vignettes were identified from the content and references developed by the faculty for each activity. Content for each activity was written by its faculty member and referenced to clinical evidence; only content referenced to peer-reviewed publications or guidelines was considered eligible for the development of clinical case questions. The case vignettes and assessment questions were developed by clinical experts who were not involved in the design or content of the CME activity. Content validity of the case vignettes was established through review by physician medical editors of the online portal, each representing the appropriate clinical area for their set of case vignettes. An example case and its assessment questions are shown in the Appendix.
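To make the template structure concrete, the sketch below shows one plausible representation of an assessment unit and its concordance scoring. All field and class names are hypothetical; the study specifies only the counts (two vignettes, five to eight multiple-choice questions, each keyed to an evidence-based response):

```python
# Illustrative sketch of the assessment template described above.
# Names and structure are assumptions, not the study's instrument.
from dataclasses import dataclass, field

@dataclass
class Question:
    stem: str                  # clinical question posed to the physician
    choices: list[str]         # multiple-choice options
    evidence_based_index: int  # option supported by the activity's
                               # peer-reviewed references or guidelines

@dataclass
class AssessmentTemplate:
    activity_id: str
    vignettes: list[str]       # two brief case descriptions
    questions: list[Question] = field(default_factory=list)

    def score(self, responses: list[int]) -> float:
        """Fraction of responses concordant with the evidence base."""
        concordant = sum(
            r == q.evidence_based_index
            for r, q in zip(responses, self.questions)
        )
        return concordant / len(self.questions)
```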
Analysis
A statistical analysis software package (Statistical Package for the Social Sciences 17.0; SPSS; Chicago, IL) was used for data extraction, data transformation, and statistical analyses. Participant and non-participant case vignette responses were scored according to their concordance with the evidence-based content presented within each activity. Overall mean scores and pooled standard deviations were calculated for the participant and non-participant groups for each activity. These were used to calculate the educational effect size using Cohen's d (i.e., the difference in group means divided by the pooled standard deviation, the square root of the pooled variance) in order to determine the average amount of difference between participants and non-participants [8]. The effect size was also expressed as the percent of non-overlap between the participant and non-participant score distributions.
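As a minimal sketch of this effect-size step (Python standing in for the SPSS 17.0 workflow; the summary statistics below are illustrative, not study data), Cohen's d is computed from the two group means and the pooled standard deviation, then translated into Cohen's U1, the percent of non-overlap between the two distributions:

```python
# Sketch of the effect-size calculation; values are illustrative only.
from math import sqrt
from statistics import NormalDist

def cohens_d(mean_p, sd_p, n_p, mean_np, sd_np, n_np):
    """Difference in group means divided by the pooled standard deviation."""
    pooled_var = ((n_p - 1) * sd_p**2 + (n_np - 1) * sd_np**2) / (n_p + n_np - 2)
    return (mean_p - mean_np) / sqrt(pooled_var)

def percent_nonoverlap(d):
    """Cohen's U1: percent of the two distributions that does not overlap."""
    u2 = NormalDist().cdf(abs(d) / 2)
    return 100 * (2 * u2 - 1) / u2

# Hypothetical activity-level summary statistics (not study data)
d = cohens_d(mean_p=72.0, sd_p=15.0, n_p=75, mean_np=60.0, sd_np=16.0, n_np=75)
print(f"Cohen's d = {d:.2f}, non-overlap = {percent_nonoverlap(d):.1f}%")
```

For reference, a medium effect of d = 0.5 under this formula corresponds to about 33% non-overlap, matching Cohen's published tables [8].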