Long-term impact of four different strategies for delivering an on-line curriculum about herbs and other dietary supplements
© Beal et al; licensee BioMed Central Ltd. 2006
Received: 22 May 2006
Accepted: 07 August 2006
Published: 07 August 2006
Previous research has shown that internet-based education can lead to short-term improvements in clinicians' knowledge, confidence, and communication practices. We wished to determine how long these improvements last and whether they vary by curriculum delivery strategy.
As previously described, we conducted a randomized controlled trial comparing four strategies for delivering an e-curriculum about herbs and other dietary supplements (HDS) to clinicians: a) email over 10 weeks; b) email within one week; c) web site over 10 weeks; d) web site within one week. Participants were surveyed at baseline, immediately after the course, and 6–10 months after completing the course (long-term). Long-term outcomes focused on clinicians' knowledge, confidence, and communication practices.
Of the 780 clinicians who completed the course, 385 (49%) completed the long-term survey. Completers and non-completers of the long-term survey had similar demographic and professional characteristics at baseline. There were statistically significant improvements from baseline to long-term follow-up in knowledge, confidence, and communication practices; these improvements did not differ by curriculum delivery strategy. Knowledge scores improved from 67.7 ± 10.3 at baseline to 78.8 ± 12.3 at long-term follow-up (P < 0.001). Confidence scores improved from 53.7 ± 17.8 at baseline to 66.9 ± 12.0 at long-term follow-up (P < 0.001); communication scores improved from 2.6 ± 1.9 at baseline to 3.6 ± 2.1 (P < 0.001) at long-term follow-up.
This e-curriculum led to significant and sustained improvements in clinicians' expertise about HDS regardless of the delivery strategy. Future studies should compare the impact of required versus elective courses and of self-reported versus objective measures of behavior change.
Herbs and dietary supplements (HDS) are the most commonly purchased complementary medical therapies in the United States, leading to concerns about HDS safety and efficacy. Health care professionals have expressed a strong interest in HDS training courses [3–5]. However, face-to-face Continuing Medical Education (CME) courses often fail to produce sustained changes in physician behavior [6, 7]. In contrast, online CME training has been shown to improve behavior and knowledge.
We previously reported the short-term outcomes of our randomized controlled trial (RCT) comparing four strategies for delivering an on-line course about HDS to diverse clinicians. The short-term results suggested that all four delivery strategies similarly and significantly improved clinicians' knowledge, confidence, and communication practices.
To answer questions about the duration of these improvements, and whether any differences between delivery strategies would emerge over a longer follow-up, we prospectively followed study participants from the earlier RCT six to ten months after they had completed the initial study.
We conducted a prospective 6 to 10 month follow-up of an RCT comparing four strategies for delivering an e-curriculum about herbs and dietary supplements to diverse health professionals. Baseline survey questions regarding demographics, professional characteristics, and the knowledge, confidence, and communication scales have been reported previously [9, 10]. Dieticians, nurses, pharmacists, physicians, physician assistants, and trainees in one of these health professions were eligible for the study.
The intervention and delivery strategies have been described previously [9, 10]. Briefly, the curriculum consisted of 40 case-based self-instructional modules, each of which contained links to evidence-based on-line HDS resources. Enrollees were randomized to one of four curriculum delivery groups: email delivery over ten weeks (push-drip), email delivery over four days (push-bolus), web availability over ten weeks (pull-drip), and web availability over four days (pull-bolus). The curriculum was delivered in fall 2004 (concluding in December 2004) and in spring 2005 (concluding in April 2005). Immediate outcomes were assessed 11–15 weeks after randomization.
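The four arms form a 2 × 2 design: delivery channel (push = email, pull = web) crossed with pacing (drip = ten weeks, bolus = four days). As an illustration only — the arm names come from the text, but the block-randomization scheme and all identifiers below are hypothetical, not the study's actual allocation procedure — assignment to four balanced arms could be sketched as:

```python
import random

# 2 x 2 factorial arms: delivery channel (push = email, pull = web)
# crossed with pacing (drip = ten weeks, bolus = four days)
ARMS = ["push-drip", "push-bolus", "pull-drip", "pull-bolus"]

def randomize(enrollees, seed=0):
    """Assign each enrollee to one of the four arms using permuted blocks."""
    rng = random.Random(seed)
    assignments = {}
    block = []
    for person in enrollees:
        if not block:            # refill and shuffle a fresh block of all four arms
            block = ARMS[:]
            rng.shuffle(block)
        assignments[person] = block.pop()
    return assignments

# Hypothetical enrollee identifiers
groups = randomize([f"clinician-{i}" for i in range(8)])
```

Permuted-block randomization of this kind keeps arm sizes balanced as enrollment proceeds; the study's actual allocation mechanism is described in the original RCT report [9].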
During the second week of October 2005 (approximately ten months after the first group and six months after the second group had completed the course), all original enrollees were asked to complete a final course evaluation. The email request contained a link to a web page that included the same questions as the immediate outcome survey, to assess long-term retention and maintenance of knowledge, confidence, and communication practices among course enrollees. Non-respondents received up to three email requests to complete the survey before the November 30, 2005 deadline.
The primary study outcomes have also been described previously [9, 10]. Briefly, knowledge scores were the percentage of the knowledge questions answered correctly (potential range 0–100%). A confidence scale score with a possible range of 19 to 95 was derived from responses to 19 Likert-type questions such as "I feel confident responding to patients' questions about HDS"; it had a Cronbach alpha reliability statistic of 0.96. Respondents who had seen patients within the past 30 days completed the communication practices scale, with a range of scores from 0 to 10; the Cronbach alpha reliability statistic for this scale was 0.84 at baseline and 0.92 at the immediate outcome assessment.
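For readers unfamiliar with the reliability statistic cited above, Cronbach's alpha can be computed directly from an item-score matrix. The sketch below uses a small hypothetical data set (5 respondents, 4 Likert items), not the study's 19-item confidence scale:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 5 respondents x 4 Likert items scored 1-5
scores = np.array([
    [5, 4, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
])
print(round(cronbach_alpha(scores), 2))
```

Values near 1 indicate high internal consistency, as with the 0.96 reported for the confidence scale; the hypothetical items above are strongly correlated, so alpha comes out high as well.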
Chi-square methods were used to evaluate associations of categorical variables. For continuous outcome measures, t tests or analysis of variance (ANOVA) were used for normally distributed data, and Mann-Whitney U tests or Kruskal-Wallis tests for non-normally distributed variables. For repeated-measures outcomes, paired-samples t tests or Wilcoxon signed-rank tests were used, depending on data characteristics. Analyses were performed using SPSS 14.0 (SPSS Inc., Chicago, IL).
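As a minimal illustration of the repeated-measures comparisons described above, the following sketch applies SciPy's paired t test and Wilcoxon signed-rank test to hypothetical paired knowledge scores (the values are invented for illustration, not the study data; the study itself used SPSS):

```python
import numpy as np
from scipy import stats

# Hypothetical baseline and long-term knowledge scores for the same ten clinicians
baseline  = np.array([60, 65, 70, 58, 72, 66, 61, 69, 74, 63], dtype=float)
long_term = np.array([72, 75, 80, 70, 85, 74, 70, 78, 88, 71], dtype=float)

# Paired-samples t test (assumes roughly normal within-person differences)
t_stat, t_p = stats.ttest_rel(long_term, baseline)

# Non-parametric alternative for non-normal differences
w_stat, w_p = stats.wilcoxon(long_term, baseline)

print(f"paired t: t={t_stat:.2f}, p={t_p:.4f}")
print(f"wilcoxon: W={w_stat:.1f}, p={w_p:.4f}")
```

Both tests exploit the pairing of each clinician's baseline and follow-up scores, which is what distinguishes them from the independent-samples t test and Mann-Whitney U test used for between-group comparisons.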
This study was approved as "exempt" as an educational research project by the Wake Forest University School of Medicine Institutional Review Board.
Baseline Characteristics of Non-Completers and Completers of Long-term Follow-up Questionnaire

                                     Non-completers   Completers
Age, years (mean)                    41.8 ± 12.5      42.5 ± 12.7
Herb use (at baseline)
Baseline total HDS use               5.8 ± 6.4        5.6 ± 5.3
Course fee paid
Baseline knowledge (% correct)       66.2 ± 10.8      67.7 ± 10.3
Baseline confidence score            53.6 ± 18.3      53.7 ± 17.8
Baseline communication practices     2.2 ± 2.0        2.2 ± 1.9
Factors Associated with Changes in Expertise by Univariate Analysis

Baseline to long-term follow-up changes (mean ± SD):

Factor              Knowledge scores   Confidence scores   Communication scores**
Fall enrollees      5.4 ± 13.5         0.4 ± 25.9          1.2 ± 1.7
Spring enrollees    11.7 ± 12.0        18.0 ± 12.9         1.0 ± 1.7
Practitioners       8.9 ± 12.7         11.8 ± 20.2         0.9 ± 1.7
Trainees            12.2 ± 12.5        15.9 ± 16.3         1.6 ± 1.6
In this long-term follow-up study, the on-line curriculum resulted in significant and sustained improvements in knowledge, confidence, and communication for diverse clinicians regardless of delivery strategy. Outcomes were related only to semester of enrollment and to trainee versus practitioner status. Those who took the course in the spring had significantly greater improvements in knowledge and confidence scores than those enrolled in the fall. The differences between fall and spring may reflect the fact that fall completers had substantially more time to forget learned information than their spring counterparts.
Similarly, trainees had significantly greater improvements than practitioners in all three outcomes (knowledge, confidence, and communication). These differences may be due to two factors. First, trainees had lower baseline scores than practitioners, allowing greater opportunity for improvement. Second, trainees presumably have fewer experiences and habits to unlearn than practitioners.
As expected, knowledge scores decreased from initial follow-up to long-term follow-up. However, even six to ten months after completing the course, knowledge scores were significantly higher than baseline, suggesting substantial retention of the curriculum material. Confidence and communication scores progressively increased from baseline to long-term follow-up. These observations are consistent with the hypothesis that as individuals had more opportunity to practice the material they had learned, they could reinforce it, feel increasingly confident, and communicate with patients more comfortably.
The results of this study are consistent with previous research demonstrating the effectiveness of online CME courses [6, 7, 9]. Although the changes in communication in this study were statistically significant, the actual improvements were small. This is consistent with previous research suggesting that clinicians rarely communicate with patients about HDS use. Although previous research has indicated that clinicians' behavior can improve following training courses [8, 12, 13], the results of this study indicate that these behavior changes continue to improve over the long term. However, additional strategies still need to be developed to more effectively improve clinicians' communication practices.
This long-term follow-up study has several limitations. First, the sample consisted of self-selected enrollees who elected to learn more about HDS, which limits generalizability to elective courses; it is possible that outcomes would differ for participants in required courses. Second, the response rate to the long-term follow-up was low, which limits generalizability to individuals with a greater willingness to complete surveys even after the end of the initially planned study. Those willing to complete such voluntary questionnaires (which were not part of the original study "contract") may have been more knowledgeable and confident about their ability to do well; this conjecture is supported by the observation that those who completed the long-term follow-up had slightly, but significantly, higher knowledge scores than the non-respondents. Third, the study relied on self-reported changes in confidence and communication, which may overestimate actual behavioral changes [14–16]; future studies in this field should corroborate self-report with objective measures of clinician behavior. Finally, we did not collect information on the actual costs of delivering the curriculum through each method, because study personnel were engaged in both offering and studying the intervention and did not separately allocate research and education efforts. However, our impression is that pull-bolus delivery is the least expensive for participants such as those in this study.
Despite these limitations, results from this long-term follow-up study have important implications for professional education and future research. An online case-based curriculum with links to evidence-based resources produced significant and sustained improvements in knowledge, confidence, and communication. These improvements were substantial and did not appear to depend on the delivery strategy, at least among motivated clinicians. Educators can therefore offer on-line Continuing Education (CE) courses with confidence and, because the delivery strategy does not affect attainment of learning goals, can choose the most convenient and lowest-cost delivery method. Future studies of on-line CME should examine whether required curricula achieve outcomes similar to elective courses, and should develop interventions that further improve clinicians' communication practices.
We would like to thank Jessica Gobble, Michael Lischke, and the Northwest AHEC staff for their invaluable assistance; and Eleanor Russell and the Wake Forest University Physician Assistant Department for their support throughout this project. This work was supported by NIH grant R01 LM007709 from the NIH National Library of Medicine and by the Fullerton Foundation of Gaffney, South Carolina.
- Izzo AA: Herb-drug interactions: an overview of the clinical evidence. Fundam Clin Pharmacol. 2005, 19: 1-16. 10.1111/j.1472-8206.2004.00301.x.
- Kemper KJ, Amata-Kynvi A, Dvorkin L, Whelan JS, Woolf A, Samuels RC, Hibberd P: Herbs and other dietary supplements: healthcare professionals' knowledge, attitudes, and practices. Altern Ther Health Med. 2003, 9: 42-49.
- Kreitzer MJ, Mitten D, Harris I, Shandeling J: Attitudes toward CAM among medical, nursing, and pharmacy faculty and students: a comparative analysis. Altern Ther Health Med. 2002, 8: 44-3.
- Whitcomb ME: CME reform: an imperative for improving the quality of medical care. Acad Med. 2002, 77: 943-944.
- Fordis M, King JE, Ballantyne CM, Jones PH, Schneider KH, Spann SJ, Greenberg SB, Greisinger AJ: Comparison of the instructional efficacy of Internet-based CME with live interactive CME workshops: a randomized controlled trial. JAMA. 2005, 294: 1043-1051. 10.1001/jama.294.9.1043.
- Davis D, O'Brien MA, Freemantle N, Wolf FM, Mazmanian P, Taylor-Vaisey A: Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes?. JAMA. 1999, 282: 867-874. 10.1001/jama.282.9.867.
- Kemper KJ, Amata-Kynvi A, Sanghavi D, Whelan JS, Dvorkin L, Woolf A, Samuels RC, Hibberd P: Randomized trial of an internet curriculum on herbs and other dietary supplements for health care professionals. Acad Med. 2002, 77: 882-889. 10.1097/00001888-200209000-00014.
- Kemper KJ, Gardiner P, Gobble J, Mitra A, Woods C: Randomized controlled trial comparing four strategies for delivering e-curriculum to health care professionals [ISRCTN88148532]. BMC Med Educ. 2006, 6: 2. 10.1186/1472-6920-6-2.
- Kemper KJ, Gardiner P, Gobble J, Woods C: Expertise about herbs and dietary supplements among diverse health professionals. BMC Complement Altern Med. 2006, 6: 15. 10.1186/1472-6882-6-15.
- Jaski ME, Schwartzberg JG, Guttman RA, Noorani M: Medication review and documentation in physician office practice. Eff Clin Pract. 2000, 3: 31-34.
- Cockayne NL, Duguid M, Shenfield GM: Health professionals rarely record history of complementary and alternative medicines. Br J Clin Pharmacol. 2005, 59: 254-258. 10.1111/j.1365-2125.2004.02328.x.
- Hudson K, Brady E, Rapp D: What you and your patients should know about herbal medicines. JAAPA. 2001, 14: 27-4.
- Fowles JB, Rosheim K, Fowler EJ, Craft C, Arrichiello L: The validity of self-reported diabetes quality of care measures. Int J Qual Health Care. 1999, 11: 407-412. 10.1093/intqhc/11.5.407.
- Glintborg B, Andersen SE, Spang-Hanssen E, Dalhoff K: Disregarded use of herbal medical products and dietary supplements among surgical and medical patients as estimated by home inspection and interview. Pharmacoepidemiol Drug Saf. 2005, 14: 639-645. 10.1002/pds.1049.
- Veninga CC, Denig P, Pont LG, Haaijer-Ruskamp FM: Comparison of indicators assessing the quality of drug prescribing for asthma. Health Serv Res. 2001, 36: 143-161.
- The pre-publication history for this paper can be accessed here: http://www.biomedcentral.com/1472-6920/6/39/prepub
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.