
Competency-based evaluation tools for integrative medicine training in family medicine residency: a pilot study

Abstract

Background

As more integrative medicine educational content is integrated into conventional family medicine teaching, the need for effective evaluation strategies grows. Through the Integrative Family Medicine (IFM) program, a six-site pilot of a four-year residency training model combining integrative medicine and family medicine training, we have developed and tested a set of competency-based evaluation tools to assess residents' skills in integrative medicine history-taking and treatment planning. This paper presents the results from the implementation of direct observation and treatment plan evaluation tools, as well as the results of two Objective Structured Clinical Examinations (OSCEs) developed for the program.

Methods

The direct observation (DO) and treatment plan (TP) evaluation tools developed for the IFM program were implemented by faculty at each of the six sites during the PGY-4 year (n = 11 for the DO tool and n = 8 for the TP tool). OSCE I was first implemented in 2005 (n = 6), revised, and then implemented with a second class of IFM participants in 2006 (n = 7). OSCE II was implemented in fall 2005 with a single class of IFM participants (n = 6).

Data from the initial implementation of these tools are described using descriptive statistics.

Results

Results from the implementation of these tools at the IFM sites suggest that our curriculum needs more emphasis on incorporating spirituality into history-taking and treatment planning, and that IFM residents need more training in assessing readiness for change and in strategies for delivering integrative medicine treatment recommendations. Focusing our OSCE assessment more narrowly on integrative medicine history-taking skills was much more effective in delineating strengths and weaknesses in our residents' performance than using the OSCE for both integrative and more basic communication competencies.

Conclusion

As these tools are refined further, they will be of value both in improving teaching within the IFM program and as competency-based evaluation resources for the expanding number of family medicine residency programs incorporating integrative medicine into their curricula. The next stages of work on these instruments will involve establishing inter-rater reliability and defining more clearly the specific behaviors which we believe establish competency in the integrative medicine skills defined for the program.


Background

In 2000, a set of suggested curriculum guidelines [1] for family medicine residents in Complementary/Alternative Medicine (CAM) was published. These guidelines were endorsed by the Board of the Society of Teachers of Family Medicine in 1999 and represented the first published set of recommendations for residency-level teaching in this area. However, no well-defined set of measurable competencies has been identified and described for family medicine residents, and no competency-based evaluation tools have been developed and widely adopted to date. This article describes the ongoing efforts at six residency programs participating in a pilot program in "Integrative Family Medicine" to develop competencies in this area and the tools to measure them effectively. The term CAM has been widely replaced by "Integrative Medicine" (IM), which is defined as "healing-oriented medicine that takes account of the whole person (body, mind, and spirit), including all aspects of lifestyle. It emphasizes the therapeutic relationship and makes use of all appropriate therapies, both conventional and alternative" [2].

Beginning in 2003, six residency programs joined with the University of Arizona Program in Integrative Medicine to collaboratively develop a four-year combined family medicine/integrative medicine residency program. The two goals of the program are to develop and implement an accredited model for a four-year Integrative Family Medicine (IFM) training program which combines training in integrative medicine with conventional family medicine residency training, and to train physicians capable of practicing healing-oriented medicine within the context of the whole person (mind, body, and spirit), with the ability to employ all appropriate therapeutic options, both conventional and alternative.

The six sites participating in the IFM program are the University of Arizona, Beth Israel/Albert Einstein College of Medicine, Maine Medical Center, Middlesex Hospital (University of Connecticut), Oregon Health Sciences University, and the University of Wisconsin. The Accreditation Council for Graduate Medical Education Family Medicine Residency Review Committee (RRC) awarded this pilot program experimental status in 2003. IFM residents from participating sites are identified in the early part of their PGY-2 year and begin the integrative medicine component of the training experience in January of that year. Each of the six sites has a faculty member trained in integrative medicine who serves as the mentor for the resident's clinical experience during the program.

The core elements of the IFM program are outlined in Table 1. The details of the IFM curriculum have been described elsewhere, as has its early impact on recruitment at the six pilot sites [3]. This paper describes our efforts to develop and test a set of measurement tools to assess competencies for integrative medicine at the residency level.

Table 1 Core IFM Program Elements

Development of the IFM competencies

Several sets of competencies in integrative medicine, developed for use in other settings, were reviewed as we began to describe the required competencies for the IFM program [4]. In an iterative process of dialogue involving the faculty members at the six sites, we arrived at the set of competencies listed in Table 2.

Table 2 IFM Competencies with ACGME domains

The most significant challenge to reaching consensus was deciding whether a number of the communication and patient-centered care competencies were already being adequately taught and measured in standard family medicine residency training, and thus should be omitted from the specific competencies for this program, or whether these areas – so central to the vision of integrative medicine – should be included despite the obvious overlap with the conventional family medicine curriculum. A decision was made to include both the more general communication competencies and the integrative medicine-specific competencies for this first round of evaluation. These competencies were then mapped to the six Accreditation Council for Graduate Medical Education (ACGME) competency domains: patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice. The IFM competencies and their relationship to the ACGME competencies are depicted in Table 2.

Competency-based evaluation strategies

Once competencies were identified, evaluation strategies specific to the IFM program were developed based on faculty experience, a review of the evaluation literature, recommendations from the ACGME Outcome Project [5], and adoption of methods currently used in the family medicine residency programs of the participating institutions. These strategies include direct observation, reviews of written treatment plans, Objective Structured Clinical Examinations (OSCEs) administered during the second and third residential weeks at the University of Arizona in the PGY-3 and PGY-4 years, and ongoing formative evaluation meetings with faculty. Knowledge-based evaluation tools used in the University of Arizona Integrative Medicine Fellowship curriculum were also incorporated. The overall evaluation plan is presented in Table 3, and each of the evaluation strategies developed for the IFM program is described below.

Table 3 ACGME-linked IFM Competencies and evaluation strategies

Direct observation evaluation tool

We developed a direct observation checklist that delineates the specific behaviors expected of IFM participants in the direct care of patients. Behaviors included in the instrument reflect the program competencies and map across the ACGME general competencies. The tool builds on a number of previous efforts in assessing competence in graduate medical education, including the widely used Mini-CEX [6], and expands on these efforts by incorporating specific behaviors reflecting the competencies in integrative medicine outlined above. Evaluating faculty rate these behaviors as emerging (beginning to show this skill), established (basic knowledge/skills attained and demonstrated routinely), or integrated (uses knowledge/skills flexibly as part of an overall repertoire). If a behavior has not been observed directly by faculty, or if the encounter provides insufficient evidence of the resident's competency in that area, it is rated DNO (did not observe). Behaviors that are repeatedly rated as DNO will be deleted in the finalized version of the IFM direct observation tool.

Each resident is evaluated approximately semi-annually; scores for each resident are compared to expected outcomes based on year of residency and then compared across years of training to determine professional growth and change. To emphasize the formative aspect of the methodology, each direct observation is followed by a debriefing session, which incorporates both (a) review of the behaviors observed by faculty and (b) reflection on the encounter experience. We believe such reflection is critical to the good practice of integrative medicine. In the debrief component of the direct observation evaluation, the resident is asked: (a) How did you feel during the encounter? (b) What, if anything, did this encounter teach you about yourself? and (c) Reflecting back, is there anything else you could have done to enhance healing during this encounter?

Treatment plan evaluation tool

Similarly, we developed a treatment plan evaluation tool that delineates the specific behaviors and language expected of IFM participants when developing and modifying evidence-based integrative treatment plans for specific patients. The same rating scale is used for the behaviors, which again are linked to the IFM program competencies as well as the ACGME General Competencies. Two treatment plans from each resident are evaluated annually. These scores are shared with the resident during a debriefing session.

Objective Structured Clinical Examinations (OSCEs)

The OSCE component of our evaluation strategy has been implemented via two OSCE experiences: the first, OSCE I, takes place in the middle of the PGY-3 year, and the second, OSCE II, in the middle of the PGY-4 year. OSCE I examines integrative medicine history-taking skills; OSCE II assesses the development of an integrative medicine treatment plan and the effective communication of that plan to the patient. For OSCE I, residents are presented with a patient experiencing migraines; for OSCE II, with a patient at risk for cardiovascular disease. Each OSCE represents an encounter with a single Standardized Patient. We recognize that using a single patient raises potential problems with content specificity and with attaining a generalizable estimate of clinical performance; however, for this initial round of OSCE implementation, logistical constraints limited us to a single patient encounter. We hope to develop multiple OSCE stations for future iterations of this process.

The data from the first implementation of OSCE I (2005) were reported elsewhere [4], and led to a significant revision of OSCE I (2006) for the next class, which is presented here. Additionally, the first implementation of OSCE II (Fall 2005) is reported here. OSCE behaviors for both OSCE II (2005) and OSCE I (2006) were again linked to IFM program competencies and ACGME General Competencies.

Methods

The direct observation (DO) and treatment plan (TP) evaluation tools were implemented by faculty at each of the six sites during the PGY-4 year (n = 11 for the DO tool and n = 8 for the TP tool). OSCE I was first implemented in 2005 (n = 6), revised, and then implemented with a second class of IFM participants in 2006 (n = 7). OSCE II was implemented in fall 2005 with a single class of IFM participants (n = 6). This pilot study was approved by the University of Arizona Institutional Review Board.

Data from the initial implementation of these tools were then examined (a) to evaluate the effectiveness of our teaching and (b) to begin to evaluate each tool's utility in our evaluation process. If residents performed poorly on a given competency across IFM sites and evaluation strategies, this might represent a deficiency in our teaching program. In contrast, if a particular behavior could not be evaluated consistently across sites (i.e., a given competency was repeatedly rated as DNO), this might mean that the competency (a) was not clearly defined or stated, or (b) was not well suited to the type of evaluation (direct observation, review of treatment plans, or standardized patients) for which it was proposed. Each of these findings would be used to suggest changes to the next iteration of the evaluation tool. Descriptive statistics were used to describe the data collected from the DO and TP tools.

Results

Direct observation and treatment plan evaluation tools

In order to evaluate the effectiveness of teaching and to determine areas of weakness within the training program, scores for individual competencies on the direct observation and treatment plan evaluation tools were examined across participants and program sites. We hypothesized that such weakness would emerge as consistently low scores across sites for a given item; for the purposes of this analysis, we flagged items where the mean score was 2.5 (83%) or below on the three-point scale. We then examined the data across sites for behaviors that observers consistently (≥ 50% of the time) identified as "unable to evaluate" or "did not observe" in order to evaluate the utility of the evaluation tools. The results of the direct observation evaluation are listed in Table 4, and the results of the treatment plan evaluation are reported in Table 5.

Table 4 Scores on Direct Observation Evaluation Tool Across Program Sites (n = 11)
Table 5 Scores on Treatment Plan Evaluation Tool Across Program Sites (n = 9)
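As a purely illustrative sketch (not part of the original analysis), the item-flagging logic described above can be expressed in a few lines of Python. It assumes a hypothetical data layout in which each checklist item maps to a list of ratings coded 1 = emerging, 2 = established, 3 = integrated, with None standing in for DNO; items are flagged when the mean observed rating is 2.5 or below, or when at least 50% of observations are DNO.

    # Illustrative sketch only: hypothetical data layout, not the IFM dataset.
    # Ratings are coded 1 = emerging, 2 = established, 3 = integrated; None = DNO.
    from statistics import mean

    LOW_SCORE_CUTOFF = 2.5   # 83% of the three-point maximum
    DNO_CUTOFF = 0.50        # flagged if >= 50% of observations are DNO

    def flag_items(ratings_by_item):
        """ratings_by_item: dict mapping item name -> list of ratings (None = DNO).
        Returns (low_scoring_items, frequently_dno_items)."""
        low_scoring, frequently_dno = [], []
        for item, ratings in ratings_by_item.items():
            observed = [r for r in ratings if r is not None]
            dno_rate = (len(ratings) - len(observed)) / len(ratings)
            if dno_rate >= DNO_CUTOFF:
                frequently_dno.append(item)   # candidate for rewording or removal
            elif observed and mean(observed) <= LOW_SCORE_CUTOFF:
                low_scoring.append(item)      # possible curricular weakness
        return low_scoring, frequently_dno

    # Hypothetical example: 11 residents rated on two items
    example = {
        "incorporates spirituality into history": [2, 2, None, 3, 2, 2, None, 2, 3, 2, 2],
        "assesses readiness for change": [None, None, 2, None, None, 3, None, None, 2, None, None],
    }
    print(flag_items(example))  # first item flagged as low scoring, second as frequently DNO

Under this reading of the analysis, items in the first list would be interpreted as possible teaching gaps, and items in the second as candidates for rewording or removal from the tool.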

OSCE results

Results from OSCE I (2005) were previously reported [4]. Results of OSCE I (2006) are displayed in Table 6. The range of scores on OSCE I (2005) was 80–100%, with a mean score of 90%. As can be seen from the sampling of scores in Table 6, the change in scores between 2005 and 2006 suggests that the revisions to the OSCE competencies were helpful; the competencies now identified in OSCE I are more specific to integrative medicine and less repetitive of skills already associated with family medicine residency. In OSCE I (2005), almost no participant failed to meet the more basic patient-care and communication competencies. Total scores in OSCE I (2006) dropped, providing much more useful information as to which areas of the IFM participants' training need more attention in the PGY-2/3 years.

Table 6 OSCE I (2006) IFM Competency Scores*

Outcomes from OSCE I (2006) also suggest that although basic communication skills can be assumed to be established in our IFM participants by the middle of the PGY-3 year, we still have significant work to do in the area of comprehensive history-taking, both conventional and integrative. Perhaps most surprising is that only 57% of participants asked about current and past use of CAM, only 57% adequately screened for depression, and only 57% recognized that both stress and neck pain could be contributing to the migraines in the OSCE patient. Only 29% inquired in depth about how the migraines might be affecting the patient's work or personal relationships.

OSCE II was designed to assess knowledge base, treatment planning, and communication skills. The OSCE II (2005) results (Table 7) suggest that IFM participants need more training regarding the role of spirituality in cardiovascular health, as well as in the use of fish oils, botanical medicines, and nutritional supplements for cardiovascular risk reduction. They also suggest – at least with respect to diet change and exercise recommendations – that despite our emphasis on patient empowerment and informed choice, residents may need to be reminded to explain why a set of recommendations might be beneficial when delivering those recommendations. It should be acknowledged, of course, that because of the small sample size these percentages are fragile, which makes drawing definitive conclusions difficult.

Table 7 OSCE II (2005) IFM Competency Scores

Discussion

Two fundamental goals directed our development of competency-based evaluation tools for family physicians training in integrative medicine. Our immediate goal was to develop an effective evaluation strategy for the participants in the Integrative Family Medicine pilot program, meeting internal evaluation needs as well as those of the Family Medicine Residency Review Committee (RRC). Our long-term goal is to contribute measurement tools for broad use in family medicine residency programs. As more integrative medicine educational content is integrated into conventional family medicine teaching, the need for effective evaluation strategies grows. Curriculum content now being introduced into family medicine residency training includes the use of botanicals and nutritional supplements, specific dietary strategies such as the anti-inflammatory diet, and mind-body approaches. Evidence-based discussion of these areas regularly appears in American Family Physician reviews [7], and the number of questions on these topics on the American Board of Family Practice certification exam continues to increase. A recent Institute of Medicine report recommends that "...health profession schools (e.g. schools of medicine, nursing, pharmacy, and allied health) [should] incorporate sufficient information about CAM into the standard curriculum at the undergraduate, graduate and post graduate levels to enable licensed professionals to competently advise their patients about CAM" [8].

However, there are currently no competency-based evaluation tools that effectively assess competencies in evidence-based integrative medicine for use in residency programs. Our intention is to share our experience in developing and utilizing tools that may be modified for use in such programs.

We found the direct observation checklist a useful tool, particularly when it incorporates a debriefing session following the observation. The addition of this element allowed us to re-emphasize the "personal process" dimension of the integrative interview, which emphasizes reflection and countertransference as clinical skills essential to the integrative approach. The limitation, as with all such tools, lies in the difficulty of standardizing across observers and sites the exact behaviors that must be observed to demonstrate competency in a given area. We plan to establish better face validity for the next iteration of these tools by defining the expected behaviors more clearly, using input from experts in the field of integrative medicine outside of the IFM faculty. Whereas the six IFM programs all have an "expert" faculty member who can rely on their own experience in assessing competency in herbal prescribing or the incorporation of mind-body strategies into treatment planning, other residencies might lack such expertise among their faculty. Therefore, defining more clearly the behaviors which represent competency will be critical to the potential broader use of this instrument in non-IFM residencies.

The reliability of these tools also needs to be established if they are to enter wider use in residency programs. At our next faculty meeting, we plan to hold an "observation rating session" in which the six faculty in the program all observe the same resident in a patient encounter and rate the performance using the DO tool. We will then use these data to measure inter-rater reliability for this tool. We are planning a similar process to establish inter-rater reliability for the treatment plan evaluation tool during the coming year.
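The reliability statistic to be used has not yet been specified; as one possibility for the session described above, the sketch below computes Fleiss' kappa across the six raters' categorical ratings, under the assumptions that DNO-rated items are excluded and that every item is rated by all six faculty. The data shown are hypothetical.

    # Illustrative sketch only: one possible inter-rater reliability summary
    # (Fleiss' kappa) for six raters scoring the same encounter; not the
    # statistic the program has committed to.
    from collections import Counter

    CATEGORIES = ["emerging", "established", "integrated"]  # DNO-rated items excluded

    def fleiss_kappa(ratings_per_item):
        """ratings_per_item: one list of categorical ratings per checklist item,
        each item rated by the same number of raters."""
        n_items = len(ratings_per_item)
        n_raters = len(ratings_per_item[0])
        category_totals = Counter()
        agreement_sum = 0.0
        for ratings in ratings_per_item:
            counts = Counter(ratings)
            category_totals.update(counts)
            # proportion of agreeing rater pairs for this item
            agreement_sum += (sum(c * c for c in counts.values()) - n_raters) / (
                n_raters * (n_raters - 1))
        p_bar = agreement_sum / n_items                        # mean observed agreement
        p_j = [category_totals[c] / (n_items * n_raters) for c in CATEGORIES]
        p_e = sum(p * p for p in p_j)                          # chance agreement
        return (p_bar - p_e) / (1 - p_e)                       # undefined if p_e == 1

    # Hypothetical ratings: six faculty scoring the same encounter on three items
    example = [
        ["established"] * 5 + ["integrated"],
        ["emerging", "established", "established", "established", "emerging", "established"],
        ["integrated"] * 6,
    ]
    print(round(fleiss_kappa(example), 2))   # approximately 0.51 for this example

Simple percent agreement could be reported alongside kappa, since chance-corrected statistics can behave erratically when nearly all ratings fall into a single category.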

Our initial OSCE experience revealed that we were measuring basic patient-centered communication skills which more properly belong in the domain of conventional family medicine. The performance of IFM participants in the initial OSCE I (2005) confirmed that by the PGY-3 year these skills have been thoroughly mastered and should be considered prerequisites for IFM training rather than objectives of the program. Our second iteration, OSCE I (2006), focused more narrowly on integrative medicine history-taking – competencies 6 through 14 specifically – and was much more effective in delineating strengths and weaknesses in our residents' performance. We are hopeful that, as we develop new OSCEs focused more closely on these specific integrative medicine competencies, they will be useful, either wholly or in part, for residencies to incorporate into their ongoing OSCE assessment strategies.

Recognizing the infancy of both the IFM program and these evaluation strategies – and the fact that we do not know whether the areas of weakness identified reflect actual resident performance or weaknesses in the design of our tools – two areas of relatively weak performance appeared both on the DO/TP assessment and on the OSCEs. First, as discussed above, there is an apparent pattern of weakness in how treatment recommendations are communicated to patients, i.e. how often obstacles to implementation are adequately discussed and how effectively residents assess readiness for change. Second, IFM participants apparently often do not discuss spirituality with patients. Because both findings appear in both evaluation settings, our hypothesis is that they represent real weaknesses in performance rather than weaknesses in our measurement strategies. It is also possible, of course, that due to our small "n" and variability across sites, these results represent evaluation artifact or confounding rather than actual weakness in performance and/or curriculum. We plan to address both of these areas more actively in our curriculum in the coming year, and our hope is that this will lead to improvement in competency scores on the next round of DO/TP and OSCE evaluation scheduled for 2007 and to a change in how our rising PGY-4s perform in these areas.

Conclusion

The Integrative Family Medicine program has created a unique need and opportunity to develop specific integrative medicine competencies for residents, as well as the tools to measure them. To date, OSCEs exploring the expanded integrative history and treatment plan have been developed, assessed, and modified. These skills have been further assessed with a direct observation tool and a treatment plan review tool. As these tools are refined further, they are likely to be of value to the expanding number of family medicine residency programs incorporating integrative medicine into their curricula.

The next stages of work on these instruments will involve establishing inter-rater reliability by having faculty independently rate the same clinical encounter, and establishing face validity by defining more clearly, with the input of experts in the field of integrative medicine outside of the IFM faculty, the specific behaviors which we believe represent competency in the integrative medicine skills we have defined for the program.

References

  1. Kligler B, Gordon A, Stuart M, Sierpina V: Suggested curriculum guidelines on complementary and alternative medicine: recommendations of the Society of Teachers of Family Medicine Group on Alternative Medicine. Family Medicine. 2000, 32 (1): 30-3.


  2. Maizes V, Horwitz R: Ethics, education and integrative medicine. Ethics JAMA. 2004, 6 (11). [http://www.ama-assn.org/ama/pub/category/13194.html]. Accessed February 1, 2007.


  3. Kligler B, Maizes V, Schachter S, Park CM, Gaudet T, Benn R, Lee R, Remen RN: Education Working Group, Consortium of Academic Health Centers for Integrative Medicine. Core competencies in integrative medicine for medical school curricula: a proposal. Acad Med. 2004, 79 (6): 521-31. 10.1097/00001888-200406000-00006.


  4. Maizes V, Silverman H, Lebensohn P, Koithan M, Kligler B, Rakel D, Schneider C, Kohatsu W, Hayes M, Weil A: The integrative family medicine program: an innovation in residency education. Acad Med. 2006, 81 (6): 583-589. 10.1097/01.ACM.0000225225.35399.e4.


  5. Accreditation Council for Graduate Medical Education Outcome Project. [http://www.acgme.org/outcome/comp/compFull.asp]. Accessed May 18, 2006.

  6. Norcini JJ: The Mini Clinical Evaluation Exercise (mini-CEX). The Clinical Teacher. 2005, 2 (1): 25-30. 10.1111/j.1743-498X.2005.00060.x.


  7. Wright J: "Inside AFP: AFP Begins Updates on Complementary and Alternative Medicine." January 1, 2003. [http://www.aafp.org/afp/20030101/inside.html]. Accessed June 27, 2006.

  8. Institute of Medicine Committee on the Use of Complementary and Alternative Medicine: Complementary and Alternative Medicine in the United States. 2005, Washington, DC: National Academy of Sciences.



Acknowledgements

David Rakel, M.D., Howard Silverman, M.D., and Wendy Kohatsu, M.D., also participated in developing the evaluation tools and in collecting evaluation data at the IFM sites. Paula Cook, MS, assisted in data entry and analysis of the results.

The contents of this paper were developed under a grant from the U.S. Department of Education. However, those contents do not necessarily represent the policy of the Department of Education, and readers should not assume endorsement by the federal government.

Author information

Corresponding author

Correspondence to Benjamin Kligler.

Additional information

Competing interests

The author(s) declare that they have no competing interests.

Authors' contributions

All of the authors participated in conceiving the evaluation process and tools described in the article. BK coordinated the writing of the manuscript and assisted in the analysis of the evaluation data. MK was responsible for analysis of the evaluation data and contributed to the writing of the methods, results and discussion sections. VM oversees the IFM program and contributed to the writing of the Background and Discussion sections. MH, CS, PL and SH were responsible for collecting the evaluation data on the participants and contributed significantly to the writing of the article. All authors read and approved the final manuscript.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Kligler, B., Koithan, M., Maizes, V. et al. Competency-based evaluation tools for integrative medicine training in family medicine residency: a pilot study. BMC Med Educ 7, 7 (2007). https://doi.org/10.1186/1472-6920-7-7
