Impact of subspecialty elective exposures on outcomes on the American Board of Internal Medicine certification examination
BMC Medical Education, volume 12, Article number: 94 (2012)
The American Board of Internal Medicine Certification Examination (ABIM-CE) is one of several methods used to assess medical knowledge, an Accreditation Council for Graduate Medical Education (ACGME) core competency for graduating internal medicine residents. With recent changes in graduate medical education, program directors and internal medicine residents are seeking evidence to guide decisions regarding residency elective choices. Prior studies have shown that formalized elective curricula improve subspecialty ABIM-CE scores. The primary aim of this study was to evaluate whether the number of subspecialty elective exposures, or the specific subspecialties in which residents complete electives, impacts ABIM-CE scores.
ABIM-CE scores, elective exposures and demographic characteristics were collected for MedStar Georgetown University Hospital internal medicine residents who were first-time takers of the ABIM-CE in 2006–2010 (n=152). An elective exposure was defined as a two-week period assigned to the respective subspecialty. ABIM-CE score was analyzed using the difference between the ABIM-CE score and the standardized passing score (delta-SPS). Subspecialty scores were analyzed using the percentage of correct responses. Data were analyzed using GraphPad Prism version 5.00 for Windows.
Paired elective exposure and ABIM-CE scores were available for 131 residents. There was no linear correlation between ABIM-CE mean delta-SPS and either the total number of electives or the number of unique elective exposures. Residents with ≤14 elective exposures had a higher ABIM-CE mean delta-SPS than those with ≥15 elective exposures (143.4 compared to 129.7, p=0.051). Repeated electives in individual subspecialties were not associated with a significant difference in mean ABIM-CE delta-SPS.
This study did not demonstrate significant positive associations between individual subspecialty elective exposures and ABIM-CE mean delta-SPS score. Residents with ≤14 elective exposures had a higher ABIM-CE mean delta-SPS than those with ≥15 elective exposures, suggesting there may be an “ideal” number of elective exposures that supports improved ABIM-CE performance. Repeated elective exposures in an individual specialty did not correlate with overall or subspecialty ABIM-CE performance.
Medical knowledge is an Accreditation Council for Graduate Medical Education (ACGME) core competency. Scores on the American Board of Internal Medicine Certification Examination (ABIM-CE) are one of several methods used to assess medical knowledge in graduating medical residents. ACGME requirements have dramatically changed residency education over the last five years, and limitations on resident duty hours implemented in 2003 have placed a burden on program directors to ensure that residents’ time is well distributed between service duties and educational experiences. Concomitant with the duty hours limitations, there has been a shift towards measurable educational outcomes in resident training. Program directors have to balance the resident educational experience, to ensure that residents gain the required medical knowledge to pass the ABIM-CE, while also permitting resident autonomy to redirect their training experience to achieve their intended career goals [2–4].
To assist programs in evaluating resident performance, the ACGME has defined a toolbox of methods for assessing the six required competencies, including medical knowledge. Many programs have now adopted these methods, which include the Internal Medicine In-Training Examination (IM-ITE), the American Board of Internal Medicine Certification Examination (ABIM-CE), standardized patients, objective structured clinical examinations (OSCE), 360-degree multi-source evaluations, program director medical knowledge scores, patient surveys, portfolios, oral exams and checklists. Performance on the IM-ITE is a predictor of performance on the ABIM-CE [6, 7], and studies have shown outcomes on the IM-ITE can be improved with conference attendance and self-directed reading of electronic knowledge resources [8, 9]. While standardized test performance does not encompass all of the competencies required of a graduating resident, satisfactory performance on the ABIM-CE is a goal common to all graduating residents and to all internal medicine residency programs. Therefore, it is important to assess the factors that impact resident performance on the ABIM-CE.
The choice of elective exposures is one of the few components of residency training over which individual residents maintain autonomy, yet there is a paucity of data to guide residents in selecting subspecialty electives. For a variety of reasons, residents may pursue several electives in one particular subspecialty but not rotate through other subspecialties at all. Prior studies have shown that subspecialty ABIM-CE performance can be improved by developing structured curricula within the elective experience. However, data on the association between subspecialty elective exposures and ABIM-CE performance are lacking and would be of great value to program directors as they reorganize and develop graduate medical education programs.
The primary aim of the current study is to evaluate the impact of individual subspecialty elective exposures on resident ABIM-CE scores within a university based residency program. While we predict exposure to individual subspecialty electives might be associated with subspecialty ABIM-CE performance, we hypothesize that repeated exposures in a single specialty may not further improve ABIM-CE performance.
This study was approved by the MedStar Georgetown University Hospital (MGUH) Institutional Review Board. ABIM-CE score reports, elective exposures, and demographic characteristics were collected for all internal medicine categorical residents enrolled in the MGUH Internal Medicine residency program who took the ABIM-CE for the first time between 2006 and 2010. ABIM-CE scores were released by the program director and de-identified by an investigator not involved in medical education to ensure confidentiality. Residents who declined to release their ABIM-CE score report to the program director as well as those who had transferred into the program were excluded from the analysis due to incomplete elective exposure and ABIM-CE data.
Demographic information was obtained for each resident, including gender, age at time of ABIM-CE, and whether the resident graduated from a US or international medical school.
ABIM-CE score abstraction and calculation of delta-SPS
ABIM-CE scores include a standardized test score and a report of the total number of items correct, with a breakdown of correct responses by subspecialty. The standardized test score incorporates a weighting based on test characteristics; it is therefore not directly comparable from year to year. To account for this, and to allow a more accurate measure of the spread of scores, the difference between the individual resident standardized score and the standardized passing score for the exam year was calculated (the delta-standardized passing score, or delta-SPS). Raw scores were additionally used to compute the percentage of items correct in each subspecialty.
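The delta-SPS is a simple per-year normalization. A minimal sketch of the two derived measures described above (the passing scores below are hypothetical placeholders, not the ABIM's actual values, which vary with test characteristics):

```python
# Hypothetical standardized passing scores (SPS) per exam year;
# the real values are set by the ABIM and differ year to year.
PASSING_SCORE_BY_YEAR = {2006: 366, 2007: 370, 2008: 365, 2009: 372, 2010: 368}

def delta_sps(standardized_score: int, exam_year: int) -> int:
    """Difference between a resident's standardized score and the
    standardized passing score for that exam year, which makes
    scores comparable across years."""
    return standardized_score - PASSING_SCORE_BY_YEAR[exam_year]

def subspecialty_pct_correct(correct: int, total: int) -> float:
    """Percentage of items answered correctly in one subspecialty."""
    return 100.0 * correct / total
```

For example, a resident scoring 500 on the hypothetical 2006 exam would have a delta-SPS of 134 under these assumed passing scores.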
Elective exposure data collection
Using on-call schedule records, elective exposure data were collected for all residents included in the study. To allow consistent comparisons between electives of differing duration, an elective exposure was defined as a two-week period in the respective subspecialty. Thus, a resident completing a four-week elective in gastroenterology would be considered to have had two two-week exposures to gastroenterology. Residents who transferred into the MGUH residency program did not have accurate data on elective exposures prior to enrollment in the MGUH program; therefore, these residents were excluded from further analysis.
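The exposure-counting convention above can be sketched as a simple tally (the schedule records here are hypothetical; this assumes elective durations come in whole multiples of two weeks, as in the rotation blocks described):

```python
from collections import Counter

def count_exposures(schedule):
    """Tally two-week elective exposures per subspecialty.

    `schedule` is a list of (subspecialty, weeks) records; a four-week
    gastroenterology elective therefore contributes two exposures.
    """
    exposures = Counter()
    for subspecialty, weeks in schedule:
        exposures[subspecialty] += weeks // 2
    return exposures

# Hypothetical record: one 4-week GI elective, one 2-week cardiology elective
schedule = [("gastroenterology", 4), ("cardiology", 2)]
counts = count_exposures(schedule)
```

From such a tally, both the total number of exposures (the sum of counts) and the number of unique subspecialty exposures (the number of distinct keys) used in the analyses below follow directly.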
Program director medical knowledge score
The ABIM requires internal medicine program directors to submit ratings of their residents’ overall clinical competence and its essential components prior to participation in the ABIM-CE. These evaluations are made on a 9-point Likert scale, with 1–3 considered “unsatisfactory,” 4–6 considered “satisfactory,” and 7–9 considered “superior.” Candidates with program director medical knowledge scores of less than 4 are not eligible to participate in the examination. The MGUH program director medical knowledge scores were available for residents sitting the ABIM-CE in years 2007–2010 and were compared to the corresponding resident ABIM-CE scores.
Descriptive statistics were calculated for sociodemographic characteristics. D’Agostino and Pearson omnibus normality testing was performed to assess data distribution, and associations were analyzed using unpaired t-tests, one-way analysis of variance (ANOVA), the chi-squared test and linear regression using GraphPad Prism version 5.00 for Windows (GraphPad Software, San Diego, CA, USA). P-values less than 0.05 were considered significant. In the five years under study, 7 residents failed the ABIM-CE (5 women, 2 men), giving an overall pass rate of 95%, with a 90% pass rate for women residents and 97% for men (p=0.07, chi-squared test).
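As one illustration of the analyses above, a 2×2 chi-squared test of pass rate by gender can be computed from first principles (the counts below are hypothetical, chosen only to roughly mirror the reported ~90% vs ~97% pass rates; the study itself used GraphPad Prism):

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-squared test (no continuity correction) for the
    2x2 table [[a, b], [c, d]]; returns (statistic, p-value).
    With 1 degree of freedom, the chi-squared survival function
    reduces to erfc(sqrt(x / 2))."""
    n = a + b + c + d
    rows = (a + b, c + d)   # row totals
    cols = (a + c, b + d)   # column totals
    cells = ((a, rows[0], cols[0]), (b, rows[0], cols[1]),
             (c, rows[1], cols[0]), (d, rows[1], cols[1]))
    stat = 0.0
    for observed, row_total, col_total in cells:
        expected = row_total * col_total / n  # expected count under independence
        stat += (observed - expected) ** 2 / expected
    p_value = math.erfc(math.sqrt(stat / 2.0))
    return stat, p_value

# Hypothetical counts: women 45 pass / 5 fail, men 65 pass / 2 fail
stat, p = chi2_2x2(45, 5, 65, 2)
```

With these illustrative counts the difference in pass rates does not reach significance (p > 0.05), consistent in direction with the borderline p=0.07 reported for the actual cohort.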
Of the 152 residents enrolled in the MGUH internal medicine residency who were first-time takers of the ABIM-CE from 2006–2010, 131 had paired elective exposure and ABIM-CE score data available.
Sociodemographic characteristics were comparable for each annual resident cohort (Table 1). There was no significant association between ABIM-CE score and age. While the annual cohorts had similar age distributions, the gender distribution skewed male in 2006–2008 but was more even in 2009–2010. For the whole group, the mean delta-SPS was lower for women than men (mean 117.4 compared to 146.6, p=0.0485, Table 2).
International medical graduates
The number of international medical graduates (IMG) in the MGUH program was small (n=9), but it was similar in all years studied. IMGs did not exhibit significant differences in mean delta-SPS or ABIM-CE pass rate compared to US medical graduates in this program.
Comparison of ABIM-CE score and number of elective exposures
The mean number of elective exposures per resident increased steadily from 14 in 2006 to 19 in 2010. There was no linear correlation between the total number of elective exposures, or the number of unique elective exposures, and ABIM-CE mean delta-SPS (Table 1). Residents with 14 or fewer elective exposures had a higher ABIM-CE mean delta-SPS than those with 15 or more elective exposures (mean delta-SPS 143.4 compared to 129.7, p=0.051), suggesting there may be an “ideal” number of elective exposures that supports improved ABIM-CE performance, but above which performance does not further improve.
Comparison of ABIM-CE score and individual subspecialty elective exposures
ABIM-CE mean delta-SPS (95% CI) for residents exposed compared to unexposed to particular subspecialty electives is shown in Table 3, broken down by annual cohort and for the entire dataset. There was a negative association between cardiology elective exposure and mean delta-SPS (mean 128.6 for exposed residents, compared to mean 170.1 for unexposed, p=0.05). In the 2008 cohort, both pulmonary and rheumatology electives were associated with significantly higher mean delta-SPS. When data from all cohorts were combined, no significant difference was seen in ABIM-CE mean delta-SPS for exposed compared to unexposed residents for any of the individual subspecialty electives.
Comparison of ABIM-CE score and repeated elective exposures
ANOVA comparing ABIM-CE scores in subjects with 0, 1–2 and more than 2 elective exposures in each subspecialty did not show a significant association between repeated elective exposures and improved performance on the ABIM-CE, based on either the total percentage correct or the subspecialty percentage correct (Tables 4 and 5).
Comparison of elective exposure and ABIM-CE subspecialty score
Residents completing the pulmonary elective exhibited higher mean scores on pulmonary questions on the ABIM-CE (mean 78.8% for exposed residents compared to 73.2% for unexposed, p=0.0092, Table 6). No significant associations were seen between other elective exposures and subspecialty percentage correct.
Program director medical knowledge score
The program director medical knowledge score was significantly associated with ABIM-CE mean delta-SPS (r² = 0.365, p<0.001). Despite the association between the program director medical knowledge score and ABIM-CE score, there was no gender discrepancy in the program director medical knowledge score (mean for female residents 7.06, CI 6.65–7.48; mean for male residents 7.20, CI 6.86–7.55; p=0.77).
Resident elective selection depends on numerous factors, including career preference, subspecialty interest, real or perceived quality of subspecialty education, opportunity to secure faculty recommendation letters, desire to improve knowledge in the subspecialty, auditioning for fellowship opportunities, guidance of an advisor or program director, and even ease of the schedule or rigor of the rotation.
Although analysis of the data from 2008 suggested a positive association between exposure to the pulmonary and rheumatology electives and improved ABIM-CE scores, analysis of subspecialty elective exposures for all years combined failed to demonstrate a statistically significant positive relationship between specific elective exposures and ABIM-CE scores. It is possible that the sample size in this study was too small to demonstrate differences in outcomes, simply because the residency program studied has relatively high numbers of elective exposures and high overall pass rates on the ABIM-CE. Another factor that may have contributed to the lack of association between elective exposure and ABIM-CE performance in this study is that, while all major electives in our program have a formal curriculum, there is wide variation in the structure, goals and objectives of the individual electives. Some electives include both inpatient and outpatient experiences, while others focus on only one or the other. The elective experience may vary depending on the teaching experience of the attending physician. Finally, electives vary as to inclusion of pre- and post-testing, required reading lists, and subspecialty conference exposures. Prior studies have shown that formalized elective curricula [11, 12] or inclusion of an elective-specific multiple-choice testing program improves resident performance on standardized tests of medical knowledge.
In our study, participation in the cardiology elective was associated with worse performance on the ABIM-CE. This finding may reflect the nature of this elective as a more service-driven elective, with a focus on repetitive, protocol-driven, hospital-based experience and fewer one-on-one resident-attending interactions. Investigation of the characteristics of specific electives, such as inpatient or outpatient focus, inclusion of reading lists, subspecialty conferences, and pre- and post-testing, was outside the scope of this study but merits further investigation in a larger prospective study.
Our study did not demonstrate a statistically significant relationship between repeated elective exposures in a single specialty and ABIM-CE performance, suggesting that repeated exposure to a particular subspecialty may not offer additional improvement in medical knowledge. Although the population size limited the statistical significance of this finding, the data suggest that there may be a “threshold” number of elective exposures associated with improved ABIM-CE scores, above which further elective experiences do not further improve ABIM-CE performance. This observation is helpful to program directors planning schedules, and it supports recommendations to move residency education from a structure- and process-based “time dependent” system towards a more competency-based program in which the major outcome is knowledge acquisition driven by the learner and assessed using multiple outcome measures.
While there is an increasing push towards self-directed learning and autonomy in the selection of electives, it is well recognized that residents themselves are not always good judges of their own medical knowledge. As in prior studies, our data showed a strong correlation between the program director medical knowledge score and the ABIM-CE score, reinforcing the continued predictive value of program director evaluations and supporting the use of these evaluations in guiding residents throughout residency.
An unexpected finding of this study was the statistically significant negative association between female gender and ABIM-CE performance in this program. In the United States, the number of women applying to medical school is increasing. Women now make up more than 50% of matriculating medical students, and 25% of practicing physicians. A similar negative association between female gender and ABIM-CE performance was reported in a larger study evaluating internal medicine residents’ performance on the IM-ITE and correlating this with quality of life, burnout and educational debt. Our findings suggest that female residents may experience barriers to education during residency that impact ABIM-CE performance, an observation that merits further investigation.
Our study has some limitations that warrant further discussion. The study was conducted in a single, university-based residency program with an ABIM-CE mean pass rate above the national average. It may therefore have been underpowered to show a difference in ABIM-CE score with subspecialty elective exposure. Residents in our program are permitted numerous subspecialty electives throughout the three years of training, and this may have contributed to the absence of detectable difference for high enrollment electives (such as infectious diseases). Furthermore, the observed results may not be widely applicable to training programs with fewer subspecialty elective opportunities.
It should also be noted that we studied only subspecialty exposures and did not assess reasons for elective selection or avoidance. There may be a selection bias in that residents who were interested in a subspecialty may have been more likely to study that subspecialty’s content. Additionally, in our program, residents with poor subspecialty performance on the annual IM-ITE are counseled to participate in an elective in that subspecialty. Family and economic issues may further confound the relationship between ABIM-CE performance and elective exposures. Residents who become parents during residency may time their elective exposures to dovetail with their parental leave, and sleep deprivation and other stresses may thus impact their elective experience. Educational debt may be another confounder in the relationship with ABIM-CE performance. Prior studies have shown that educational debt is associated with lower mean IM-ITE scores. In many internal medicine residencies, the somewhat lighter call schedule during electives affords residents an opportunity to take on additional “moonlighting” shifts. During the time period of this study, our institution had a moonlighting policy in place that permitted residents with high satisfactory evaluations to moonlight while on elective. This may have resulted in residents with higher educational debt taking more electives, or selecting electives with lighter duties, possibly skewing the associations between elective exposures and ABIM-CE performance.
The data from Table 1 show that mean delta-SPS scores declined over the five years under study, while the number of electives taken increased. In addition to the confounders discussed above, ACGME limitations on duty hours may play a role in these observations. The ACGME changed duty hour requirements in July 2003 and July 2011, so all cohorts in the current study fell under the auspices of the 2003 requirements, including: a) 80-hour limits on the resident work week; b) a 30-hour limit on overnight/continuous duty shifts; c) one day in seven (averaged) free of all duties; d) overnight call no more frequent than every third night; and e) “adequate” rest periods. Since data are not available for residents taking the ABIM-CE prior to 2005, it is not possible to compare the cohort under study with a historical cohort preceding the implementation of the 2003 ACGME duty hour requirements. A further change in ACGME duty hours was implemented in 2011, so it would be interesting to see whether this trend towards declining ABIM-CE performance persists in future cohorts.
We did attempt to investigate the effect of the timing of subspecialty elective exposure on IM-ITE scores, in order to compare scores before and after an elective and to help measure knowledge retention throughout residency. However, although residents in our program are expected to participate in this annual exam, some residents were unable to complete it in each of the three years of residency because of personal issues or scheduling conflicts. Due to the paucity of data, we could not evaluate the impact of elective exposure over time in our cohort, but this would be important to study in a larger or prospective cohort.
Residency education is continuously evolving and adapting to the learning environment. With the advent of the Next Accreditation System (NAS), there will be an emphasis on the responsibility of the sponsoring institution to ensure the quality of the learning environment. With this in mind, it is important for residency programs to evaluate the impact of elective exposures on outcomes and to identify innovations that improve the quality of these exposures.
In this small retrospective study of a single university-based internal medicine residency, we did not find positive associations between subspecialty elective participation and ABIM-CE performance. Residents with 14 or fewer elective exposures had a higher ABIM-CE mean delta-SPS than those with 15 or more elective exposures. Repeated subspecialty elective exposures did not correlate with either total or subspecialty score on the ABIM-CE, suggesting that residents should be cautioned against repeated electives in a single specialty, since these are unlikely to further broaden medical knowledge. These data should be of interest to residents planning their elective selections, and to program directors as they adapt resident education to respond to the Next Accreditation System.
Holt KD, Miller RS, Philibert I, Heard JK, Nasca TJ: Residents’ perspectives on the learning environment: data from the Accreditation Council for Graduate Medical Education resident survey. Acad Med. 2010, 85 (3): 512-518. 10.1097/ACM.0b013e3181ccc1db.
Frank JR, Snell LS, Cate OT, et al: Competency-based medical education: theory to practice. Med Teach. 2010, 32 (8): 638-645. 10.3109/0142159X.2010.501190.
Holmboe E, Bowen J, Green M, et al: Reforming internal medicine residency training. a report from the Society of General Internal Medicine’s task force for residency reform. J Gen Intern Med. 2005, 20 (12): 1165-1172. 10.1111/j.1525-1497.2005.0249.x.
Nasca TJ, Philibert I, Brigham T, Flynn TC: The Next GME Accreditation System — Rationale and Benefits. N Engl J Med. 2012, 366 (11): 1051-1056. 10.1056/NEJMsr1200117. Epub 2012 Feb 22
Chaudhry S, Holmboe E, Beasley B: The state of evaluation in internal medicine residency. J Gen Intern Med. 2008, 23 (7): 1010-1015. 10.1007/s11606-008-0578-0.
O’Dell JR: In-training examination in internal medicine. Ann Intern Med. 1995, 122 (1): 73-74.
Holmboe ES, Hawkins RE: Methods for Evaluating the Clinical Competence of Residents in Internal Medicine: A Review. Ann Intern Med. 1998, 129 (1): 42-48.
McDonald FS, Zeger SL, Kolars JC: Associations of Conference Attendance With Internal Medicine In-Training Examination Scores. Mayo Clin Proc. 2008, 83 (4): 449-453. 10.4065/83.4.449.
McDonald F, Zeger S, Kolars J: Factors Associated with Medical Knowledge Acquisition During Internal Medicine Residency. J Gen Intern Med. 2007, 22 (7): 962-968. 10.1007/s11606-007-0206-4.
Haponik EF, Bowton DL, Chin R, et al: Pulmonary Section Development Influences General Medical House Officer Interests and ABIM Certifying Examination Performance. Chest. 1996, 110 (2): 533-538. 10.1378/chest.110.2.533.
O’Dell JR, Klassen L, Moore G: The use of outcome measures to evaluate clinical rheumatology curriculum changes. J Rheumatol. 1993, 20 (6): 1033-1036.
Hellmann D, Flynn J: Development and evaluation of a coordinated, ambulatory rheumatology experience for internal medicine residents. Arthritis Care Res. 1999, 12 (5): 325-330. 10.1002/1529-0131(199910)12:5<325::AID-ART3>3.0.CO;2-O.
Mathis B, Warm E, Schauer D, Holmboe E, Rouan G: A multiple choice testing program coupled with a year-long elective experience is associated with improved performance on the internal medicine in-training examination. J Gen Intern Med. 2011, 26 (11): 1253-1257. 10.1007/s11606-011-1696-7.
Iobst WF, Sherbino J, Cate OT, et al: Competency-based medical education in postgraduate medical education. Medical Teacher. 2010, 32 (8): 651-656. 10.3109/0142159X.2010.500709.
Jones R, Panda M, Desbiens N: Internal medicine residents do not accurately assess their medical knowledge. Advances in Health Sciences Education. 2008, 13 (4): 463-468. 10.1007/s10459-007-9058-2.
Norcini J, Webster G, Grosso L, Blank L, Benson JJ: Ratings of residents’ clinical competence and performance on certification examination. J Med Educ. 1987, 62 (6): 457-462. 10.1097/00001888-198706000-00001.
Hall FR, Mikesell C, Cranston P, Julian E, Elam C: Longitudinal trends in the applicant pool for U.S. medical schools, 1974–1999. Acad Med. 2001, 76 (8): 829-834. 10.1097/00001888-200108000-00017.
Levinson W, Lurie N: When most doctors are women: what lies ahead?. Ann Intern Med. 2004, 141 (6): 471-474.
West CP, Shanafelt TD, Kolars JC: Quality of life, burnout, educational debt, and medical knowledge among internal medicine residents. JAMA. 2011, 306 (9): 952-960. 10.1001/jama.2011.1247.
The pre-publication history for this paper can be accessed here:http://www.biomedcentral.com/1472-6920/12/94/prepub
Dr. Shanmugam is currently supported by award numbers KL2RR031974 and UL1RR031975 from the National Center for Research Resources and R0101388801 from the National Institute of Nursing Research.
The funding agency had no role in the design or conduct of this study.
The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Center for Research Resources, the National Institute of Nursing Research or the National Institutes of Health.
All authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest. Dr. Adams is currently the Program Director of the MedStar Georgetown University Hospital internal medicine residency program.
VKS had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis. Study concept and design: VKS, KT, MA. Acquisition of data: VKS, KT, SM, AS. Statistical analysis: VKS, MM, SD. Analysis and interpretation of data: VKS, KT, MM, SD. Drafting of the manuscript: VKS, KT. Critical revision of the manuscript for important intellectual content: VKS, KT, MA, MM, SD, SM. Administrative, technical or material support: VKS, MA, SM, AS Study supervision: VKS, MA. All authors read and approved the final manuscript.