The American College of Radiology (ACR) in-training examination is widely used as an objective measure of radiology resident performance in North America. The examination provides feedback to the resident and program director and assesses the resident's progress in preparation for the end-of-training examination.
Previous studies have identified the ACR score as an effective predictor of future written board performance [1, 2]. Our study confirms that a similar relationship exists with the RCPSC diagnostic radiology exam. The mean ACR scores of residents who passed their RCPSC radiology board exam were significantly higher than those of residents who failed, and there was a strong correlation between ACR scores and board exam results. The odds ratios for all four years of training were less than 1 and statistically significant; higher ACR scores were therefore associated with a reduced probability of board examination failure.
Not surprisingly, the average ACR score was a better predictor of exam failure than the ACR score of any single year of training, showing the highest correlation with the RCPSC exam result. This finding is consistent with prior studies [1, 2]. Baumgartner and Peterman attributed it to the reduction in variability gained from the resident taking the in-training examination multiple times.
Additionally, residents’ ACR scores in consecutive years were strongly positively correlated, indicating that residents who performed poorly on the ACR examination early in their training also tended to perform poorly on subsequent ACR exams throughout their training.
After confirming the utility of the ACR in-training examination as a predictor of future board examination performance, our objective was to provide program directors with useful tools for giving residents feedback on their current knowledge and study habits. We used receiver operating characteristic (ROC) curves and logistic regression for this purpose.
The concept of using receiver operating characteristic (ROC) curves to predict exam performance was used previously in a similar study performed by our anaesthesia colleagues. However, we used these curves in a slightly different manner. We chose to minimize the false-positive rate (the risk of failing the exam despite a high ACR score), thereby creating a novel threshold ACR score above which there was a negligible risk of failing the board exam. Our threshold scores ranged from the 32nd to the 63rd percentile. These thresholds were higher than the previously reported ACR scores of the 20th percentile or less that predicted poor board performance [1, 2]. This was not surprising, as we were measuring a different parameter. We wanted program directors to be able to provide residents with a realistic goal ACR score to attempt to surpass in each year of their training. This score would not merely predict better-than-poor performance on the boards, but would (at least in theory) essentially eliminate the risk of board exam failure.
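The idea behind this threshold can be sketched in a few lines of code. The data below are purely illustrative (not the study's data), and the function name is our own; the sketch simply finds the lowest score above which no resident in the sample failed, which is the cut-off that drives the observed false-positive rate to zero.

```python
# Hypothetical, illustrative data: (ACR percentile score, failed board exam?).
# These pairs are invented for demonstration and are NOT the study's data.
results = [(12, True), (18, True), (25, False), (31, True),
           (40, False), (55, False), (63, False), (72, False), (88, False)]

def zero_failure_threshold(results):
    """Return the highest ACR score among residents who failed.

    Any score strictly above this value has an observed failure rate of
    zero in the sample, mirroring the ROC-based choice of a threshold
    that minimizes the false-positive rate.
    """
    failing_scores = [score for score, failed in results if failed]
    return max(failing_scores)

print(zero_failure_threshold(results))  # → 31 (highest score among failures)
```

In practice such a threshold would be read off the ROC curve fitted to the full cohort, with the usual caveat that "zero observed failures" in a finite sample does not mean zero true risk.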
Logistic regression allowed us to provide program directors and radiology residents with three more pieces of information to chart training progress. Firstly, we calculated odds ratios for each year of training and used the formula 100 × (1 − odds ratio) to provide residents with the percentage reduction in the odds of failing for each additional percentile point achieved.
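This conversion from an odds ratio to a percentage change in odds is simple arithmetic; the sketch below uses a hypothetical odds ratio of 0.93 per percentile point (an assumed value, not one reported in this study) and reports the change, where a negative value is a reduction.

```python
def pct_change_in_odds(odds_ratio):
    """Percentage change in the odds of failing per additional ACR
    percentile point; negative values indicate a reduction in odds."""
    return 100 * (odds_ratio - 1)

# Hypothetical per-percentile odds ratio (illustrative only):
change = pct_change_in_odds(0.93)
print(round(change, 1))  # → -7.0, i.e. a 7% reduction in the odds of failing
```

Note that this describes the change in the *odds* of failure, not the probability; the two diverge as the baseline risk grows.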
We then used logistic regression curves to plot the probability of board exam failure against the ACR percentile score. These curves allowed estimation of the ACR score at which there was a 50% risk of RCPSC radiology board examination failure (ACR 50). These values were surprisingly low, ranging from the 4th to the 8th percentile. However, taking the upper limit of the 95% confidence interval as a more conservative measure places the range from the 13th to the 23rd percentile, depending on the year of training. This more conservative number could be considered a risk of “poor” board exam performance. Interestingly, this ACR score range is similar to the prior American data, in which poor or failing board exam performance was seen at ACR scores below the 20th percentile [1, 2].
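The ACR 50 falls directly out of the fitted model: the probability of failure is 50% exactly where the log-odds equal zero, so ACR 50 = −intercept/slope. The coefficients below are hypothetical values chosen for illustration, not the study's fitted estimates.

```python
import math

# Hypothetical logistic-regression coefficients (illustrative only):
# log-odds of failure = intercept + slope * ACR_percentile
intercept, slope = 1.2, -0.20

def failure_probability(acr_percentile):
    """Model-predicted probability of board failure at a given ACR percentile."""
    log_odds = intercept + slope * acr_percentile
    return 1 / (1 + math.exp(-log_odds))

# The ACR 50 is the score at which the log-odds are zero (probability = 0.5):
acr50 = -intercept / slope
print(round(acr50, 1))                        # → 6.0
print(round(failure_probability(acr50), 2))   # → 0.5
```

With these assumed coefficients the ACR 50 lands at the 6th percentile; a confidence interval for the ratio −intercept/slope would give the more conservative upper bound discussed above.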
Finally, the logistic regression curves have been provided as easy-to-use board exam prediction models for each year of training. Residents and program directors can estimate the risk of examination failure (with 95% confidence intervals) from the resident's ACR score. It should be noted that the majority of the reduction in the probability of examination failure occurs below an ACR score of approximately 50 (the steep portion of the curve in all four years of training).
The limitations of this study include uncertain applicability to radiology programs with different demographics. We are a medium-sized radiology program of six to eight residents per year, including one to two international medical graduate positions, and we likely place more emphasis on certain aspects of training than other programs do. This limitation could be addressed with a multi-centre study involving multiple Canadian radiology residency programs.
Additionally, we are limited by the data provided to us by the College regarding exam performance. Residents and residency programs receive only a pass or fail result for the entire examination. Absolute scores are not provided, nor are the pass/fail results for the individual components (multiple-choice, OSCE, and oral exams). The ACR examination is written in a multiple-choice format and would presumably be a better predictor of the multiple-choice component of the boards than of the other two. Unfortunately, we are unable to test this hypothesis.
Predicting future exam performance is fraught with uncertainty. Countless confounders can and do make seemingly certain success far less assured. This is particularly true of the oral component of the RCPSC radiology examination, which places additional emphasis on ‘examsmanship’. Despite these limitations, the ACR exam has been shown to be a useful predictor of exam performance [1, 2], and our study agreed with these findings. In addition, we have developed several tools that we hope will help radiology residents with self-assessment and enable program directors to better guide their residents in their board examination preparation.