
Identifying and supporting students at risk of failing the National Medical Licensure Examination in Japan using a predictive pass rate

Abstract

Background

Students who fail the National Medical Licensure Examination (NMLE) pose a major problem from the standpoint of educating healthcare professionals. In the present study, we developed a formula for a predictive pass rate (PPR) that reliably identifies medical students at risk of failing the NMLE in Japan so that adequate academic support can be provided for them.

Methods

Six consecutive cohorts comprising 531 medical students at the Gifu University Graduate School of Medicine between 2012 and 2017 were investigated. Using 7 variables from before admission to medical school and 10 variables from after admission, we developed a prediction formula to obtain the PPR for the NMLE using logistic regression analysis. In a new cohort of 106 medical students in 2018, we applied the PPR formula to confirm its predictive capability and to identify students with a strong likelihood of failing the NMLE.

Results

Medical students who passed the NMLE had the following characteristics: younger age at admission, graduation from a high school located in the surrounding area, and high scores in the graduation examination and in the comprehensive computer-based test provided by the Common Achievement Test Organization in Japan. However, the total score of examinations in pre-clinical medical sciences and the Pre-CC OSCE score in the 4th year did not correlate with the PPR. Ninety-one out of 531 students had a strong likelihood of failing the NMLE between 2012 and 2017, and 33 of these 91 students failed the NMLE. Using the PPR, we predicted that 12 out of 106 students would have a strong likelihood of failing the NMLE. Five of these 12 students actually failed the NMLE.

Conclusions

The PPR can be used to identify medical students with a higher probability of failing the NMLE. This prediction would enable focused support and guidance by faculty members. Prospective and longitudinal studies of larger and different cohorts will be necessary.


Background

In many countries such as the US, Germany, and Japan, medical students need to pass the National Medical Licensure Examination (NMLE) in order to obtain a physician’s license, and students who fail the NMLE pose a major problem from the standpoint of educating healthcare professionals [1,2,3,4,5,6]. Students failing USMLE Step 1 are often delayed from continuing course work, which affects their graduation and increases costs [4]. Failing Step 1 can also affect a student’s ability to enter a residency program and, in some instances, restrict them from applying for residency in specific states [4, 7]. Moreover, even among those who enter a residency program, performance on several specialty board examinations was poorer than that of those who passed the USMLE or the National Board of Medical Examiners (NBME) examinations without failing [8,9,10,11,12]. There is also a concern about repeaters who cannot pass the NMLE and take the examination repeatedly. The pass rate among such repeaters is considerably lower than among first-time takers: 67% vs 96% for USMLE Step 1 in 2017, and 63.9% vs 93.3% for the NMLE in Japan in 2018 [13, 14]. Those who fail once thus seem much more likely to end up as repeaters.

Most studies in the past decade have primarily focused on the outcomes [3, 5] or poor performance [2, 6] of those who fail the USMLE or NMLE. To the best of our knowledge, only two studies, in the US and the Netherlands, have attempted to create models for predicting those who will fail among first-time test takers of Step 1 [4] and in the first year of the undergraduate medical curriculum [1]. However, these studies did not confirm their results by applying the predictors to a new group of students. Previous studies of medical students’ academic success reported that academic performance is associated not only with post-admission variables such as previous academic performance [15, 16] and the Objective Structured Clinical Examination (OSCE) [17], but also with pre-admission variables such as gender [17], age at admission [16], hometown [16, 18], type of high school (HS) [19], HS grade point average (GPA) [20], and the entrance examination for medical school [21].

Therefore, the goal of the current study was to develop a model that can reliably predict those who would fail the NMLE using a prediction formula for the pass rate of the NMLE: the predictive pass rate (PPR). The prediction formula was then applied to a new cohort of medical students to identify students who were at high risk and needed support.

Current situation of medical education and the number of doctors in Japan

Undergraduate medical education in Japanese medical schools is usually 6 years [22, 23], including 4 years of pre-clinical medical sciences and 2 years of clinical training. Graduates from these medical schools can take the NMLE.

In Japan, the total number of medical students enrolled each year is controlled by the government. As a result, entering medical school is highly competitive, and almost all students who pass the entrance examination potentially have the academic ability to pass the NMLE. Furthermore, the number of doctors in Japan has also been controlled by the government based on projections of future demand. The number of medical students per year was kept at around 7600 for two decades (the 1990s and 2000s). However, according to Organization for Economic Co-operation and Development (OECD) data, Japan ranks 28th in the number of practicing doctors among 35 OECD countries [24], and its number of medical students is also the lowest among OECD countries [25]. Since 2008, the number of medical students has gradually increased to about 9400. However, even this effort is not enough, especially in rural areas. Therefore, an increasing number of NMLE failures at rural universities will add to the shortage of physicians and the uneven regional distribution of physicians, which will ultimately affect Japan’s healthcare system.

Methods

Participants

To develop a reliable PPR for the NMLE, six consecutive cohorts of 531 students (2012–2017, 6th academic year) of the Gifu University School of Medicine (GUSM) were included. The GUSM is one of 51 public schools that are largely supported by the Japanese government. The cohorts in each academic year comprised 78, 69, 84, 97, 110, and 93 students, respectively. Through a prospective study, a cohort of 106 students in 2018 was investigated to confirm the PPR by predicting which students would have a strong likelihood of failing the NMLE and providing remediation to them (Table 1). Data from the 637 students in 2012–2018 were obtained after the students were anonymized by the academic affairs office of the GUSM. Ethical approval was granted by the GUSM Ethics Committee. Anonymity and confidentiality were guaranteed (date: 11/30/2016, reference number: 28–333).

Table 1 Characteristics of students in the 2012–2017 and 2018 cohorts

Variables

The dependent variable was “failing to pass the NMLE.” Data were obtained from the department of academic affairs of the GUSM. Explanatory variables included pre-admission variables such as gender, age at admission, location of HS (neighborhood prefectures including Gifu and Aichi, from where about 60% of students enter the GUSM, and distant prefectures including Tokyo, Osaka, and other prefectures), type of HS (public/private), academic level of HS (Table 1), HS GPA (5-grade evaluation), and achievement (percentage of correct answers) in the common entrance examination for university (National Center Test for University Admissions, NCTUA). Post-admission variables were Test of English as a Foreign Language (TOEFL) score, academic performance (percentage) in liberal arts, total score (percentage) in basic sciences in the first year, total score (percentage) in basic biomedical sciences in the second year, total score (percentage) in pre-clinical medical sciences in the third and fourth years, score in the nationwide Computer-Based Testing with Item Response Theory (CBT-IRT; which assesses pre-clinical education in the fourth year), average score (six-point scale) in the Pre-Clinical Clerkship OSCE (Pre-CC OSCE) in the fourth year, achievement (standardized deviation values) in the graduation examination in the sixth year, performance in clinical clerkship during the fifth and sixth years, and whether the student repeated a year between the first and sixth years (Table 1). These data were obtained from the office of academic affairs of the GUSM; for each variable, missing values were replaced by the average calculated from the non-missing values.
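The mean-substitution step described above can be sketched as follows. This is a minimal illustration with invented numbers; the actual variables are those listed in Table 1.

```python
import numpy as np

def impute_with_column_means(scores):
    """Replace missing values (NaN) in each column with the mean of the
    observed (non-missing) values in that column."""
    filled = scores.astype(float).copy()
    col_means = np.nanmean(filled, axis=0)   # per-variable mean, ignoring NaN
    rows, cols = np.where(np.isnan(filled))  # positions of missing entries
    filled[rows, cols] = col_means[cols]
    return filled

# Invented example: 3 students x 2 variables, one missing value
data = np.array([[80.0, np.nan],
                 [70.0, 550.0],
                 [90.0, 650.0]])
filled = impute_with_column_means(data)  # the NaN becomes (550 + 650) / 2 = 600
```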

Data analysis

First, we used Fisher’s exact tests and independent t-tests to compare demographic data before and after attending university between those who failed and those who passed the NMLE in the 2012–2017 cohort. Second, for the 2012–2017 cohort, logistic regression predicting the likelihood of passing the NMLE was used to calculate odds ratios (ORs) and 95% confidence intervals (95% CIs) after simultaneously controlling for potential confounders. Third, we created a prediction formula to obtain the PPR using logistic regression analysis with all possible models. To ensure generality, two models were created using the forced-entry and stepwise methods.
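As a rough sketch of the kind of model fitted in this step: the study used SPSS, so the gradient-descent fit and toy data below are illustrative stand-ins, not the study's actual data, variables, or coefficients.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Fit a pass/fail (1/0) logistic regression by gradient descent on the
    negative log-likelihood; returns [intercept, coefficient, ...]."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))          # predicted pass probability
        w -= lr * Xb.T @ (p - y) / len(y)          # gradient step
    return w

# Toy cohort: one standardized predictor (e.g. a graduation-exam score), invented
X = np.array([[-2.0], [-1.0], [0.0], [1.0], [2.0], [1.5]])
y = np.array([0.0, 0.0, 1.0, 1.0, 1.0, 1.0])       # 1 = passed the NMLE
w = fit_logistic(X, y)
odds_ratio = np.exp(w[1])  # OR per unit increase in the predictor (> 1 here)
```

In this toy data higher scores go with passing, so the fitted coefficient is positive and the odds ratio exceeds 1, mirroring how the paper's Table 2 reports ORs per predictor.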

To confirm the suitability of these formulas, we applied them to the new cohort in 2018 to identify students who had a lower PPR for the NMLE (95% or less; a strong likelihood of failing the NMLE). SPSS ver. 23.0 Japan for Windows (SPSS Inc., Chicago, IL, USA) was used to perform the statistical tests. Two-tailed p-values of < 0.05 were considered significant.

Support to a cohort of graduates in 2018

First, 9 months before the NMLE, we notified all students of these risk factors. Then, 3 months before the NMLE, the group of students at higher risk of failing was referred to the academic affairs committee. The committee members held individual face-to-face interviews, carefully reviewed each student’s motivation and preparedness for the NMLE, gave advice on how to study, assessed their educational environment (i.e., location, time available for study, support from family and/or classmates, economic problems), and encouraged and advised them repeatedly.

Results

Characteristics of those who failed and passed the NMLE in the 2012–2017 cohort

Table 1 shows significant differences in the demographic data and achievements before and after attending university. In terms of pre-admission variables, those who failed the NMLE showed the following characteristics: predominantly male, older at admission, HS in distant prefectures, lower HS GPA, and lower NCTUA score. After attending university, they had significantly lower scores in basic and pre-clinical medical sciences, the CBT-IRT, and the Pre-CC OSCE, showed poorer performance in clinical clerkship, and were more likely to have repeated a year in medical school (Table 1).

Logistic regression analysis

Table 2a and b show the results of logistic regression for variables predicting the likelihood of passing the NMLE. Both results show that medical students who passed the NMLE had the following characteristics: younger age at admission, HS located in Gifu or Aichi Prefecture, higher scores in the CBT-IRT and graduation examination (but not a higher total score in pre-clinical medical sciences), and better performance in clinical clerkship.

Table 2 Logistic regression predicting the likelihood of passing the NMLE

PPR in the NMLE

Given that the PPR in the NMLE is p/100, the logistic regression formula is provided in Additional file 1 in the Supplementary Information section.
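The general shape of such a formula is the logistic transform of a linear predictor. The sketch below uses hypothetical intercept, coefficients, and feature values; the study's actual formula and coefficients are in Additional file 1.

```python
import math

def predictive_pass_rate(intercept, coefficients, features):
    """PPR (%) from a fitted logistic model: p = 100 / (1 + exp(-z)),
    where z is the linear predictor. All numbers here are hypothetical."""
    z = intercept + sum(b * x for b, x in zip(coefficients, features))
    return 100.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for two predictors (e.g. CBT-IRT score and
# graduation-exam deviation value):
ppr = predictive_pass_rate(-6.0, [0.08, 0.05], [80.0, 50.0])  # about 94.8
flagged = ppr <= 95.0  # the study's "strong likelihood of failing" threshold
```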

Ninety-one out of 531 students from 2012 to 2017 had a lower PPR for the NMLE using both methods, and 34 of these 91 students actually failed the NMLE.

Prediction for the graduates in the 2018 cohort and support

Using the above formulas, we predicted which students would likely fail the NMLE and guided them (Table 3a, b). Twelve out of 106 students in 2018 were predicted as having a lower PPR (Table 3a), and they were supported by faculty members. The eleven students predicted as having a lower PPR by the stepwise method (Table 3b) were all included among the 12 students predicted in Table 3a. Seven of the 12 students passed the NMLE after receiving support, and five students failed as predicted. Thus, the pass rate for the NMLE in 2018 was 95.3% (101/106) (national average: 90.1%), a better outcome than the pass rate of 88.2% in 2017 (82/93 students; national average: 88.7%). In both models, all five students who failed were among the predicted high-risk students.

Table 3 Predictive pass rate and the number of students who failed in the 2018 cohort

Discussion

We developed a formula for predicting the pass rate in the NMLE. Using this formula, we evaluated a new cohort of students in 2018 and predicted 12 students who had a higher risk of failing the NMLE. After guidance by faculty members, 7 of the 12 students passed the NMLE.

Predictors for passing the NMLE

We identified four significant internal predictors for passing the NMLE: 1) total score in pre-clinical medical sciences in the third and fourth years, 2) CBT-IRT score in the fourth year, 3) performance in clinical clerkship in the fifth and sixth years, and 4) score in the graduation examination in the sixth year. We also identified two external predictors: age at admission and HS located in surrounding area.

Among these, the CBT is a nationwide computer-based examination administered by the Common Achievement Tests Organization [26] to medical students at all Japanese medical schools before clinical clerkship, to assess the knowledge required for the clerkship. The CBT corresponds to Step 1 of the USMLE, and there are a number of studies on risk factors and outcomes for those who failed Step 1 [2,3,4,5, 27, 28], as well as studies investigating the Step 1 score as a predictor of performance after Step 1 [8,9,10,11, 21, 29,30,31,32]. The latter may correspond to our result for the CBT. Most studies, including that by Koenig et al. [29], have indicated that a high score on Step 1 is a predictor of success in many fields of the medical profession (i.e., internal medicine, dermatology, ophthalmology, orthopedic surgery, gynecology, and family medicine) [8,9,10,11, 30,31,32], with some opposite results [33, 34]. Casey et al. [21] noted that the Medical College Admission Test (MCAT), Step 1 and Step 2 scores, and subsequent clinical performance parameters correlated with NBME scores across all core clinical clerkships. They also emphasized that Step 1 scores identified students at risk of poor performance on NBME subject examinations, facilitating and supporting the implementation of remediation before the clinical years [21]. Accordingly, it is reasonable to assume that the CBT score, which is comparable to Step 1, is one of the predictors of passing the NMLE, which is comparable to the NBME examinations.

In the present study, three additional internal predictors of passing the NMLE were identified: the score in pre-clinical medical sciences, performance in clinical clerkship, and graduation examination scores. The NMLE was taken within 3 months after the clinical clerkship and graduation examination. The logistic regression analyses in our study showed a negative correlation between the score in pre-clinical medical sciences in the third and fourth academic years and passing the NMLE (Table 2). This result conflicts with the general expectation that students with higher academic scores in pre-clinical medical sciences would be more likely to pass the NMLE. Furthermore, the total scores for pre-clinical medical sciences among the students who failed the NMLE in 2012–2017 were in fact lower than those of the students who passed (Table 1). However, when we looked closely at the 36 students who failed, we found that they were older at admission and had better scores in pre-clinical medical sciences but worse performance in the graduation examination. Hence, we hypothesize that older medical students might have insufficient study time (because some have families or need to work part-time), diminished ability to memorize, or burnout due to longer years of schooling and/or working since graduating from high school. Further studies are needed to confirm this hypothesis. The four significant internal predictors of passing the NMLE shown in this study can be used to predict those who may fail the NMLE.

Moreover, the significant external predictors of passing the NMLE were age at admission and an HS located in Gifu or Aichi Prefecture. Using linear regression analysis, Kleshinski et al. [27] identified the following predictors of performance on Step 1 and Step 2: science GPA, the biological science section of the MCAT, college selectivity, race, and age. Furthermore, McDougle et al. [3] reported that the relative risk of first-attempt Step 1 failure for medical school graduates was 3.6 for matriculants aged > 22 years (95%CI: 2.0–6.6, p < 0.0001). Consequently, older medical students have a higher risk of failing Step 1, Step 2, and the NMLE. It is unclear why medical students from a neighborhood HS have a better chance of passing the NMLE, and we found no study on the relationship between the NMLE and the location of the HS or hometown. Given previous studies on academic performance [16, 18], students from neighboring cities/towns might be able to receive various kinds of physical, economic, and psychological support from their families. Further investigation is required.

Predicting NMLE with data in lower grade

Several studies have predicted the performance of medical students on Step 1, primarily focusing on first-time test takers [4, 27, 31, 35, 36]. Determining the characteristics of a student who will fail Step 1 is challenging [4] because it is difficult to create models that predict the failure of first-time test takers given the low number of students who fail in most schools [4, 28]. With this in mind, Coumarbatch et al. attempted to create models to predict those who would fail among first-time test takers using logistic regression analysis in 256 students from the graduating class of 2008 at Wayne University [4]. They found that the year-2 standard score and the MCAT biological science score were significant predictors of failing and concluded that, using internal and external predictors, it is possible to identify students at risk of failing Step 1 [4]. Moreover, they described at-risk groups and current educational intervention strategies. In the current study, the year-2 standard score and MCAT score might correspond to the total score in pre-clinical medical sciences and the NCTUA (Tables 1 and 2); however, the competencies required and the level of difficulty differ between the MCAT and the NCTUA, so it is reasonable that our results from logistic regression analysis were not consistent with theirs [4]. More recently, Baars et al. developed a model for the early and reliable prediction of students who fail to pass the first year of the undergraduate medical curriculum within 2 years of starting [1]. However, we cannot directly compare our results with theirs. In the GUSM, the students who failed the NMLE did not have better or worse scores in liberal arts and basic sciences during their first year of medical school (Tables 1 and 2).

Thus, in the current study, we determined the PPR using several variables that can be obtained easily during medical school and, for the first time, used the PPR to predict students at higher risk of failing the NMLE.

The pass rate in NMLE 2018 after support based on the PPR prediction

Before the current study, faculty members had noticed that some young students with poor performance on the mock examination (ME) passed the actual NMLE, while some older students with good performance on the ME failed, but the reason was unclear. For the new cohort in 2018, we selected students who had lower PPRs for the NMLE (95% or less), indicating a strong likelihood of failing, to confirm the validity of the formula (Table 3). The PPR identified all five students who went on to fail. This result shows that risk analysis based on data such as the PPR can enable effective support from multiple points of view, such as the use of MEs. Further prospective studies are needed in other cultural areas, and the validity of the PPR needs to be confirmed.

Limitations

First, we cannot directly compare the present and previous studies because of differences in the independent variables. Second, our results may be influenced by differences in the selection of medical students and in the medical education system between Japan and other countries. Third, it is unclear whether our results can be applied to other Japanese medical schools because no similar study has been reported and the study period was only 1 year. Therefore, we hope to apply and verify these findings in other Japanese medical schools. Fourth, because the Gifu University Graduate School of Medicine is a public educational institution, we had no choice but to intervene in the group of students at higher risk of failing the NMLE once the risks were identified. As a result, the intervention made this an incomplete experimental model.

Conclusions

This is the first study to demonstrate six significant predictors of passing the NMLE and the possibility of prospectively decreasing the number of students who fail the NMLE using the PPR. Further studies are needed to confirm these results because no similar trial exists.

Availability of data and materials

Our data are not on a data repository because scores and data of students are highly confidential. The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request. Only coded data may be shared.

Abbreviations

AUC: Area Under the Curve

CBT-IRT: Computer-Based Testing with Item Response Theory

GPA: Grade Point Average

GUSM: Gifu University School of Medicine

HS: High School

MCAT: Medical College Admission Test

ME: Mock Examination

NBME: National Board of Medical Examiners

NCTUA: National Center Test for University Admissions

NMLE: National Medical Licensure Examination

OECD: Organization for Economic Co-operation and Development

OR: Odds Ratio

PPR: Predictive Pass Rate

Pre-CC OSCE: Pre-Clinical Clerkship Objective Structured Clinical Examination

TOEFL: Test of English as a Foreign Language

USMLE: United States Medical Licensing Examination

References

  1. Baars G, Stijnen T, Splinter T. A model to predict student failure in the first year of the undergraduate medical curriculum. Health Profess Educ. 2017;3:5–14. https://doi.org/10.1097/ACM.0000000000001938.

  2. Burns ER, Garrett J. Student failures on first-year medical basic science courses and the USMLE step 1: a retrospective study over a 20-year period. Anat Sci Educ. 2015;8:120–5. https://doi.org/10.1002/ase.1462.

  3. McDougle L, Mavis BE, Jeffe DB, et al. Academic and professional career outcomes of medical school graduates who failed USMLE step 1 on the first attempt. Adv Health Sci Educ Theory Pract. 2013;18:279–89. https://doi.org/10.1007/s10459-012-9371-2.

  4. Coumarbatch J, Robinson L, Thomas R, et al. Strategies for identifying students at risk for USMLE step 1 failure. Fam Med. 2010;42:105–10.

  5. Biskobing DM, Lawson SR, Messmer JM, et al. Study of selected outcomes of medical students who fail USMLE step 1. Med Educ Online. 2006;11:4589. https://doi.org/10.3402/meo.v11i.4589.

  6. Kies SM, Freund GG. Medical students who decompress during the M-1 year outperform those who fail and repeat it: a study of M-1 students at the University of Illinois College of Medicine at Urbana-Champaign 1988–2000. BMC Med Educ. 2005;5:18. https://doi.org/10.1186/1472-6920-5-18.

  7. Physician Licensing Service. Medical License Requirements by State. https://physicianlicensing.com/resources/state-requirements/ Accessed June 2, 2020.

  8. Fening K, Vander Horst A, Zirwas M. Correlation of USMLE step 1 scores with performance on dermatology in-training examinations. J Am Acad Dermatol. 2011;64:102–6. https://doi.org/10.1016/j.jaad.2009.12.051.

  9. Swanson DB, Sawhill A, Holtzman KZ, et al. Relationship between performance on part I of the American Board of Orthopaedic Surgery Certifying Examination and Scores on USMLE steps 1 and 2. Acad Med. 2009;84:S21–4. https://doi.org/10.1097/ACM.0b013e3181b37fd2.

  10. McCaskill QE, Kirk JJ, Barata DM, et al. USMLE step 1 scores as a significant predictor of future board passage in pediatrics. Ambul Pediatr. 2007;7:192–5. https://doi.org/10.1016/j.ambp.2007.01.002.

  11. Myles TD, Henderson RC. Medical licensure examination scores: relationship to obstetrics and gynecology examination scores. Obstet Gynecol. 2002;100:955–8.

  12. Case SM, Swanson DB. Validity of NBME part I and part II scores for selection of residents in orthopedic surgery, dermatology, and preventive medicine. Acad Med. 1993;68:S51–6.

  13. USMLE. 2017 Performance data. http://www.usmle.org/performance-data/default.aspx Accessed June 2, 2020.

  14. Japanese Ministry of Health, Labour and Welfare. Performance Data for the 112th National Medical Licensure Examination in Japan (in Japanese). www.mhlw.go.jp/file/05-Shingikai-10803000-Iseikyoku-Ijika/0000197914.pdf Accessed June 2, 2020.

  15. Ferguson E, James D, Madeley L. Factors associated with success in medical school: systematic review of the literature. BMJ. 2002;324(7343):952–7. https://doi.org/10.1136/bmj.324.7343.952.

  16. Nawa N, Numasawa M, Nakagawa M, Sunaga M, Fujiwara T, Tanaka Y, Kinoshita A. Associations between demographic factors and the academic trajectories of medical students in Japan. PLoS One. 2020;15(5):e0233371. https://doi.org/10.1371/journal.pone.0233371.

  17. Cleland JA, Milne A, Sinclair H, Lee AJ. Cohort study on predicting grades: is performance on early MBChB assessments predictive of later undergraduate grades? Med Educ. 2008;42(7):676–83. https://doi.org/10.1111/j.1365-2923.2008.03037.x.

  18. Malau-Aduli BS, O’Connor T, Ray RA, van der Kruk Y, Bellingan M, Teague PA. Risk factors associated with academic difficulty in an Australian regionally located medical school. BMC Med Educ. 2017;17(1):266. https://doi.org/10.1186/s12909-017-1095-9.

  19. Kumwenda B, Cleland JA, Walker K, Lee AJ, Greatrix R. The relationship between school type and academic performance at medical school: a national, multi-cohort study. BMJ Open. 2017;7(8):e016291. https://doi.org/10.1136/bmjopen-2017-016291.

  20. O’Neill LD, Wallstedt B, Eika B, Hartvigsen J. Factors associated with dropout in medical education: a literature review. Med Educ. 2011;45(5):440–54. https://doi.org/10.1111/j.1365-2923.2010.03898.x.

  21. Casey PM, Palmer BA, Thompson GB, et al. Predictors of medical school clerkship performance: a multispecialty longitudinal analysis of standardized examination scores and clinical assessments. BMC Med Educ. 2016;16:128. https://doi.org/10.1186/s12909-016-0652-y.

  22. Suzuki Y, Gibbs T, Fujisaki K. Medical education in Japan: a challenge to the healthcare system. Med Teach. 2008;30(9–10):846–50. https://doi.org/10.1080/01421590802298207.

  23. Imafuku R, Saiki T, Suzuki Y. Developing undergraduate research in Japanese medical education. Council Undergrad Res Q. 2016;37:34–40.

  24. Organization for Economic Co-operation and Development. OECD Data: Doctors. https://data.oecd.org/healthres/doctors.htm Accessed June 2, 2020.

  25. Organization for Economic Co-operation and Development. OECD Data: Medical graduates. https://data.oecd.org/healthres/medical-graduates.htm Accessed June 2, 2020.

  26. Yamada R. Measuring quality of undergraduate education in Japan: comparative perspective in a knowledge based society. Elsevier. 2014:76–8.

  27. Kleshinski J, Khuder SA, Shapiro JI, et al. Impact of preadmission variables on USMLE step 1 and step 2 performance. Adv Health Sci Educ Theory Pract. 2009;14:69–78. https://doi.org/10.1007/s10459-007-9087-x.

  28. Zhao X, Oppler S, Dunleavy D, et al. Validity of four approaches of using repeaters MCAT scores in medical school admissions to predict USMLE step 1 total scores. Acad Med. 2010;85(Suppl):S64–7. https://doi.org/10.1097/ACM.0b013e3181ed38fc.

  29. Koenig JA, Sireci SG, Wiley A. Evaluating the predictive validity of MCAT scores across diverse applicant groups. Acad Med. 1998;73:1095–106.

  30. Myles T, Galvez-Myles R. USMLE step 1 and 2 scores correlate with family medicine clinical and examination scores. Fam Med. 2003;35:510–3.

  31. Crawford CH III, Nyland J, Roberts CS, Johnson JR. Relationship among United States medical licensing step I, orthopedic in-training, subjective clinical performance evaluations, and American Board of Orthopedic Surgery Examination Scores: a 12-year review of an orthopedic surgery residency program. J Surg Educ. 2010;67:71–8. https://doi.org/10.1016/j.jsurg.2009.12.006.

  32. Hemann BA, Durning SJ, Kelly WF, Dong T, Pangaro LN. Referral for competency committee review for poor performance on the internal medicine clerkship is associated with poor performance in internship. Mil Med. 2015;180(suppl):71–6. https://doi.org/10.7205/MILMED-D-14-00575.

  33. Herndon JH, Allan BJ, Dyer G, Jawa A, Zurakowski D. Predictors of success on the American Board of Orthopaedic Surgery Examination. Clin Orthop Relat Res. 2009;467:2436–45.

  34. Johnson GA, Bloom JN, Szczotka-Flynn L, Zauner D, Tomsak RL. A comparative study of resident performance on standardized training examinations and the American board of ophthalmology written examination. Ophthalmology. 2010;117:2435–9. https://doi.org/10.1016/j.ophtha.2010.03.056.

  35. Julian ER. Validity of the medical college admission test for predicting medical school performance. Acad Med. 2005;80:910–7.

  36. Donnon T, Paolucci EO, Violato C. The predictive validity of the MCAT for medical school performance and medical board licensing examinations: a meta-analysis of the published research. Acad Med. 2007;82:100–6. https://doi.org/10.1097/01.ACM.0000249878.25186.b7.


Acknowledgements

We would like to thank the administrators of the academic affairs section of the Gifu University Graduate School of Medicine for helping with the authors’ data acquisition.

Funding

This work was partly supported by JSPS KAKENHI Grant Number 19 K21757.

Author information

Affiliations

Authors

Contributions

All authors have made contributions to the study and the manuscript. TS made the concept and design of the study. KT managed the data acquisition and statistical analysis. KT and TS performed interpretation of data, as well as the drafting manuscript. YS made the supervision of the study design and modified the manuscript. All authors have read and approved the manuscript.

Corresponding author

Correspondence to Koji Tsunekawa.

Ethics declarations

Ethics approval and consent to participate

Ethical approval was granted by the GUSM Ethics Committee. Anonymity and confidentiality were guaranteed (date: 11/30/2016, reference number: 28–333).

Consent for publication

(Not applicable)

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Tsunekawa, K., Suzuki, Y. & Shioiri, T. Identifying and supporting students at risk of failing the National Medical Licensure Examination in Japan using a predictive pass rate. BMC Med Educ 20, 419 (2020). https://doi.org/10.1186/s12909-020-02350-8


Keywords

  • National Medical Licensure Examination
  • Logistic regression analysis
  • Predicting student failure
  • Supporting high-risk students