  • Research article
  • Open access

Difference in the general medicine in-training examination score between community-based hospitals and university hospitals: a cross-sectional study based on 15,188 Japanese resident physicians

Abstract

Background

The general medicine in-training examination (GM-ITE) is designed to objectively evaluate the clinical competencies of postgraduate year (PGY) 1 and 2 residents in Japan. Although total GM-ITE scores tend to be lower for PGY-1 and PGY-2 residents in university hospitals than for those in community-based hospitals, it has not yet been determined in which areas of essential clinical competency the two groups diverge most.

Methods

We conducted a nationwide, multicenter, cross-sectional study in Japan, using the GM-ITE to compare university and community-based hospitals in four areas of basic clinical knowledge: “medical interview and professionalism,” “symptomatology and clinical reasoning,” “physical examination and clinical procedures,” and “disease knowledge.”

Results

We found no significant difference in “medical interview and professionalism” scores between the community-based and university hospital residents. However, significant differences were found in the remaining three areas. Among PGY-1 residents, there was a 1.28-point difference (95% confidence interval: 0.96–1.59) in “physical examination and clinical procedures”; this area alone accounted for approximately half of the difference in total score.

Conclusions

Standardizing junior residency programs and general clinical education in Japan should be promoted and would improve the overall training that residents receive. This is especially needed in categories in which university hospitals scored low, such as “physical examination and clinical procedures.”


Background

In 1968, Japan abolished the system that allowed medical school graduates to train without a medical license and replaced it with a clinical training system that required physicians to undergo ≥2 years of further clinical training even after obtaining their license. In 2004, a new residency system was established, making it obligatory for physicians to undergo ≥2 years of residency with a super-rotation curriculum [1]. In Japan, physicians in postgraduate years (PGY) 1 and 2 under this policy are called junior residents.

Although junior residency has been mandatory since April 2004, the design and operation of these training programs are largely left to the discretion of individual teaching hospitals. No objective outcome measures of clinical training have been established, and the content of the education provided varies between institutions. This lack of standardization results in great disparities in the competency of junior residents [2]. The Japan Institute for Advancement of Medical Education Program (JAMEP) has aimed to resolve these issues and support residency education [3, 4].

The JAMEP is a non-profit organization established in 2005 to enrich Japanese health care through the support of resident education and thereby improve the quality of medical care [2]. Its main activities are the following: (1) hosting continuing professional development lectures for a wide range of physicians, including residents, supervising physicians, and practitioners; (2) providing Web-based platforms where physicians can share their clinical skills, knowledge, and experiences; and (3) conducting the general medicine in-training examination (GM-ITE) for junior residents [3].

The GM-ITE is designed to objectively evaluate the clinical competence of junior residents and was developed by the JAMEP [5]. In line with the early residency objectives of the Japanese Ministry of Health, Labour, and Welfare, the GM-ITE assesses four areas of basic clinical knowledge: “medical interview and professionalism,” “symptomatology and clinical reasoning,” “physical examination and clinical procedures,” and “disease knowledge.” This comprehensive test covers all disciplines, with a focus on primary care [6].

The purpose of this investigation was to compare GM-ITE scores between university and community-based hospitals, whose clinical training environments differ greatly. University hospitals provide more advanced medicine in highly specialized areas than community-based hospitals. Analyzing how these differences affect GM-ITE scores may help to elucidate the distinct training challenges faced by each type of teaching hospital.

We have previously demonstrated that the mean GM-ITE scores of junior residents in university hospitals tend to be lower than those of residents in community-based hospitals [7]. However, that comparison considered only the total GM-ITE score. Because the four domains of the GM-ITE are designed to measure different aspects of clinical competence, the score patterns attributable to the training environment could vary across the assessed areas. In the present investigation, we therefore assessed the sources of the differences in test scores between university and community-based hospitals.

Methods

Study design

This was a nationwide, multicenter, cross-sectional study performed in Japan. We used GM-ITE scores to compare university and community-based hospital residents. This study followed STROBE guidelines.

Study population

The study population included 15,188 junior residents working in 815 medical institutions nationwide who took the GM-ITE for the first time in the three years from 2016 to 2018. Residents with missing data on the clinical training environment survey questionnaire or examinee characteristics were excluded from the analysis. Data for PGY-1 and PGY-2 residents were analyzed separately because differences in training rotations could significantly influence the results.

General medicine in-training examination

The GM-ITE was developed by the JAMEP in 2011 and serves as an objective evaluation of the essential clinical competencies of junior resident physicians [8,9,10]. The GM-ITE score consists of the following four categories: “medical interview and professionalism,” “symptomatology and clinical reasoning,” “physical examination and clinical procedures,” and “disease knowledge.” The GM-ITE is a multiple-choice knowledge test used mainly to assess the experience gained during clinical training. It consisted of 100 multiple-choice questions in 2016 and was shortened to 60 questions in 2017 and 2018. The test items cover the following fields: internal medicine, surgery, emergency medicine, pediatrics, obstetrics and gynecology, and psychiatry. After completing the test, resident physicians receive feedback based on the relative scoring of all participants and an educational explanation of each question. The examination is conducted at the end of the year for PGY-1 and PGY-2 junior resident physicians. In line with the early residency objectives of the Japanese Ministry of Health, Labour, and Welfare, the GM-ITE covers the four areas of basic clinical knowledge mentioned above [6]. Thus, the content and construct validity of the test are well established.

The test questions are created through multiple processes over a year. First, all test questions are prepared by a question-creating committee consisting of 22 experienced physicians. Next, all prepared test questions are revised by a peer review committee consisting of five instructors [11].

We also analyzed all test questions used in the previous year to improve the quality of the GM-ITE. Figure 1 shows the results of this item analysis for a previous test question in the field of “disease knowledge.” The question’s passing rate was 44.5%, and it was judged to be a good question because of the behavior of its distractors: the correct answer, “4,” shows a positive correlation with the overall examinee score, while alternative “3” functions as a distractor (an option more likely to be selected by lower scorers). The contents of the test question and a detailed explanation of the analysis are shown in Additional file 1: Appendix 1.

Fig. 1 The result of the test analysis for the previous GM-ITE question in the field of “disease knowledge.” The vertical axis is the passing rate, and the horizontal axis is the quintile of the total GM-ITE score. The numbers in the figure are the answer choices of the examinees, with “4” as the correct answer.
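To make the item analysis in Fig. 1 concrete, the following is a minimal sketch of how option-selection rates by quintile of total score could be computed. It is not the JAMEP analysis code; the file and column names (item_responses.csv, total_score, answer) are hypothetical placeholders.

```python
# Minimal sketch of a distractor analysis (hypothetical data and column names,
# not JAMEP's actual analysis code).
import pandas as pd

# One row per examinee for a single test item: the examinee's total GM-ITE
# score and the answer choice they selected for this item.
df = pd.read_csv("item_responses.csv")

# Split examinees into quintiles of total score (1 = lowest, 5 = highest).
df["quintile"] = pd.qcut(df["total_score"], 5, labels=[1, 2, 3, 4, 5])

# Proportion of examinees in each quintile selecting each option (rows sum to 1).
option_rates = (
    df.groupby("quintile", observed=True)["answer"]
      .value_counts(normalize=True)
      .unstack(fill_value=0)
)
print(option_rates)
```

In a well-functioning item, the selection rate of the correct option rises across the quintiles, while an effective distractor is chosen more often by lower-scoring examinees, which is the pattern described for option “3” above.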

The main purpose of the GM-ITE is not to make pass/fail judgments but to show clinical training facilities, and individual resident physicians, where they rank nationally and to help them overcome their weaknesses in specific clinical fields. Participating resident physicians receive detailed medical explanations of the exam questions after the examination. They also receive feedback on their relative performance and the proportion of correct responses in each category (“medical interview and professionalism,” “symptomatology and clinical reasoning,” “physical examination and clinical procedures,” and “disease knowledge”) and in each clinical field (internal medicine, surgery, emergency medicine, pediatrics, obstetrics and gynecology, and psychiatry). Residency program directors are informed of the average total scores and the proportion of correct responses of the participating residents in their program.

Data collection

We collected information regarding the clinical training environment from a self-reported questionnaire administered immediately after the residents completed the GM-ITE. It covered GM rotation, emergency department duties per month, the average number of inpatients the resident was in charge of, use of online medical resources, and study time per day. Hospital information (hospital type [university or community-based] and location) was obtained from the Residency Electronic Information System website [12] and the Foundation for the Promotion of Medical Training website [13]. We classified university branch hospitals as university hospitals. Regarding hospital location, the 20 cities designated by the Ministry of Internal Affairs and Communications and the 23 wards of Tokyo were defined as urban cities, and the rest as provincial cities.

Statistical analyses

GM-ITE total scores were compared between the university and community-based hospitals. We estimated a two-level linear model, adjusting for the hospital- and resident-level variables listed in the Data collection section and including a random intercept for each hospital. The year of the GM-ITE was also adjusted for to account for the difference in maximum scores (i.e., 100 in 2016 and 60 in 2017/2018). Because each resident’s GM-ITE total score is divided into four question fields (subcategories), we also modeled the subcategory scores (i.e., four observations per resident) using a three-level linear model with hospital- and resident-level random intercepts and subcategory-level error terms. We adjusted for the same variables as in the two-level model, plus dummy variables for the subcategories and their product terms with the university/community-based hospital type. All analyses were conducted separately for PGY-1 and PGY-2, using restricted (two-level model) and unrestricted (three-level model) maximum likelihood estimation with the Proc Mixed procedure in SAS ver. 9.4 (SAS Institute, Cary, NC).
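As an illustration of the modeling approach only, the sketch below shows how a two-level model of this kind could be fitted with Python's statsmodels rather than SAS Proc Mixed. The data file and column names (total_score, hospital_type, hospital_id, exam_year, and the resident-level covariates) are hypothetical placeholders, not the study's actual variables or code.

```python
# Minimal sketch of a two-level linear model with a random intercept per hospital,
# analogous to the REML analysis described above (not the authors' SAS code).
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data set: one row per resident.
residents = pd.read_csv("gm_ite_residents.csv")

model = smf.mixedlm(
    "total_score ~ hospital_type + C(exam_year) + gm_rotation + er_duties"
    " + inpatients + online_resources + study_time + urban_city",
    data=residents,
    groups=residents["hospital_id"],  # hospital-specific random intercept
)
fit = model.fit(reml=True)  # restricted maximum likelihood, as in the two-level model
print(fit.summary())  # the hospital_type coefficient gives the adjusted score difference
```

The three-level model for the subcategory scores (four observations per resident) would additionally require resident-level random intercepts nested within hospitals, for example via a variance-components formula.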

Results

The total number of individuals who met the eligibility criteria was 15,188, including 7552 PGY-1 and 7636 PGY-2 resident physicians. After excluding those with missing data, we analyzed the data from 7111 PGY-1 and 7154 PGY-2 resident physicians. Table 1 shows that the residents from university hospitals had less experience with emergency department duties, were in charge of fewer inpatients, spent less time studying, and had less experience with GM rotations despite their hospitals having GM departments.

Table 1 Background characteristics of the residents

Table 2 presents the average GM-ITE scores of the PGY-1 and PGY-2 resident physicians; there was no area in which junior residents working in university hospitals scored better than those working in community-based hospitals. The slight fluctuation in point rates (average score divided by maximum score) between the years suggests that the difficulty of the examination was stable.

Table 2 Mean (standard deviation) GM-ITE scores of the university and community-based hospital residents

Table 3 shows the difference in total GM-ITE scores between the university and community-based hospitals. Additional file 1: Appendix Table 1 presents all the parameter estimates of the two-level models. The mean total GM-ITE scores of the PGY-1 and PGY-2 resident physicians were estimated to be 2.52 points (95% confidence interval [CI]: 1.42–3.61) and 1.89 points (95% CI: 0.75–3.03) higher, respectively, in the community-based hospitals than in the university hospitals. These differences in mean scores (approximately 2 to 2.5 points) were similar in magnitude to the hospital-level variation: the estimated standard deviations of the hospital-specific random effects were 2.1 (= √4.46) for PGY-1 and 2.4 (= √5.61) for PGY-2 (Additional file 1: Appendix Table 1); likewise, from the residual estimates, the standard deviation of residents’ scores within each hospital was estimated at 5.5 to 6 points. The scores also correlated positively with study time and GM rotation, as well as with the number of emergency department duties and the average number of inpatients for whom PGY-2 resident physicians were primarily in charge (Additional file 1: Appendix Table 2).

Table 3 Estimated differences in GM-ITE total score and scores in the 4 subcategories (university hospitals vs. community-based hospitals)

Table 3 also shows the differences in subcategory-specific scores, calculated by combining the parameter estimates of the three-level models (Additional file 1: Appendix Table 2). No significant difference in “medical interview and professionalism” scores was found between the community-based and university hospital residents. However, significant differences were found for the three remaining areas. Among PGY-1 residents, there was a 1.28-point difference (95% CI: 0.96–1.59) in “physical examination and clinical procedures”; this area alone accounted for approximately half of the difference in total scores. A difference in this area was also found among the PGY-2 resident physicians, but the estimate was halved to 0.51 points (95% CI: 0.19–0.83), a magnitude similar to those of “symptomatology and clinical reasoning” and “disease knowledge.”

Discussion

This investigation is the first to use JAMEP GM-ITE data to formally evaluate the subcategory scores in addition to the total score, using multilevel models with statistical interactions. The three-level model analysis illustrated the differences in total GM-ITE scores and in the four subcategory scores between junior resident physicians according to hospital type. University hospitals had lower scores in all subcategories than community-based hospitals. The gap was largest for “physical examination and clinical procedures” and negligible for “medical interview and professionalism.”

As previously mentioned, the Japanese Ministry of Health, Labour, and Welfare introduced a mandatory 2-year postgraduate training program with a super-rotation curriculum for graduating medical students in 2004. Simultaneously, a national matching system was established to determine the hospital residency programs best suited for medical students [1]. Japan has approximately 1000 teaching hospitals and 1300 junior residency programs, including both community-based and university hospitals. The matching system pairs candidates for postgraduate clinical training with teaching hospitals that conduct junior residency programs; the match is determined by a computer algorithm based on the preferences of both the candidate and the hospital [14, 15]. An analysis of the matching data showed that community-based hospitals tend to be more popular than university hospitals among medical students in their final (sixth) year [15].

In addition to providing advanced medical care, university hospitals also conduct medical research, including basic, translational, and clinical studies. In comparison, community-based hospitals play a more central role in caring for their local communities, where common diseases are highly prevalent. Thus, community-based hospitals are responsible for providing care to as many patients with common diseases as possible. This is presumably why junior resident physicians working in community-based hospitals scored higher in the GM-ITE, which asks questions based on clinical case scenarios that simulate actual situations.

Issues have been raised regarding the education of junior resident physicians in university and university-affiliated hospitals in Japan. To advance medical care, university hospitals have prioritized increasing the differentiation of specialties within each department [16]. Subspecialization may be efficient in terms of promoting academic advances. However, from the perspective of a generalist who treats patients with a wide range of common diseases, the inability to diagnose and treat illnesses outside one’s specialty is highly disadvantageous. One of the main goals of junior residency is learning diagnostic and treatment skills through appropriate clinical reasoning processes that attend to common symptoms and patients’ conditions in a primary care setting [1]. General physicians who have acquired a broad spectrum of essential clinical competencies, such as clinical ethics, physical examination, clinical procedures, clinical reasoning, and professionalism, play a critical role in the education of junior resident physicians [17, 18].

The three areas with score differences, namely, “symptomatology and clinical reasoning,” “physical examination and clinical procedures,” and “disease knowledge,” are all areas in which knowledge and experience are mainly gained by examining patients at the bedside. “Physical examination and clinical procedures,” in which the score difference was widest in PGY-1, is a distinctive area from the standpoint of resident education because it cannot be acquired only by accumulating knowledge. To acquire sufficient basic clinical skills in physical examination and clinical procedures, resident physicians need consistent bedside teaching from senior doctors. We inferred that junior resident physicians in university hospitals may have fewer opportunities for systematic bedside teaching from senior doctors than those in community-based hospitals.

The number of supervising physicians per junior resident physician is lower in community-based hospitals than in university hospitals, which suggests that junior resident physicians there care for many patients under the same supervising physician. Junior resident physicians in community-based hospitals are therefore ensured more opportunities for coherent bedside guidance than their colleagues at university hospitals. We believe these factors may explain the difference in “physical examination and clinical procedures” scores between junior resident physicians in university and community-based hospitals.

Physical examinations are essential to the diagnostic approach of both resident physicians and practicing physicians. However, a previous report indicated important deficiencies in the physical examination skills of primary care physicians [19]. In addition, the physical examination skills of resident physicians have been shown to be deteriorating [20]. Moreover, self-confidence in performing physical examinations does not necessarily increase at each stage of training [21]. To improve the teaching of physical examination, systematic and innovative education programs that hone physical examination skills must be developed, building on existing pedagogy [22, 23].

Physical examination scores were especially low in university hospital programs, and the opportunity to teach these skills could be enhanced by taking advantage of these hospitals’ large number of supervising physicians. In addition, because university hospitals have several specialty departments, they can offer varied educational programs in physical examination. Another suggestion is to organize a comprehensive educational program on physical examinations and procedures for junior residents in university hospital training programs, with mandatory involvement of all departments and mandatory participation by the junior residents. With such measures, performance in this GM-ITE section could be expected to improve among university hospital programs.

The results of our study showed that resident physicians in university hospitals spent less time studying. From the perspective of clinical training in junior residency, it is possible that less-motivated resident physicians chose hospitals with shorter duty hours [24]. The number of resident physicians in university hospitals tends to be greater than in community-based hospitals, so the workload of each resident physician in a university hospital could be lighter than that of a resident physician in a community-based hospital. Moreover, we previously showed that the amount of study time was significantly associated with the GM-ITE score [7]. Several previous studies have also indicated a relationship between sufficient self-study time and professional development [25, 26].

The generalizability of our overall results is not guaranteed, because medical education systems and the respective roles of university and community hospitals differ between countries. Nevertheless, we believe the results provide medical educators around the world with some important insights. First, the PGY-1 GM-ITE scores in the field of “physical examination and clinical procedures” were extremely low in university hospitals. Attention should be paid to the fact that university hospitals tend to adopt an organ-specific educational system. Immediately after the start of junior residency, supervisors need to see patients at the bedside with the resident physicians and provide consistent, thorough education on physical examination and clinical procedures. Second, when evaluating the relationship between in-training examination scores and the performance of resident physicians, the focus should be on several aspects, such as the four domains we examined, rather than on the total score alone.

This study has several limitations. First, the rotation order differs across departments and between individual junior resident physicians; thus, the scores in the four measured categories may have been affected by the training content the residents received immediately before completing the GM-ITE. Second, the overall number of participants in this investigation was limited. The total number of PGY-1 and PGY-2 junior resident physicians was approximately 18,000, but not all of them participated in the GM-ITE. In Japan, there were 1363 residency programs among 1020 hospitals (907 community and 113 university hospitals, including affiliated hospitals) [15]. Our study integrated three years of GM-ITE data, with data obtained from approximately 500 clinical training facilities each year; in other words, about half of the clinical training facilities in Japan participated in the GM-ITE. It is also worth noting that the number of participants from university hospitals was smaller than that from community-based hospitals, which could have resulted in selection bias.

Third, we did not assess the resident physicians’ baseline clinical skills; medical school experience differed between participants, which could have affected the results. Finally, this investigation did not evaluate the number of supervising physicians in each teaching hospital or the years of clinical experience of their residents. These factors could affect GM-ITE scores. Future prospective studies are needed to confirm the results of this investigation.

Conclusion

In conclusion, the present study focused on differences between community-based and university hospitals across the four subcategories tested in the GM-ITE. Examining the GM-ITE scores by subcategory revealed the largest gap in “physical examination and clinical procedures” and a negligible gap in “medical interview and professionalism.” To standardize the quality of junior residency programs in Japan, clinical education in categories in which university hospitals have low scores, such as “physical examination and clinical procedures,” must be promoted.

Availability of data and materials

The corresponding author will respond to inquiries on the data analyses.

Abbreviations

CI: confidence interval

GM-ITE: general medicine in-training examination

JAMEP: Japan Institute for Advancement of Medical Education Program

PGY: postgraduate year

References

  1. Ministry of Health, Labour and Welfare [Internet]. Available at: https://www.mhlw.go.jp/stf/seisakunitsuite/bunya/kenkou_iryou/iryou/rinsyo/index.html. Accessed January 8, 2021. (in Japanese).

  2. Tokuda Y, Soshi M, Okubo T, Nishizaki Y. Postgraduate medical education in Japan: missed opportunity for learning clinical reasoning. J Gen Fam Med. 2018;19(5):152–3. https://doi.org/10.1002/jgf2.202.

  3. Japan Institute for Advancement of Medical Education Program [Internet]. Available at: https://jamep.or.jp/. Accessed January 8, 2021. (in Japanese).

  4. Shimizu T, Tsugawa Y, Tanoue Y, Konishi R, Nishizaki Y, Kishimoto M, et al. The hospital educational environment and performance of residents in the general medicine in-training examination: a multicenter study in Japan. Int J Gen Med. 2013;6:637–40.

  5. Nishizaki Y, Shinozaki T, Kinoshita K, Shimizu T, Tokuda Y. Awareness of diagnostic error among Japanese residents: a nationwide study. J Gen Intern Med. 2018;33(4):445–8. https://doi.org/10.1007/s11606-017-4248-y.

  6. Ministry of Health, Labour and Welfare [Internet]. Available at: https://www.mhlw.go.jp/topics/bukyoku/isei/rinsyo/keii/030818/030818b.html. Accessed January 8, 2021. (in Japanese).

  7. Nishizaki Y, Shimizu T, Shinozaki T, Okubo T, Yamamoto Y, Konishi R, et al. Impact of general medicine rotation training on the in-training examination scores of 11,244 Japanese resident physicians. BMC Med Educ. 2020;20(1):426. https://doi.org/10.1186/s12909-020-02334-8.

  8. Nishizaki Y, Mizuno A, Shinozaki T, Okubo T, Tsugawa Y, Shimizu T, et al. Letters: educational environments and the improvement of score in the general medicine in-training examination score. J Gen Fam Med. 2017;18(5):312–4. https://doi.org/10.1002/jgf2.57.

  9. Mizuno A, Tsugawa Y, Shimizu T, Nishizaki Y, Okubo T, Tanoue Y, et al. The impact of the hospital volume on the performance of residents on the general medicine in-training examination: a multicenter study in Japan. Intern Med. 2016;55(12):1553–8. https://doi.org/10.2169/internalmedicine.55.6293.

  10. Kinoshita K, Tsugawa Y, Shimizu T, Tanoue Y, Konishi R, Nishizaki Y, et al. Impact of inpatient caseload, emergency department duties, and online learning resource on general medicine in-training examination scores in Japan. Int J Gen Med. 2015;8:355–60. https://doi.org/10.2147/IJGM.S81920.

  11. Ministry of Health, Labour and Welfare [Internet]. Available at: https://www.mhlw.go.jp/stf/newpage_05606.html. Accessed January 8, 2021. (in Japanese).

  12. REIS (Residency Electronic Information System) [Internet]. Available at: https://www.iradis.mhlw.go.jp/. Accessed January 8, 2021. (in Japanese).

  13. The Foundation for Promotion of Medical Training [Internet]. Available at: http://pmet.or.jp/. Accessed January 8, 2021. (in Japanese).

  14. Japan Residency Matching Program [Internet]. Available at: https://www.jrmp.jp/. Accessed January 8, 2021. (in Japanese).

  15. Nishizaki Y, Ueda R, Shinozaki T, Tokuda Y. Hospital characteristics preferred by medical students for their residency programs: a nationwide matching data analysis. J Gen Fam Med. 2020;21(6):242–7. https://doi.org/10.1002/jgf2.370.

  16. Ishizuka T. Specialists in internal medicine and subspecialties. Nihon Naika Gakkai Zasshi. 2008;97(5):1130–4. https://doi.org/10.2169/naika.97.1130. (in Japanese).

  17. Policy statement for general internal medicine fellowships. Society of General Internal Medicine. J Gen Intern Med. 1994;9:513–6.

  18. Ranji SR, Rosenman DJ, Amin AN, Kripalani S. Hospital medicine fellowships: works in progress. Am J Med. 2006;119:72.e1–7.

  19. Paauw DS, Wenrich MD, Curtis JR, Carline JD, Ramsey PG. Ability of primary care physicians to recognize physical findings associated with HIV infection. JAMA. 1995;274(17):1380–2. https://doi.org/10.1001/jama.1995.03530170060033.

  20. Ozuah PO, Dinkevich E. Physical examination skills of US and international medical graduates. JAMA. 2001;286:1021.

  21. Wu EH, Fagan MJ, Reinert SE, Diaz JA. Self-confidence in and perceived utility of the physical examination: a comparison of medical students, residents, and faculty internists. J Gen Intern Med. 2007;22(12):1725–30. https://doi.org/10.1007/s11606-007-0409-8.

  22. McMahon GT, Marina O, Kritek PA, Katz JT. Effect of a physical examination teaching program on the behavior of medical residents. J Gen Intern Med. 2005;20:710–4.

  23. Schwind CJ, Boehler ML, Folse R, Dunnington G, Markwell SJ. Development of physical examination skills in a third-year surgical clerkship. Am J Surg. 2001;181(4):338–40. https://doi.org/10.1016/S0002-9610(01)00573-6.

  24. Nagasaki K, Nishizaki Y, Shinozaki T, Kobayashi H, Tokuda Y. Association between resident duty hours and self-study time among postgraduate medical residents in Japan. JAMA Netw Open. 2021;4(3):e210782. https://doi.org/10.1001/jamanetworkopen.2021.0782.

  25. Bull DA, Stringham JC, Karwande SV, Neumayer LA. Effect of a resident self-study and presentation program on performance on the thoracic surgery in-training examination. Am J Surg. 2001;181(2):142–4. https://doi.org/10.1016/S0002-9610(00)00567-5.

  26. Philip J, Whitten CW, Johnston WE. Independent study and performance on the anesthesiology in-training examination. J Clin Anesth. 2006;18(6):471–3. https://doi.org/10.1016/j.jclinane.2006.01.003.


Acknowledgments

We thank the JAMEP secretariat for their excellent support of our work and Mr. Kento Isogaya and Mr. Yuji Tanaka for the supporting data analysis. The authors thank Enago (www.enago.jp) for the English language review.

Funding

This study did not receive any funding.

Author information

Authors and Affiliations

Authors

Contributions

YN and YT designed the study as a whole and wrote the manuscript. KN and TS 1 contributed to the statistical analyses. TS 2, TO, YY, and RK contributed to data collection. All authors contributed to the work, provided advice on the interpretation of the results, and approved the final manuscript.

Corresponding author

Correspondence to Yuji Nishizaki.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the institutional review board of Mito Kyodo General Hospital, Mito City, Ibaraki, Japan. All the participants gave informed consent under an opt-out agreement.

Consent for publication

Not applicable.

Competing interests

YN received an honorarium from the JAMEP as the GM-ITE project manager. YT and TO were JAMEP directors. TS 2 and YY received honoraria from the JAMEP as exam developers for the GM-ITE. YN, YT, TO, TS 2, and YY were not involved in the analysis.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1 Appendix 1.

An example of a good-quality question and test analysis. Appendix Table 1. Parameter estimates of two-level linear models for the GM-ITE total score. Appendix Table 2. Parameter estimates of three-level linear models for the GM-ITE subcategory scores

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Nishizaki, Y., Nozawa, K., Shinozaki, T. et al. Difference in the general medicine in-training examination score between community-based hospitals and university hospitals: a cross-sectional study based on 15,188 Japanese resident physicians. BMC Med Educ 21, 214 (2021). https://doi.org/10.1186/s12909-021-02649-0
