
MCQs in-training examination scores of the surgical residency program in Thailand: the relationship between medical school vs public health-based training institutions

Abstract

Background

This study aims to investigate and evaluate the category-specific knowledge of surgical residents using an internet-based examination.

Methods

All in-training examinees from Thailand's 32 general surgery residency training institutions participated in the online examination. One hundred fifty multiple choice questions (MCQs) were selected for this examination from a pool of MCQs previously used for board certification examinations. Baseline characteristics of the examinees, including residency year, training institution (medical school-based vs public health-based training institutions), regional location of the institution, overall test score, scores by subcategory, total time to complete the examination, and the length of time the institution had been accredited as a training centre, were collected and analysed.

Results

A total of 613 examinees participated. The mean total scores of first- and third-year residents from public health hospital institutions differed from those of residents with medical school-based training. On average, residents from medical school training institutions scored higher in 4 of the 10 categories. However, residents from institutions with more than 7 years of training experience tended to score higher in the Liver, Biliary and Pancreas category.

Conclusions

The average MCQ examination scores, which reflect the medical knowledge of general surgical residents, did not significantly differ between residents trained at medical schools and those undergoing public health-based training. Additionally, the mean MCQ scores did not differ between high-experience training institutions and recently accredited training centers.


Background

Medical students in both the UK and the USA have been assessed using MCQs in examinations for over 20 years, and MCQs are also commonly used in postgraduate exams [1, 2]. The current debate surrounding MCQs focuses primarily on the pros and cons of the different question types used in these tests [3]. Assessment methods, including the one examined here, have their own strengths and limitations [4]. Past studies have found that evaluating knowledge through MCQs performs well compared with other methods of assessment [4,5,6]. This online, nationwide in-training MCQ surgical examination in Thailand was created by the Royal College of Surgeons of Thailand (RCST). The purpose of the examination is to enable surgical residents to self-evaluate and improve their knowledge in each category, as well as to prepare for the Thai Board of Surgery Qualifying Examination administered by the RCST in the final year of the training program. With the COVID-19 pandemic, the individual computer-based examination platform became especially convenient due to travel restrictions and social distancing policies.

The General Surgery Residency Training Program of Thailand was established in 1980 [7]. The duration of surgical training is 4 years. To achieve board certification, candidates must pass all three parts of the examination: (1) multiple choice questions (MCQs), (2) modified essay questions (MEQs), and (3) an oral examination. The MCQs for this examination were carefully chosen from a well-established pool of questions previously used in board certification exams. These questions were selected by the surgical faculty from 32 general surgery residency training institutions across Thailand. The surgical faculty included both medical school and public health hospital staff. The faculty members involved are recognized experts in their respective fields, with specialized qualifications in 10 general surgery subspecialties. These subspecialties include esophageal, gastric, and small bowel surgery; head and neck surgery and endocrine procedures; hepatobiliary and pancreatic (HPB) surgery; vascular surgery; trauma surgery; plastic surgery and burns management; breast surgery; skin and soft tissue surgery; and colorectal and appendiceal surgery. Additionally, the subspecialties encompass areas outside general surgery, such as neurosurgery, cardiovascular thoracic surgery, urologic surgery, pediatric surgery, and bariatric surgery. The selected MCQs were rigorously validated and accredited by the committees of the Training and Examination Board of Certification in Fellowship of Surgery at the RCST in Bangkok, Thailand [8, 9].

The main surgical training institutions approved by the RCST can be divided into two categories: medical school-based training institutions and public health hospital-based (regional or provincial hospital) training institutions. Medical school-based training institutions tend to have greater educational resources, such as surgical simulation, library access, and more up-to-date research literature, owing to the support system of the university or college. The surgical staff who train general surgical residents there typically have obtained advanced training or subspecialty qualifications beyond general surgery; for example, surgical staff at medical school-based training institutions have acquired specialist training abroad in hepatopancreatic and biliary surgery, colorectal surgery, vascular surgery, endocrine surgery, breast surgery, robotic surgery, oncologic surgery, and other fields. Additionally, medical school-based training institutions frequently hold more academic conferences than public health hospitals, allowing surgical residents to participate in these activities more easily. In contrast, public health hospitals typically handle a higher number of cases and more common surgical diseases, especially in tertiary care centers under the public health system, compared with medical school-based training institutions. The surgical staff who train general surgical residents there usually have extensive experience in caring for patients with general surgical diseases. This difference may lead to variations in experience and medical knowledge between residents trained in medical school-based and public health hospital-based institutions. These training institutions are located throughout Thailand, across the Central, Northern, Eastern, Northeastern, and Southern regions. Some institutions have been accredited to deliver the training program for a long time, while others have only recently been approved. The training institutions with more than seven years of experience in surgical residency training were re-accredited under the World Federation of Medical Education (WFME) standards for general surgery in 2022 by the Training and Examination Board of Certification in Fellowship of Surgery of the RCST. Therefore, for this study, the training institutions were categorized as high-experience training centers and recently accredited training centers based on a seven-year cut-off, which may affect trainees' knowledge and experience [9].

There is significant variation in patient characteristics, resources, and trainers' experience among the 32 institutions that provide general surgery residency training in Thailand. The types of surgical procedures and the level of experience at these institutions are influenced by factors such as the local occurrence of common surgical diseases, endemic diseases, culture, occupation, and the social and economic status of the regions they serve. Trainers' experience and resources tend to be more substantial at medical school-based training institutions, particularly those with more than seven years of experience. As previously mentioned, these differences affect both the knowledge and the surgical skills acquired by residents during their training at each institution [8].

The information collected in the database included the examinee's sex, residency year, training institution, overall test score, subcategory scores, total time to complete the examination, the location of the institution, and the length of time the institution had been accredited as a training center. The data were then divided into two groups to facilitate comparison: examinees from medical schools and those from public health-based training institutions. MCQs are commonly used in postgraduate medical examinations because they are time-efficient, highly reliable, and easily standardized [10]. The main competencies under the outcome-based training system of the WFME in the general surgery curriculum of the RCST are patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice [9]. Medical knowledge is an important part of these core competencies, covering basic, epidemiological, and clinical science as well as the application of this knowledge to surgical patients [11]. The Accreditation Council for Graduate Medical Education (ACGME) Common Program Requirements, as outlined by the University of Maryland Medical Center (UMMC), mandate that every residency/fellowship program ensure its residents acquire medical knowledge; programs are required to define the specific knowledge, skills, and attitudes that surgical residents must demonstrate to show competency in medical knowledge [11]. Well-constructed MCQs enable the evaluation of this competency. However, it can be challenging to design MCQs that effectively assess the application of knowledge and the competence of surgical residents. To evaluate medical knowledge in surgical residency, application-based MCQs are more appropriate for assessing understanding of surgical conditions, decision-making, and critical thinking in surgical patients. Most medical schools and residency training programs use case-based and application-based MCQs as vital components in the evaluation of surgical trainees and practicing surgeons [12]; they are less likely to develop basic recall MCQs, which provide a poor evaluation of medical knowledge and other competencies. The American College of Surgeons created the Surgical Education and Self-Assessment Program (SESAP®), a case-based question bank and premier educational resource promoting excellence and expertise for practicing surgeons in the United States of America (USA) [1]. The RCST has developed an internet-based online examination using case-based and application MCQs for residency self-preparation, designed to evaluate individual knowledge. The examination consists of 150 items across different categories based on body systems. This study aims to determine the relationship between system-based MCQ scores in surgical diseases, which reflect the medical knowledge of trainees, and the type of training institution, either medical school-based or public health hospital-based (regional or provincial). Additionally, the study compares medical knowledge between high-experience training institutions and recently accredited institutions by assessing the impact of a 7-year duration of surgical residency training.

Gap of knowledge

Assessing knowledge through MCQ exams is a reliable method of evaluating performance. During the COVID-19 pandemic, online examinations allowed assessors to conduct assessments without the need for travel or gatherings. The exam results can also be categorized, enabling examinees to review areas where they scored lower and work on improving themselves.

Methods

Data collection

This retrospective cohort study collected data from a database, including the sex of the examinee, residency year, training institution, overall test score, subcategory scores, total time to complete the examination, location of the institution, and the length of time the institution had been accredited as a training center. The independent variables were then categorized to enable data comparisons as follows: [1] medical school-based versus public health hospital-based training institutions; [2] the region of the training center, including the Central, Northern, Eastern, Northeastern, and Southern regions of Thailand; and [3] high-experience versus recently accredited training centers, based on a 7-year cut-off for this study. The primary outcome, or dependent variable, was the mean score of the MCQ examination. Online examination scores were collected from 613 examinees across the 32 training institutions.
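As a purely illustrative aid (not part of the study's actual analysis pipeline), the following Python sketch shows one way such examinee records could be organized and the three independent groupings derived. The column names, example values, and the coding of the seven-year cut-off are assumptions for illustration only, not the actual RCST database schema.

```python
import pandas as pd

# Illustrative examinee records; column names and values are assumptions.
records = pd.DataFrame({
    "sex": ["M", "F", "M"],
    "residency_year": [1, 3, 4],
    "institution_type": ["medical_school", "public_health", "public_health"],  # grouping 1
    "region": ["Central", "Northeastern", "Southern"],                          # grouping 2
    "years_accredited": [12, 4, 9],
    "total_score": [45.9, 52.9, 60.1],
    "exam_time_min": [167.2, 172.8, 171.0],
})

# Grouping 3: high-experience vs. recently accredited centres (7-year cut-off).
records["experience_group"] = records["years_accredited"].apply(
    lambda years: "high_experience" if years > 7 else "recently_accredited"
)

# Mean total score within each comparison group.
print(records.groupby("institution_type")["total_score"].mean())
print(records.groupby("experience_group")["total_score"].mean())
```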

Study participants

General surgery residents in their first to fourth year of training under the RCST curriculum as of May 7th, 2022, were included in the study and participated in the MCQ examination. Non-RCST general surgery residents, such as those from other countries who choose to study at a training institution in Thailand, were not permitted to participate in the MCQ examination.

Selection of multiple-choice questions (MCQs)

Case-based and application MCQs were randomly selected for this examination from a pool of previous questions used in board certification exams. A total of 150 MCQs, each with five answer options, were included. The questions were equally divided into 10 categories: (1) Esophageal, gastric, and small bowel surgery; (2) Head and neck (thyroid and parathyroid) surgery; (3) Hepatobiliary and pancreatic (HPB) surgery; (4) Vascular surgery (peripheral arteries and aorta, venous system); (5) Trauma (abdominal, chest, neuro, and vascular trauma); (6) Plastic surgery and burns; (7) Breast surgery; (8) Skin and soft tissue surgery; (9) Colorectal and appendix surgery; and (10) Non-general surgery subspecialties (neurosurgery, cardiovascular thoracic surgery, urologic surgery, pediatric surgery, and bariatric surgery).

These 10 categories were determined by the committees of the Training and Examination Board of Certification in Fellowship of Surgery of the RCST and are documented in Thailand's surgical residency curriculum under the World Federation of Medical Education (WFME) standards for general surgery in 2022. The MCQs underwent a pilot test, and their internal consistency was verified by the committee. The MCQ examination is an internet-based online assessment designed to evaluate individual knowledge, summarizing the examinee's characteristics, institution, overall score, and categorized scores using a highly reliable computer-based analysis. The total time allocated for the test is approximately 180 min, after which the system automatically logs out. Consequently, all surgical residents' scores were measured under the same conditions and at the same time, minimizing bias.
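As an illustration of how categorized scores could be computed from the 150 items (15 per category), the sketch below tallies percentage-correct scores per category and overall. The function and data layout are hypothetical, and the percentage-correct scoring rule is an assumption inferred from the score ranges reported in the Results, not a description of the RCST examination platform.

```python
from collections import defaultdict

# Hypothetical category labels mirroring the 10 categories listed above.
CATEGORIES = [
    "esophagus_stomach_small_bowel", "head_and_neck", "hpb", "vascular", "trauma",
    "plastic_burn", "breast", "skin_soft_tissue", "colorectal_appendix", "non_general",
]

def category_scores(answers, key, item_category):
    """Return percentage-correct per category and overall.

    answers, key: dicts mapping item id (1..150) -> chosen / correct option.
    item_category: dict mapping item id -> one of CATEGORIES (15 items each).
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for item, cat in item_category.items():
        total[cat] += 1
        if answers.get(item) == key[item]:
            correct[cat] += 1
    per_cat = {cat: 100.0 * correct[cat] / total[cat] for cat in total}
    overall = 100.0 * sum(correct.values()) / sum(total.values())
    return per_cat, overall
```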

Statistical analysis

Descriptive statistics were used to summarize categorical variables as frequencies and percentages. For continuous variables, the mean and standard deviation (SD) were reported if the data were normally distributed; otherwise, the median and interquartile range (IQR) were used. For inferential statistics, Fisher's exact test was applied to compare proportions between two groups, and Student's t-test was used to compare means between the two institution types. If the data were normally distributed, analysis of variance (ANOVA) was employed to assess differences across institutional years and residency years; if not, the Kruskal-Wallis test was used instead. Subgroup analyses were performed to investigate factors that may be associated with the score, such as the duration of training, subject-based category, and residency year. Statistical analyses were conducted using Stata/SE version 16 for Mac (StataCorp, College Station, TX, USA).
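The analyses were performed in Stata; purely for illustration, the sketch below reproduces the same sequence of test choices (Fisher's exact test for proportions, Student's t-test for two-group means, and ANOVA or the Kruskal-Wallis test across several groups depending on normality) using SciPy. The example data, variable names, and the use of the Shapiro-Wilk test as the normality check are assumptions, not details taken from the study.

```python
import numpy as np
from scipy import stats

# Illustrative score vectors for the two institution types (not the study data).
med_school = np.array([45.9, 48.1, 55.6, 60.2])
public_health = np.array([43.7, 47.5, 52.9, 59.8])

# Fisher's exact test on a 2x2 table of counts (e.g., sex by institution type).
odds_ratio, p_fisher = stats.fisher_exact([[200, 120], [190, 103]])

# Student's t-test comparing mean scores between the two institution types.
t_stat, p_ttest = stats.ttest_ind(med_school, public_health)

# Across more than two groups (e.g., regions): ANOVA if normal, Kruskal-Wallis otherwise.
groups = [med_school, public_health, np.array([44.0, 50.2, 53.1, 58.7])]
normal = all(stats.shapiro(g).pvalue > 0.05 for g in groups)  # assumed normality check
stat, p_multi = stats.f_oneway(*groups) if normal else stats.kruskal(*groups)

print(p_fisher, p_ttest, p_multi)
```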

Results

A total of 613 surgical residency examinees completed the examination, with 65% being male. The average overall time to finish the examination did not significantly differ between medical school-based surgical training institutions and public health hospital institutions (169.37 min vs. 170.55 min, p = 0.325). However, when subdivided by residency year, first-year residents from public health hospital training institutions took significantly longer to complete the examination compared to those from medical school training institutions (172.78 min vs. 167.25 min, p = 0.011) (Table 1).

Table 1 Characteristics of online surgical in-training examinees

The mean total scores for first- and third-year residents from public health hospital institutions were significantly lower than those from medical school-based training programs. Specifically, first-year residents had a mean total score of 43.73 compared to 45.89 for medical school residents (p = 0.042), and third-year residents had a mean score of 52.88 compared to 55.58 (p = 0.028) (Table 1).

In the Hepatobiliary and Pancreatic surgery, Vascular surgery (peripheral arteries and aorta, venous system), Breast surgery, and Colorectal and appendix surgery categories, the overall mean scores were significantly higher for residents from medical school institutions.

Table 2 compares each residency year across specific categories. In the first year, residents from medical schools scored higher than those from public health institutions in the vascular category (peripheral arteries and aorta, venous system; mean scores 51.4 vs. 44.75, p = 0.003) and in the colorectal and appendix category (mean scores 39.53 vs. 35.34, p = 0.035). In the head and neck, breast, skin and soft tissue, and subspecialty categories, the scores of residents from public health institutions were not significantly different from those of medical school residents. In the second year of residency, statistically significant differences were observed in the vascular and breast sections (p = 0.01 and p = 0.001, respectively). While other groups did not show statistically significant differences, third-year residents from medical schools scored significantly higher than those from public health institutions in the Hepatobiliary and Pancreatic (HPB) and vascular sections (p = 0.009 and p < 0.001, respectively). Conversely, the public health group scored higher in the esophagus, small bowel, stomach, trauma, and subspecialty sections. Among fourth-year residents preparing for the oral examination, the only significant difference was in the skin and soft tissue category, where public health residents scored higher (67.56 points) than those from medical schools (59.13 points, p = 0.03).

Table 2 Comparison of scores by institution type for each residency year

When divided by the length of time that institutions have been accredited by the RCST (Table 3), there was no significant difference in test duration or overall scores between institutions accredited for more than seven years and those accredited for less than seven years. However, when analyzed by category, residents from institutions accredited for more than seven years tended to score higher in the HPB category. Looking at scores by residency year and category (Table 4), statistically significant differences were found only in the vascular category for first- and third-year residents and in the subspecialty category for third-year residents (p = 0.006, 0.018, and 0.025, respectively).

Table 3 Characteristics by institutional training experience (years of accreditation)
Table 4 Comparison of scores by institutional training experience for each residency year

Finally, regional analysis showed no significant differences in overall performance between residents from different regions of Thailand, though some categories did show regional variations. For instance, third-year residents in the Northeast region scored significantly higher in the Vascular surgery category (66.00 ± 13.94) compared to their peers in other regions (p = 0.009) (Table 5).

Table 5 Comparison of scores by region for each residency year

Discussion

Among the residents from the two types of institutions compared, only 1st-year residents showed a statistically significant difference in the time taken to complete the examination, with those in public health training institutions taking more time than those in medical training institutions. This difference may be attributed to varying approaches to examination preparation. When analyzing overall scores by category, differences were noted in the HPB (Hepatobiliary and Pancreatic) and vascular system sections between the institutions. These differences might be due to residents in medical schools encountering more complex cases and rare diseases compared to those in public health institutions. Furthermore, some public health institutions may lack access to, or have limited availability of, vascular and HPB surgeons, which restricts them to basic surgical procedures and interventions due to limited resources. Consequently, complex major operations and advanced interventions in HPB and vascular surgery may not be performed in these public health institutions.

These findings can also be explained by the fact that most medical schools are organized into specialist divisions that are organ-oriented. As residents rotate through each division, they gain experience by encountering a broader range of cases. In contrast, examinees from public health training institutions scored higher in the esophagus, small bowel, stomach, and trauma sections, as the majority of patients in these institutions fall into these categories. Additionally, medical training institutions in Thailand primarily serve as tertiary care centers, leading to a lower number of in-hospital patients admitted for common surgical conditions such as peptic ulcer perforation, small bowel obstruction, and trauma. The cutoff point used in this study to distinguish between different levels of surgical residency training experience was 7 years, as approved by the RCST. This cutoff was chosen because the MCQ examination period includes several institutions that have been approved for training, with each institution undergoing evaluation every four years.

In this study, there were no specific criteria for passing each residency year. However, individual scores and average scores are collected and used, together with the standard deviations at the national and institutional levels, as a reference for self-evaluation. In-training examination scores have been found to correspond with scores obtained in later board examinations [13,14,15,16]. Therefore, if an examinee does not score well in a particular area, they can focus their development on specific categories to improve themselves.

The study also examined the impact of the length of time an institution has been accredited as a training center. It was observed that institutions with more than seven years of experience tended to score higher in specific categories like HPB surgery, suggesting that the duration and stability of a training program may contribute to better outcomes in certain specialized areas. However, this difference was not universally significant across all categories, indicating that the length of accreditation may not have a substantial impact on overall medical knowledge.

Despite the differences in training environments, the overall conclusion of the study was that medical knowledge, as assessed by the MCQs, did not significantly differ between residents trained in medical school-based and public health-based institutions. This finding highlights the effectiveness of the standardized training and evaluation system accredited by the Royal College of Surgeons of Thailand (RCST), which ensures that all institutions, regardless of their focus or resources, are capable of providing equivalent levels of training and education to their surgical residents. Accreditation by a central organization that is independent of the individual institutions is key to maintaining the quality of the training system across different environments.

One potential next step for this study could be to compare the scores of each individual test-taker across different categories to see if there has been any significant improvement or decline in certain areas. This could include analyzing overall scores as well as scores from specific sub-categories or sections of the exam.

The primary strength of this study is that it represents the first to examine and report on the relationship between institution type and MCQ scores—categorized as either medical school-based or public health hospital-based training institutions—while providing a detailed analysis of the differences between these categories.

Conclusion

The study shows that there is no significant difference in MCQ exam results between residents trained at medical school-based institutions and those at public health-based institutions. Additionally, the location of the training center, whether in the Central, Northern, Eastern, Northeastern, or Southern regions of Thailand, does not affect the overall mean scores, which reflect the medical knowledge of surgical residents. This finding highlights the effectiveness of the standardized training and evaluation system accredited by the Royal College of Surgeons of Thailand (RCST), ensuring that high-quality surgical education is delivered uniformly across diverse settings, regardless of resource levels.

Data availability

No datasets were generated or analysed during the current study.

References

  1. American College of Surgeons (ACS). Surgical Education and Self-Assessment Program (SESAP®) https://www.facs.org/for-medical-professionals/education/tools-and-platforms/sesap-18/ Accessed 20 May 2024.

  2. Evgeniou E, Peter L, Tsironi M, Iyer S. Assessment methods in surgical training in the United Kingdom. J Educ Eval Health Prof. 2013;10:2.


  3. Anderson J. For multiple choice questions. Med Teach. 1979;1(1):37–42.


  4. Moss E. Multiple choice questions: their value as an assessment tool. Curr Opin Anaesthesiol. 2001;14(6):661–6.


  5. Pham H, Trigg M, Wu S, O'Connell A, Harry C, et al. Choosing medical assessments: does the multiple-choice question make the grade? Educ Health. 2018;31(2):65–71.

  6. Norcini JJ, Swanson DB, Grosso LJ, Webster GD. Reliability, validity and efficiency of multiple choice question and patient management problem item formats in assessment of clinical competence. Med Educ. 1985;19(3):238–47.


  7. Royal College of Surgeons of Thailand (RCST). History. https://rcst.or.th/en/history/ Accessed 20 May 2024.

  8. Royal College of Surgeons of Thailand (RCST). Surgical training guideline. In: Royal College of Surgeons of Thailand (RCST). https://www.rcst.or.th/web-upload/filecenter/28-9-60.pdf Accessed 20 May 2024.

  9. Training and Examination Board of Certification in Fellowship of Surgery of the Royal College of Surgeons of Thailand (RCST). World Federation of Medical Education (WFME) of general surgery 2022. https://rcst.or.th/en/training-courses/view.php?_did=30. Accessed 20 May 2024.

  10. Capan Melser M, Steiner-Hofbauer V, Lilaj B, Agis H, Knaus A, Holzinger A. Knowledge, application and how about competence? Qualitative assessment of multiple-choice questions for dental students. Med Educ Online. 2020;25(1):1714199. https://doi.org/10.1080/10872981.2020.1714199.


  11. University of Maryland Medical center (UMMC). The Accreditation Council for Graduate Medical Education (ACGME) competencies. https://www.umms.org/ummc/pros/gme/acgme-competencies/program-improvement. Accessed 20 May 2024.

  12. Chéron M, Ademi M, Kraft F, et al. Case-based learning and multiple choice questioning methods favored by students. BMC Med Educ. 2016;16:41.


  13. McCrary HC, Colbert-Getz JM, Poss WB, Smith BK. A systematic review of the relationship between In-Training examination scores and Specialty Board examination scores. J Grad Med Educ. 2021;13(1):43–57.


  14. Velez DR, Johnson SW, Sticca RP. How to prepare for the American Board of Surgery In-Training examination (ABSITE): a systematic review. J Surg Educ. 2021;78(4):1148-55.

  15. Nomura O, Onishi H, Park YS, Michihata N, Kobayashi T, Kaneko K, et al. Predictors of performance on the pediatric board certification examination. BMC Med Educ. 2021;21(1):122.


  16. McClintock JC, Gravlee GP. Predicting success on the certification examinations of the American Board of Anesthesiology. Anesthesiology. 2010;112(1):212–9.



Acknowledgements

The authors express their gratitude to the Royal College of Surgeons of Thailand and Associate Professor Dr. Narain Chotirosniramit for facilitating the successful completion of the examination. We express our gratitude to the committees responsible for the in-training examination and the committees of Training and Examination Board of Certification in Fellowship of Surgery of the RCST. We extend our appreciation to Mrs. Joan Elizabeth Peagam (Native English Speaker) for her assistance in revising the English language. We are also grateful to the Faculty of Medicine, Chiang Mai University and the Research Group in Surgery, Faculty of Medicine, Thammasat University Hospital, Thammasat University, for providing support and funding for this project.

Funding

The research was funded by the Faculty of Medicine, Chiang Mai University, under grant number 165–2564.

Author information

Authors and Affiliations

Authors

Contributions

CD contributed significantly to the design of the study, analysis of data, and drafting of the manuscript. SO and OH worked together on gathering data, analyzing it, and reviewing the manuscript. NC aided in data collection and the development of tools, as well as participating in residency assessment. TJ contributed to the literature search and data acquisition. All authors reviewed and edited the final manuscript.

Corresponding author

Correspondence to Chagkrit Ditsatham.

Ethics declarations

Ethics approval and consent to participate

This research received ethical approval from the Faculty of Medicine, Chiang Mai University, Chiang Mai, Thailand (Approval Clinical Trial Number ID: 8178/2021). The methods employed in this project adhered strictly to the applicable guidelines and regulations. For the data collection phase, informed consent was waived as the information was gathered from existing databases, and there was no direct contact with any study participants. This exemption was granted by the Research Ethics Committee of the Faculty of Medicine, Chiang Mai University.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Reprints and permissions

About this article


Cite this article

Orrapin, S., Homchan, OU., Chotirosniramit, N. et al. MCQs in-training examination scores of the surgical residency program in Thailand: the relationship between medical school vs public health-based training institutions. BMC Med Educ 24, 1037 (2024). https://doi.org/10.1186/s12909-024-06063-0

