
Matching of advanced undergraduate medical students’ competence profiles with the required competence profiles of their specialty of choice for postgraduate training

Abstract

Background

Matching between undergraduate students and their chosen specialty has implications for their personal job satisfaction and performance as well as for society’s needs regarding health care quality. Knowledge regarding student-specialty fit can help improve students’ decisions and detect potential deficiencies in specific competences. In this study, we compare self-assessed competence profiles of medical students close to graduation with the competence profiles of their specialty of choice for postgraduate training.

Methods

Self-assessed competence profiles were collected with the modified Requirement-Tracking (R-Track) questionnaire from 197 final-year medical students close to graduation in 2022. To determine student-specialty fit, difference scores between students’ self-assessed competences and physicians’ requirements for specific specialties were calculated across the R-Track’s six competence areas “Motivation”, “Personality traits”, “Social interactive competences”, “Mental abilities”, “Psychomotor & multitasking abilities”, and “Sensory abilities”, which were assessed on a 5-point Likert scale (1: “very low” to 5: “very high”). Mean difference scores across competence areas were calculated and compared between specialties with multivariate analysis of variance. Student-specialty fit was also calculated independently of students’ choices.

Results

The competence area “Motivation” scored highest for both students and physicians across specialties. However, students’ scores were lower than physicians’ requirements for “Motivation” as well as for “Personality traits” across all specialties. Difference scores for “Social interactive competences” were either close to zero or showed higher scores for students. A similar competence pattern was identified for internal medicine, general medicine, paediatrics, and gynaecology, with higher-than-required student scores for “Mental abilities”, “Psychomotor & multitasking abilities”, and “Sensory abilities”. All other specialties showed higher physicians’ requirements for at least one of these competence areas. Independent of students’ specialty choice, we found the highest difference score in favour of student scores for general medicine (0.31) and the lowest difference score for internal medicine (-0.02).

Conclusions

Students’ competence profiles overall show a better fit with person-oriented specialties. “Mental abilities”, “Psychomotor & multitasking abilities”, and “Sensory abilities” show higher requirement scores for more technique-oriented specialties. Students interested in such specialties could focus more on basic skill development during undergraduate training, or they will develop these specific skills during residency.


Background

The choice of a medical specialty represents one of the most important decisions during medical education [1], and only few of these choices are changed once made [2, 3]. The importance of this decision is heightened by the fact that person-job fit in the medical profession not only affects personal wellbeing, job satisfaction, and performance; if an unfitting choice is made, it can also affect the overall healthcare system through lower quality of care, physician burnout, or turnover [4]. Ideally, medical students should therefore choose a specialty for which they are optimally qualified and motivated. The factors medical students consider when deciding on a specialty for residency training are numerous and include interest [5], exposure to a certain specialty [6], amount of patient contact [7], work-life balance [8], income [9], and prestige [10]. Identifying these factors and understanding how students make their decisions can help guide them towards choices with good person-job fit. Furthermore, recruiting enough skilled residents is especially important for specialties that lack applicants, e.g., general medicine and paediatrics [11, 12].

Specialties attracting sufficient applications usually opt for accepting students with higher grades in relevant areas or on the basis of other performance-based criteria such as the status of the medical school [13] or further qualifications [14]. Thus far, the factors considered relevant for specialty choice have largely centred on what specialties have to offer and whether this fits students’ needs and preferences. There are, however, also factors relating to the requirements of a specialty and to students’ aptitudes regarding competences [7, 15]. Other aspects such as personality [16] are also taken into account, with certain specialties being associated with specific personality traits [17], personality types [18], or personality characteristics such as empathy [19]. Choosing the specialty that fits a student best, however, hinges on the assumptions students have about specialties, which can deviate from the actual requirements and decisive aspects of a specialty [20]. Exposure to a specialty can improve understanding of its requirements and working conditions [21] and potentially foster interest [22]. However, such impressions remain subjective and therefore do not suffice to match requirements and students’ characteristics efficiently. To test student-specialty fit and use compatibility as a tool to improve decision making towards better fit, students’ self-assessments can be compared with specialty requirement profiles. Such profiles have already been defined for anaesthesiology [23], nephrology [24], and other medical specialties [25].

Competence profiles of specialties demonstrate the differences between specialties and make it possible to match students to their targeted specialty to determine which offers the best fit. On an individual level, this can help students decide which specialty to choose, while on a professional level it facilitates distributing students according to their competences and thus improves job satisfaction [26]. Matching students and specialties could also highlight deficiencies in undergraduate medical training when certain specialties show a low matching rate compared with others. Similar questions have already been investigated for dermatology [27], ophthalmology [28], and psychiatry [29]. This study aims to use specialty competence profiles and students’ self-assessment of competences to evaluate students’ fit with their chosen specialty across a variety of specialties. We further investigate differences between specialties regarding the fit of students’ competences with the specialties’ requirements, thereby potentially identifying educational needs prior to specialty selection that could improve student-specialty fit.

Methods

Study design and participants

Between September and December 2022, final-year medical students from the region of Northern Germany who had participated in an information event on how to apply for residency were offered the opportunity to participate in a digital survey of self-assessed competence profiles. Additionally, sociodemographic data (age and gender) were collected, and participants named their first and second choice of specialty for residency training. Participation was voluntary and anonymous, and all participants provided written informed consent for participation in this study, which was approved by the Ethics Committee of the Chamber of Physicians, Hamburg (PV3649). For data analysis, the competence profiles of the participating students were compared with the competence profiles physicians had provided for their respective specialties in a previous study [25].

Instrument

The Requirement-Tracking questionnaire (R-Track) was used for medical students’ self-assessment of competences. Originally designed for the assessment of airline pilots’ competences [30], the questionnaire was previously adapted for health care professionals [25] and for health care professionals’ self-assessment [31]. Based on established instruments like the Fleishman Job Analysis Survey [32], the R-Track questionnaire aims to assess a broad set of skills and abilities required to successfully fulfil professional tasks. Using a 5-point Likert scale (1: “very low” to 5: “very high”), the R-Track assesses 63 facets of competence, i.e., individual abilities, skills, personality traits, and motivational aspects relevant for successful performance [33], assigned to six areas of competence (“Motivation”, “Personality traits”, “Social interactive competences”, “Mental abilities”, “Psychomotor & multitasking abilities”, “Sensory abilities”). R-Track items per competence area can be obtained from Additional file 1. For the comparison with competence profiles defined by physicians, existing data from a previous study [25] were used. The R-Track items from the expert questionnaire were also assessed on a 5-point Likert scale (1: “very low importance” to 5: “very high importance”). Internal consistency (Cronbach’s α = 0.88) was comparable with previous R-Track assessments.
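To illustrate how Likert-item responses of this kind can be aggregated into competence-area scores and how internal consistency can be checked, the following minimal R sketch uses hypothetical item names and a hypothetical item-to-area mapping (not the actual R-Track items); Cronbach’s α is computed from its standard formula.

```r
# Minimal sketch: aggregating 5-point Likert items into competence-area means
# and checking internal consistency. Item names and the item-to-area mapping
# are hypothetical placeholders, not the actual R-Track content.
responses <- data.frame(
  mot_1  = c(4, 5, 3), mot_2  = c(4, 4, 3),   # items assigned to "Motivation"
  pers_1 = c(3, 4, 4), pers_2 = c(4, 3, 5)    # items assigned to "Personality traits"
)

area_items <- list(
  "Motivation"         = c("mot_1", "mot_2"),
  "Personality traits" = c("pers_1", "pers_2")
)

# Mean score per participant and competence area (one column per area)
area_scores <- sapply(area_items, function(items) rowMeans(responses[items]))

# Cronbach's alpha over all items: k/(k-1) * (1 - sum of item variances / variance of item sums)
k     <- ncol(responses)
alpha <- k / (k - 1) * (1 - sum(apply(responses, 2, var)) / var(rowSums(responses)))

print(round(area_scores, 2))
print(round(alpha, 2))
```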

Data analysis

Data were processed using R version 4.2.2. For analyses regarding different specialties, we included only specialties with sufficient data (i.e., at least n = 7 participants naming a specialty as their first choice, following Kleinmann et al.’s suggestion for job analysis instruments [34]) and with available data from the expert study for the respective specialty [25]. Mean scores and standard deviations for all competence areas were computed for these specialties, as well as average scores across all areas and specialties. Competence area mean scores of students and physicians were compared according to the students’ respective specialty choice, and difference scores were obtained. Difference scores were computed by subtracting physician scores from student scores, resulting in negative scores when physicians’ requirements are higher and positive scores when students’ self-assessments are higher. Additionally, difference scores were computed across all students independent of their specialty choice to determine the best fit overall. To rule out demographic factors influencing difference scores, we ran a multivariate multiple regression model with interaction of the independent variables for all six competence areas. Across all eligible specialties, difference scores were compared for each of the six competence areas using multivariate analysis of variance (MANOVA). To further determine where difference scores deviated between specialties, we used univariate analyses of variance (ANOVA) and applied Bonferroni correction to account for multiple testing. For post-hoc comparisons between specialties, we computed Tukey HSD tests. The general α-level was set at 0.05.
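The following sketch outlines this analysis pipeline in base R under stated assumptions: it presumes two hypothetical data frames, students (one row per participant with the six self-assessed area means, the chosen specialty, age, and gender) and experts (one row per specialty with physicians’ requirement means from the expert study); all variable and column names are placeholders rather than the actual study data.

```r
# Sketch of the analysis pipeline described above (base R only). 'students' and
# 'experts' are hypothetical data frames with placeholder column names, not the
# actual study data.
areas <- c("Motivation", "Personality", "Social", "Mental", "Psychomotor", "Sensory")

# Keep only specialties named as first choice by at least 7 participants
counts   <- table(students$specialty)
students <- students[students$specialty %in% names(counts)[counts >= 7], ]
students$specialty <- factor(students$specialty)

# Difference scores: student self-assessment minus physicians' requirement for the
# student's chosen specialty (negative = requirement higher than self-assessment)
req         <- experts[match(students$specialty, experts$specialty), areas]
diff_scores <- students[, areas] - req

# Demographic check: multivariate multiple regression with age x gender interaction
summary(lm(as.matrix(diff_scores) ~ age * gender, data = students))

# One-way MANOVA of the six difference scores across specialties
summary(manova(as.matrix(diff_scores) ~ specialty, data = students))

# Follow-up univariate ANOVAs, Bonferroni-corrected for the six competence areas
p_raw <- sapply(areas, function(a) {
  anova(lm(diff_scores[[a]] ~ students$specialty))[["Pr(>F)"]][1]
})
p.adjust(p_raw, method = "bonferroni")

# Post-hoc pairwise specialty comparisons (Tukey HSD), e.g., for one area
TukeyHSD(aov(diff_scores[["Mental"]] ~ students$specialty))
```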

Results

Overall, 197 final-year medical students (age = 27.5 ± 4.0 years, male = 30%, female = 70%) participated. Students named 17 different specialties as their individual specialty of choice for residency training. Specialties with n < 7 participants that could not be used for analysis were intensive care medicine, neurology, neurosurgery, occupational medicine, ophthalmology, otolaryngology, psychiatry, and urology. Sufficient data were available for n = 9 specialties, which were included in the analysis: internal medicine (n = 44), general medicine (n = 24), paediatrics (n = 21), anaesthesiology (n = 18), gynaecology (n = 17), surgery (n = 14), orthopaedics (n = 8), dermatology (n = 7), and radiology (n = 7), resulting in a total of 160 participants (age = 27.8 ± 4.2 years, male = 28%, female = 72%). The highest and lowest mean scores across all specialties in both groups (students and expert physicians) were obtained for the competence areas “Motivation” (Mstudents = 3.91 ± 0.54, Mphysicians = 4.36 ± 0.16) and “Sensory abilities” (Mstudents = 3.61 ± 0.58, Mphysicians = 3.51 ± 0.22), respectively. Multiple regression showed no significant effect of gender or age on difference scores for any competence area except “Personality traits” (b = 0.029, p = 0.005), where difference scores increased with age, indicating better personality fit with increasing age. Means for both groups and difference scores between students and physicians for the included specialties are displayed in Table 1.

Table 1 Means and difference scores across competence areas and specialties

All difference scores are displayed in Fig. 1. Difference scores for “Motivation” and “Personality traits” were negative (meaning lower scores in the student group) across all specialties, ranging from -0.11 (general medicine) to -0.84 (radiology) with an average difference score of -0.44, and from -0.03 (paediatrics) to -0.47 (radiology) with an average of -0.22, respectively. Additionally, internal medicine, general medicine, paediatrics, and gynaecology all show a similar pattern (referred to as pattern 1), with difference scores for “Social interactive competences” being either negative (internal medicine) or close to zero (general medicine, paediatrics, and gynaecology) and difference scores for the remaining areas (“Mental abilities”, “Psychomotor & multitasking abilities”, and “Sensory abilities”) being positive (meaning higher scores in the student group). Surgery and orthopaedics deviate only slightly from this pattern, with negative difference scores for “Psychomotor & multitasking abilities”, i.e., lower scores in the student group (pattern 2). All patterns are visualized in Fig. 2.

Fig. 1 Mean difference scores between students and physicians across specialties

Fig. 2 Display of difference score patterns across competence areas and specialties

Anaesthesiology, dermatology, and radiology show different patterns. Anaesthesiology shares some aspects with the other specialties (“Social interactive competences” close to zero and a positive score for “Mental abilities” as in pattern 1, as well as a negative difference score for “Psychomotor & multitasking abilities” as in pattern 2), but deviates from both patterns with a negative difference score for “Sensory abilities”, a feature it shares only with dermatology. Dermatology deviates from all previously mentioned patterns with negative difference scores for “Mental abilities”, “Psychomotor & multitasking abilities”, and “Sensory abilities”. While anaesthesiology and dermatology, like all other specialties, show difference scores close to zero for “Social interactive competences”, as in pattern 1, the difference score for radiology (-0.43) is not close to zero. Furthermore, radiology is one of only two specialties (next to dermatology) with a negative difference score for “Mental abilities”, while all other specialties show positive scores.

The one-way MANOVA showed a statistically significant difference between the specialties on the combined competence areas, F(8, 151) = 2.67, p < 0.001, partial η² = 0.12. Post-hoc univariate ANOVAs showed statistically significant differences between the specialties for the competence areas “Mental abilities”, “Psychomotor & multitasking abilities”, and “Sensory abilities”. Post-hoc comparisons with Tukey HSD tests revealed significant differences only for “Mental abilities” and “Psychomotor & multitasking abilities”, but not for “Sensory abilities”. For “Mental abilities”, the differences between paediatrics and dermatology (p = 0.011), general medicine (p = 0.003), internal medicine (p = 0.043), and radiology (p = 0.043) were significant, with difference scores for paediatrics tending towards higher positive values (higher scores for students). For “Psychomotor & multitasking abilities”, the differences between dermatology and all other specialties except anaesthesiology, orthopaedics, and surgery were significant, with higher negative difference scores (lower scores for students) for dermatology.

All 197 participants combined showed the following competence expressions for the six competence areas: “Motivation” = 3.91 ± 0.54, “Personality traits” = 3.74 ± 0.36, “Social interactive competences” = 3.78 ± 0.35, “Mental abilities” = 3.86 ± 0.52, “Psychomotor & multitasking abilities” = 3.69 ± 0.75, and “Sensory abilities” = 3.61 ± 0.58. Computing mean difference scores across competence areas for the specialties independent of students’ choices revealed the lowest difference score for internal medicine (-0.02). The difference score for general medicine (0.31) was the most positive, with a higher student score than physician requirement score. Difference scores in all competence areas for these two specialties are shown in Fig. 3.

Fig. 3 Mean difference scores per competence area for internal medicine and general medicine between physicians and all students independent of their specialty choice

Discussion

The goal of this study was to determine student-specialty fit and to compare medical specialties regarding the overlap between students’ self-assessments and physicians’ requirements across competence areas for specialty training. For all specialties, physicians’ requirements regarding “Motivation” and “Personality traits” were higher than the respective students’ self-assessments. These high expectations of practicing physicians are in line with developments in the medical profession where non-technical or knowledge-based qualities and skills are becoming increasingly important for good clinical practice [35, 36]. Students recognized factors like enthusiasm and commitment to a specialty as relevant [37] and rated competences other than clinical knowledge and skills as important for the medical profession [38]. In the present study, students’ “Motivation” scores were consistently lower than physicians’ requirements in that area, but were still the highest of all students’ competence area scores, which emphasizes students’ motivation for their respective specialty of choice. Even though students are aware that non-technical qualities and skills are important across specialties, undergraduate medical curricula should further foster the development of these skills so that students reach the level required for the transition to postgraduate training [39].

“Social interactive competences” were rated higher by students than required by physicians of most specialties, or the differences between students’ and physicians’ ratings were close to zero. Students thus seem to have sufficiently developed social competences, another non-technical skill, by the end of their undergraduate training. Developing and improving interpersonal communication skills is widely implemented in undergraduate medical curricula [40]. Furthermore, “Social interactive competences” are regarded as basic competences by undergraduate medical students, especially the skill of structuring information in communication [20]. In our study, requirements for “Social interactive competences” are met at the end of undergraduate training for most specialties except radiology, internal medicine, and dermatology. In these specialties, physicians rate the requirements for “Social interactive competences” higher than the personal competence level assessed by students who wish to choose these specialties for postgraduate training. For radiology residents, for example, courses are offered in postgraduate training to improve their oral presentation skills [41, 42].

The three competence areas “Mental abilities”, “Psychomotor & multitasking abilities”, and “Sensory abilities” showed lower physician scores than student scores for internal medicine, general medicine, paediatrics, and gynaecology (pattern 1). This indicates that students near graduation who choose one of these specialties fulfil or even exceed the requirements in these more technical and knowledge-based competence areas. Our findings for all pattern 1 specialties match their categorization as person-oriented, whereas the other specialties included in this study can be regarded as more technique-oriented [43, 44]. The results thus show that students’ scores at the end of undergraduate training are sufficient for technical and knowledge-based competences when a person-oriented specialty is chosen. Physicians of most technique-oriented specialties expect higher scores in at least one of the technical competence areas. Surgical skills (in orthopaedics and surgery) as well as specific psychomotor skill applications and monitoring processes (in dermatology and anaesthesiology) could account for higher expected scores in “Psychomotor & multitasking abilities”. Higher requirements for “Sensory abilities” in anaesthesiology possibly relate to the perception of specific auditory and visual cues during monitoring [45]. Radiologists’ requirements for “Mental abilities” potentially exceed students’ self-assessments due to the extensive use of imagery in the field, which is practiced in postgraduate education [46].

Regardless of students’ specialty choices, we found the lowest difference score for internal medicine (-0.02), while the difference score for general medicine (0.31) was the highest in favour of student scores. This implies that undergraduate training overall seems to provide students with good person-oriented skills, while specific skills for technique-oriented specialties are covered less or have to be acquired during postgraduate training in such specialties [47]. This is also evident in the greater variance of physicians’ scores across specialties, while students’ scores varied less. The differences identified when fitting students to different specialties also provide evidence that some specialty-specific skills and qualities are acquired during residency training and medical practice [48, 49]. Adapting the undergraduate curriculum towards the enhancement of non-technical qualities for all students, while offering more differentiated content regarding specific skills-based competences for students with an interest in more technique-oriented specialties, could improve students’ fit with their specialty of choice.

With a total of 197 student assessments, sufficient data were available for the analysis of nine specialties, allowing comparisons between specialties, which is a strength of this study. Only for the competence area “Personality traits” did difference scores increase with age, reflecting the effect of personality maturation [50]. Contrasting different specialties is an important step when determining student-specialty fit. Computing difference scores allows comparisons between the specialties with an underlying metric expressing the fit between students and physicians. However, although both student and physician scores are on a 5-point Likert scale, self-assessment of competences and requirement assessment of competences differ in their respective perspectives. While students were asked to assess their own competences compared to other undergraduate students, physicians were asked to rate the statements according to their relevance in their respective field. Therefore, the difference scores are potentially biased by methodological aspects, which is a limitation of our study. Moreover, self-assessment is generally biased and does not necessarily reflect actual competence [51, 52]. Higher or lower student scores can thus also be attributed to inaccurate self-assessments. Also limiting the interpretation of results, difference scores resulted from subtracting physician scores from student scores and were thus not expressed as absolute values. Since negative and positive scores were included, a mean difference score around zero can indicate either low differences overall or large differences in opposite directions across competence areas (for example, differences of +0.4 and -0.4 in two areas average out to zero). However, we chose this representation because absolute difference scores would have limited the interpretation of good and bad fit, since students exceeding physicians’ requirements would not have been discernible. Some physician scores are higher than others despite similar requirements (e.g., technique-oriented specialties), inducing a bias towards higher difference scores for these specialties. It remains unclear what drives these high scores, since no qualitative assessment of potential influencing factors was included in the study. This limits the interpretation of results, since high difference scores are potentially biased, and it encourages future research to assess the differences between specialties that might account for these as yet unexplained score differences.

Despite these limitations, matching their competences with the required competences for their specialty of choice can provide students with the opportunity to customise their learning with respect to specialty-specific requirements. Self-assessment of competences cannot replace medical educators’ assessment of students’ competences. However, when items are well chosen, students’ self-assessment can become more realistic [53]. Therefore, self-assessment with the R-Track could be used longitudinally, e.g., starting in year four of undergraduate training, for students to identify their current competence profile and match it with the required profiles of specialties of interest. If, for example, a student who wishes to eventually choose surgery for residency training notices in year four that he or she is lacking competences in the area of “Psychomotor & multitasking abilities”, the student could plan his or her further studies with a focus on improving facets of competence from this area, e.g., in electives. On the other hand, if a student worked to reach the required competence profile for surgery but ends up with a dermatology match, he or she will be able to use the R-Track to identify the competence areas where his or her individual profile does not match dermatology requirements and can focus on improving the competences needed in these areas during postgraduate training. As self-assessment is an important feature of life-long learning in medicine, the R-Track can provide guidance for undergraduate students to identify competence areas for improvement in order to reach a good match with the competence profile of a specialty they wish to choose for residency training. If residents have to work in a specialty for which their R-Track profile does not match the required profile, they can easily identify the competence areas they need to focus on during postgraduate training to reach a better match for their specialty.

Conclusions

Comparing students’ competence profiles with the required competence profiles of specialties they are interested in for residency training can provide students with new insights into their quality of fit. Overall, students show a better fit with specialties that are person-oriented. “Motivation” and “Personality traits” are important competence areas for all specialties and seem to need a more prominent focus in undergraduate training. “Mental abilities”, “Psychomotor & multitasking abilities”, and “Sensory abilities” show difference scores in favour of physicians’ requirements for more technique-oriented specialties. This highlights the need for a focus on basic skill development in undergraduate training for students interested in such specialties, or suggests that some specific skills will be developed during residency. Future studies should aim to assess the competences needed for a good specialty fit from the perspective of both students and educators involved in undergraduate and residency programs.

Availability of data and materials

All data and materials are available from the manuscript.

Abbreviations

ANOVA: Analysis of variance

MANOVA: Multivariate analysis of variance

R-Track questionnaire: Requirement-Tracking questionnaire

Tukey HSD test: Tukey honestly significant difference test

References

  1. Lachish S, Goldacre MJ, Lambert TW. Views of UK doctors in training on the timing of choosing a clinical specialty: quantitative and qualitative analysis of surveys 3 years after graduation. Postgrad Med J. 2018;94(1117):621–6.

  2. Birck S, Gedrose B, Robra BP, Schmidt A, Schultz JH, Stosch C, et al. Stability of long-term professional objectives of young physicians during postgraduate training. Results of a multicenter cohort study. Dtsch Med Wochenschr. 2014;139(43):2173–7 ([Article in German]).

  3. Lambert TW, Davidson JM, Evans J, Goldacre MJ. Doctors’ reasons for rejecting initial choices of specialties as long-term careers. Med Educ. 2003;37(4):312–8.

  4. Xiao Y, Dong M, Shi C, Zeng W, Shao Z, Xie H, Li G. Person-environment fit and medical professionals’ job satisfaction, turnover intention, and professional efficacy: a cross-sectional study in Shanghai. PLoS ONE. 2021;16(4):e0250693.

  5. Yen AJ, Webb EM, Jordan EJ, Kallianos K, Naeger DM. The stability of factors influencing the choice of medical specialty among medical students and postgraduate radiology trainees. J Am Coll Radiol. 2018;15(6):886–91.

  6. Pianosi K, Bethune C, Hurley K. Medical student career choice: a qualitative study of fourth-year medical students at Memorial University, Newfoundland. CMAJ Open. 2016;4(2):E147–52.

  7. Cleland J, Johnston PW, French FH, Needham G. Associations between medical school and career preferences in year 1 medical students in Scotland. Med Educ. 2012;46(5):473–84.

  8. Thornton J, Esposto F. How important are economic factors in choice of medical specialty? Health Econ. 2003;12(1):67–73.

  9. Pisaniello MS, Asahina AT, Bacchi S, Wagner M, Perry SW, Wong ML, Licinio J. Effect of medical student debt on mental health, academic performance and specialty choice: a systematic review. BMJ Open. 2019;9(7):e029980.

  10. Creed PA, Searle J, Rogers ME. Medical specialty prestige and lifestyle preferences for medical students. Soc Sci Med. 2010;71(6):1084–8.

  11. Bennett KL, Philips JP. Finding, recruiting, and sustaining the future primary care physician workforce: a new theoretical model of specialty choice process. Acad Med. 2010;85(10 Suppl):S81–8.

  12. Mallett P, Thompson A, Bourke T. Addressing recruitment and retention in paediatrics: a pipeline to a brighter future. Arch Dis Child Educ Pract Ed. 2022;107(1):57–63.

  13. Fujihashi A, Yang LC, Haynes W, Patel OU, Burge K, Yadav I, Van Wagoner N, McCleskey B. Evaluating the impact of pass/fail United States medical licensing examination step 1 scoring on pathology residency selection. Acad Pathol. 2023;10(2):100083.

  14. Leahy J, Jo JJ, Steidl W, Appel J. Assessing the competitiveness of medical humanities research on psychiatry, otolaryngology, and ophthalmology residency program applications. Med Educ Online. 2023;28(1):2212929.

  15. Yang Y, Li J, Wu X, Wang J, Li W, Zhu Y, et al. Factors influencing subspecialty choice among medical students: a systematic review and meta-analysis. BMJ Open. 2019;9:e022097.

  16. Borges NJ, Savickas ML. Personality and medical specialty choice: a literature review and integration. J Career Assess. 2002;10(3):362–80.

  17. Mullola S, Hakulinen C, Presseau J, de Gimeno Ruiz Porras D, Jokela M, Hintsa T, Elovainio M. Personality traits and career choices among physicians in Finland: employment sector, clinical patient contact, specialty and change of specialty. BMC Med Educ. 2018;18(1):52.

  18. Sievert M, Zwir I, Cloninger KM, Lester N, Rozsa S, Cloninger CR. The influence of temperament and character profiles on specialty choice and well-being in medical residents. PeerJ. 2016;4:e2319.

  19. Santos MA, Grosseman S, Morelli TC, Giuliano IC, Erdmann TR. Empathy differences by gender and specialty preference in medical students: a study in Brazil. Int J Med Educ. 2016;7:149–53.

  20. Zelesniack E, Oubaid V, Harendza S. Advanced undergraduate medical students’ perceptions of basic medical competences and specific competences for different medical specialties – a qualitative study. BMC Med Educ. 2022;22(1):590.

  21. Querido SJ, de Rond MEJ, Wigersma L, Ten Cate O. Some residents drop out of specialty training. How important is prior clinical experience? A survey among residents in the Netherlands. GMS J Med Educ. 2023;40(1):Doc5.

  22. Williams GC, Saizow R, Ross L, Deci EL. Motivation underlying career choice for internal medicine and surgery. Soc Sci Med. 1997;45(11):1705–13.

  23. Gassner SG, Oubaid V, Hampe W, Kubitz JC. Personality traits in anesthesiology: results from a questionnaire-based requirement analysis. Anaesthesist. 2020;69(11):803–9 ([Article in German]).

  24. Harendza S, Kim WC, Oubaid V. Requirement analysis for nephrologists in hospitals and private practice. Der Nephrologe. 2019;14:159–63 ([Article in German]).

  25. Zelesniack E, Oubaid V, Harendza S. Defining competence profiles of different medical specialties with the requirement-tracking questionnaire – a pilot study to provide a framework for medical students’ choice of postgraduate training. BMC Med Educ. 2021;21(1):46.

  26. Borges NJ, Gibson DD, Karnani RM. Job satisfaction of physicians with congruent versus incongruent specialty choice. Eval Health Prof. 2005;28(4):400–13.

  27. Ulman CA, Binder SB, Borges NJ. Assessment of medical students’ proficiency in dermatology: are medical students adequately prepared to diagnose and treat common dermatologic conditions in the United States? J Educ Eval Health Prof. 2015;12:18.

  28. Esparaz ES, Binder SB, Borges NJ. How prepared are medical students to diagnose and manage common ocular conditions. J Educ Eval Health Prof. 2014;11:29.

  29. Balon R, Morreale MK, Coverdale J, Guerrero APS, Aggarwal R, Louie AK, Beresin EV, Brenner AM. Medical students who do not match to psychiatry: what should they do, and what should we do? Acad Psychiatry. 2020;44(5):519–22.

  30. Oubaid V. Der Faktor Mensch. Berlin: MWV-Verlag; 2019. [Book in German]

  31. Zelesniack E, Oubaid V, Harendza S. Final-year medical students’ competence profiles according to the modified requirement tracking questionnaire. BMC Med Educ. 2021;21(1):319.

  32. Fleishman EA, Reilly ME. Fleishman Job Analysis Survey. Administrator Guide. Potomac, MD: Management Research Institute; 1995.

  33. Ten Cate O, Snell L, Carraccio C. Medical competence: the interplay between individual ability and the health care environment. Med Teach. 2010;32(8):669–75.

  34. Kleinmann M, Manzey D, Schumacher S, Fleishman EA. F-JAS Fleishman Job Analyse System für eigenschaftsbezogene Anforderungsanalysen. Hogrefe; 2010 ([in German]).

  35. Shrank WH, Reed VA, Jernstedt GC. Fostering professionalism in medical education: a call for improved assessment and meaningful incentives. J Gen Intern Med. 2004;19(8):887–92.

  36. Mi M, Wu L, Zhang Y, Wu W. Integration of arts and humanities in medicine to develop well-rounded physicians: the roles of health sciences librarians. J Med Libr Assoc. 2022;110(2):247–52.

  37. Smith F, Lambert TW, Goldacre MJ. Factors influencing junior doctors’ choices of future specialty: trends over time and demographics based on results from UK national surveys. J R Soc Med. 2015;108(10):396–405.

  38. Rademakers JJDJM, De Rooy N, Ten Cate OTJ. Senior medical students appraisal of competencies. Med Educ. 2007;41(10):990–4.

  39. Rabinowitz DG. On the arts and humanities in medical education. Philos Ethics Humanit Med. 2021;16(1):4.

  40. Gilligan C, Powell M, Lynagh MC, Ward BM, Lonsdale C, Harvey P, et al. Interventions for improving medical students’ interpersonal communication in medical consultations. Cochrane Database Syst Rev. 2021;2(2):CD012418.

  41. Rockall AG, Justich C, Helbich T, Vilgrain V. Patient communication in radiology: moving up the agenda. Eur J Radiol. 2022;155:110464.

  42. Pino-Postigo A, Domínguez-Pinos D, Lorenzo-Alvarez R, Pavía-Molina J, Ruiz-Gómez MJ, Sendra-Portero F. Improving oral presentation skills for radiology residents through clinical session meetings in the virtual world Second Life. Int J Environ Res Public Health. 2023;20(6):4738.

  43. Manuel RS, Borges NJ, Jones BJ. Person-oriented versus technique-oriented specialties: early preferences and eventual choice. Med Educ Online. 2009;14:4.

  44. Borges NJ, Richard GV. Using the Delphi method to classify medical specialties. Career Dev Q. 2018;66:85–90.

  45. Fioratou E, Flin R, Glavin R, Patey R. Beyond monitoring: distributed situation awareness in anaesthesia. Br J Anaesth. 2010;105(1):83–90.

  46. Chatterjee A, Szasz T, Munakami M, Karademir I, Yusufishag MS, Martens S, et al. An interactive app with multi-parametric MRI – whole-mount histology correlation for enhanced prostate MRI training of radiology residents. Acad Radiol. 2023. https://doi.org/10.1016/j.acra.2023.04.001.

  47. Doyen B, Vlerick P, Maertens H, Vermassen F, Van Herzeele I. Non-technical attributes and surgical experience: a cross-sectional study comparing communication styles and attitudes in surgical staff, trainees and applicants. Int J Surg. 2019;63:83–9.

  48. Green ML, Aagaard EM, Caverzagie KJ, Chick DA, Holmboe E, Kane G, Smith CD, Iobst W. Charting the road to competence: developmental milestones for internal medicine residency training. J Grad Med Educ. 2009;1(1):5–20.

  49. Kirkman MA. Deliberate practice, domain-specific expertise, and implications for surgical education in current climes. J Surg Educ. 2013;70(3):309–17.

  50. Damian RI, Spengler M, Sutu A, Roberts BW. Sixteen going on sixty-six: a longitudinal study of personality stability and change across 50 years. J Pers Soc Psychol. 2019;117(3):674–95.

  51. Blanch-Hartigan D. Medical students’ self-assessment of performance: results from three meta-analyses. Patient Educ Couns. 2011;84(1):3–9.

  52. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296(9):1094–102.

  53. Bußenius L, Harendza S. Development of an instrument for medical students’ self-assessment of facets of competence for patient-centred care. Patient Educ Couns. 2023;115:107926 ([Epub ahead of print]).

Acknowledgements

We would like to thank all medical school students who participated in this study.

Funding

The Joachim Herz Stiftung supported this work. The funding body played no role in the design of the study, in the collection, analysis, and interpretation of data, or in writing the manuscript.

Author information

Contributions

All authors designed and performed the study. SH recruited the participants and SP and SH coordinated the study and the data acquisition. LJ performed the analyses and interpreted the results with VO and SH. LJ drafted the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Sigrid Harendza.

Ethics declarations

Ethics approval and consent to participate

The study was performed in accordance with the Declaration of Helsinki and the Ethics Committee of the Chamber of Physicians, Hamburg, approved this study and confirmed its innocuousness (PV3649). Participation was voluntary and all participants provided informed written consent for participation in this study. All data were anonymized.

Consent for publication

Not applicable.

Competing interests

SH has a position as Section Editorial Board Member to BMC Medical Education. LJ, SP, and VO have no competing interests.

Supplementary Information

Additional file 1: Competence areas and items of R-Track.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Jebram, L., Prediger, S., Oubaid, V. et al. Matching of advanced undergraduate medical students’ competence profiles with the required competence profiles of their specialty of choice for postgraduate training. BMC Med Educ 23, 647 (2023). https://doi.org/10.1186/s12909-023-04632-3
