Study on a job competence evaluation system for resident physicians (including integrated postgraduates) receiving standardized training

Abstract

Background

Standardized training for resident physicians is the primary form of postgraduate medical education, and it plays a pivotal role in healthcare safety and industry stability. Currently, it has garnered significant attention from healthcare institutions.

Methods

A comprehensive literature review was conducted to draft the indicators, followed by a Delphi consultation in June 2022 to which 40 experts in clinical medicine, public health, and other related fields in China were invited. The indicators were adjusted according to the results of the consultation, and the final indicator weights were determined through an analytic hierarchy process.

Results

The response rate was 100%, and the expert authority coefficient was 0.879. The consistency among the experts on the tertiary indicators, as measured by Kendall’s W, was 0.675 (χ2 = 42.516, p < 0.001). Based on the results of the expert consultation, a job competence evaluation system for resident physicians (including integrated postgraduates) receiving standardized training was established, which included 6 primary indicators, 18 secondary indicators, and 116 tertiary indicators. The weights for the primary indicators, namely professional quality, skills and knowledge, patient care, communication and collaboration, teaching skills, and lifelong learning, were 0.313, 0.248, 0.181, 0.083, 0.066, and 0.110, respectively. The top three secondary indicators in terms of combined weights were clinical skills (0.122), professional ethics (0.120), and professional dedication (0.109). The three tertiary indicators with the highest scores were “maintains collaboration with peers and colleagues in patient treatment,” “has clinical thinking skills, makes diagnosis and treatment decisions based on analysis of evidence, and has the ability to administer suitable treatments,” and “abides by laws and discipline and refuses to seek personal gains in medical practice”; their combined weights were 0.035, 0.028, and 0.027, respectively.

Conclusion

This study has established a concrete, objective, and quantifiable competency assessment index system for standardized training of clinical resident physicians (including integrated postgraduates). This system provides a foundation for the quantitative evaluation of the competency of clinical resident physicians (including integrated postgraduates) undergoing standardized training.

Background

The standardized training of resident physicians is an essential part of postgraduate medical education; it is important because it (1) equips the physicians with the necessary medical knowledge and clinical skills to undertake diagnostic and treatment tasks in a standardized and independent manner, (2) improves the quality of medical care, and (3) safeguards public health [1]. In November 2014, the Chinese Ministry of Education, in collaboration with six other departments, including the National Health and Family Planning Commission and the State Administration of Traditional Chinese Medicine, issued the Opinions on Deepening the Reform of Clinical Medical Talent Training through Medical-Educational Collaboration (Education Research [2014] No. 2). This directive explicitly called for the standardization of resident physician training. Beginning in 2015, all newly enrolled clinical medical master’s degree graduate students, who also participate in standardized residency training, are required to undergo clinical training in accordance with the nationally established standardized requirements for resident physicians [2]. However, due to various factors, there remains no clearly defined integrated model for the convergence of standardized resident physician training and professional master’s degree graduate education. Furthermore, the lack of quantifiable assessment and evaluation criteria persists. Consequently, the development and mastery of a systematic training evaluation indicator system carries profound significance for advancing healthcare reform in China.

In 1973, renowned psychologist David McClelland at Harvard University introduced one of the earliest concepts of competency. He defined it as underlying characteristics that distinguish individuals with exceptional performance from their peers in a specific job. These characteristics may encompass motivation, personality traits, self-image, attitudes, values, domain-specific knowledge, cognitive or behavioral skills, and more. Notably, these traits significantly differentiate individuals with outstanding and average performance, a concept known as the “competency model” or “competency-based model” [3]. In the 1980s, the concept of competency was applied to the study of clinical physicians. The goal was to outline the competencies required by qualified resident physicians. This approach, known as competency-based medical education, offers a detailed description of the competency requirements for resident physicians, enhancing the standardization of their training [4].

Some Western countries, following extensive research, have established national physician competency frameworks and competency-based education and training (CBET) models. Competency serves as the foundation for medical education, training, and assessments [5]. Subsequently, pertinent governmental bodies, institutions, and professional associations in developed countries embarked on extensive and in-depth studies of this educational model, seeking to establish national competency frameworks. For instance, in 1998, the Accreditation Council for Graduate Medical Education (ACGME) in the United States introduced a competency framework for clinical physicians, which included domains such as professionalism, patient care, medical knowledge, communication skills, practice-based learning, and system-based practice. This model functions as the foundational framework for postgraduate education and enjoys widespread implementation. To assess specialist physician competencies more effectively, the ACGME and the American Board of Medical Specialties (ABMS) developed the Toolbox of Assessment Methods based on their competency model [6]. This toolbox utilizes systematic and scientific evaluation methods to assess competency levels. In 2005, the Royal College of Physicians and Surgeons of Canada (RCPSC) introduced the CanMEDS Competency Framework, which categorizes the competencies required by specialist physicians into seven roles: medical expert, communicator, collaborator, manager, health advocate, scholar, and professional. Each role is detailed in terms of the fundamental knowledge, skills, and abilities indispensable for specialist physicians [7]. Furthermore, the RCPSC also developed the CanMEDS Assessment Tools Handbook to evaluate the competency levels of specialist physicians in each of these roles [8]. In 2010, the International Association for Medical Education published a report in The Lancet on the future of medical education in the 21st century. This report highlighted the third generation of medical education reform, which focuses on clinical competency as the foundation for medical education and talent development. This model has become a global trend in the development of medical education [9, 10].

The Chinese residency training system was initiated relatively late. Currently, the standardized training for resident physicians focuses on job competency. However, the assessment indicators for job competency are relatively generic compared to international standards. Therefore, how to further develop the content of competency is an important issue that needs to be addressed urgently [11]. In 2014, the Notice of the National Health and Family Planning Commission on the Issuance of the Management Measures for Standardized Residency Training for Resident Physicians (Trial) (National Health Science and Education Development [2014] No. 49) outlined the objective of standardized residency training for resident physicians: to nurture a high-caliber clinical physician workforce with a fundamental focus on cultivating job competency [12]. In 2018, a collaborative effort by the Chinese Elite Teaching Hospitals Alliance for Resident Physician Training yielded the Alliance Consensus on Core Competencies for Resident Physicians, aiming to establish a core competency framework aligned with the unique Chinese context. This consensus introduced six core competencies that resident physicians should possess: professionalism, knowledge and skills, patient care, communication and collaboration, teaching abilities, and lifelong learning [11]. Furthermore, this framework presented 3 to 4 sub-requirements for each competency indicator. However, when it comes to assessing job competency in clinical resident physicians (including integrated postgraduates), only general principles have been established, lacking specific third-level indicators and their corresponding weights. Therefore, there is a compelling need to refine the evaluation system for the job competency of resident physicians. This refinement will not only facilitate a more comprehensive appraisal of their job competency but also contribute to the standardization and enhancement of residency training practices. With this in mind, the objective of this paper is to use the Delphi expert consultation method to develop a set of assessment indicators for clinical resident physicians (including integrated postgraduates) [13,14,15]. In addition, we will employ the Analytic Hierarchy Process (AHP) to determine the relative weights of these indicators. The ultimate aim is to provide a solid foundation for the quantitative and optimized assessment of job competency in clinical resident physicians (including integrated postgraduates). This research holds theoretical significance and practical relevance for the advancement of the medical workforce, the elevation of healthcare standards, and the progress of medical education and healthcare reform.

Objectives and methods

Development of a draft version of the job competence evaluation indicators

This study undertook an extensive literature review spanning the years from 1990 to 2022 to investigate the job competency of clinical resident physicians. The research encompassed a wide array of domestic and international databases, including but not limited to Wanfang, CNKI, CQVIP, PubMed, Medline, and Web of Science. Our search employed a set of strategic keywords such as “Job competence”, “Resident physician”, “Standardized training”, “Professional degree graduate”, “Integrated education”, and others. Initially, the literature screening process involved a rigorous assessment of abstracts to exclude irrelevant and duplicate publications, alongside merging similar sources. Concurrently, the selection of relevant indicators was meticulously conducted, alongside a thorough analysis of the literature on physician competency in their respective roles. Furthermore, this study was significantly informed by a variety of sources, which collectively contributed to the formulation of the initial draft of the evaluation indicator system framework. These sources included the Core Competency Framework for Physicians in the United States [16], Canada’s Competency Framework for Physicians [7], the Physician Competency Framework outlined in “Good Medical Practice” by the General Medical Council (GMC) of the United Kingdom in 2006 [17], the Competency Framework for Clinical Physicians developed by China Medical University [18], and the Core Competency Framework Consensus for Resident Physicians released by the Chinese Elite Teaching Hospitals Alliance for Resident Physician Training.

Finalization of the job competence evaluation indicators based on expert consultation

Development of a questionnaire on expert consultation

A questionnaire was developed composed of the following parts: (1) an introduction to the background, purpose, and significance of the consultation, and a guide for filling out the questionnaire; (2) a survey gathering basic information about the respondents, such as their gender, age, specialties, years of work, title, and educational background; and (3) a form to collect the respondents’ expert opinions on the importance of each of the evaluation indicators for job competence. These opinions were collected through the use of 5-point Likert scales, where 1 = not important at all, 2 = of little importance, 3 = of average importance, 4 = very important, and 5 = absolutely essential [19]. The questionnaire also had an open field where the respondents could add additional comments and suggest changes to the evaluation indicators [20]. The indicators were then screened according to the following criteria: (1) mean importance score > 4.0, (2) coefficient of variation < 0.25, and (3) full-score ratio > 20% [21].
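The following Python sketch, which is not part of the original study, illustrates how these three screening criteria could be applied to the Likert ratings for a single indicator; the example ratings and the function name are hypothetical.

```python
import numpy as np

def screen_indicator(ratings, mean_cutoff=4.0, cv_cutoff=0.25, full_score_cutoff=0.20):
    """Apply the three screening criteria to one indicator.

    ratings: 5-point Likert importance scores (1-5) from the consulted experts.
    Returns True if the indicator is retained.
    """
    ratings = np.asarray(ratings, dtype=float)
    mean = ratings.mean()
    cv = ratings.std(ddof=1) / mean               # coefficient of variation
    full_score_ratio = np.mean(ratings == 5)      # share of experts giving 5/5
    return mean > mean_cutoff and cv < cv_cutoff and full_score_ratio > full_score_cutoff

# Hypothetical ratings from 40 experts for one tertiary indicator
example = [5] * 25 + [4] * 13 + [3] * 2
print(screen_indicator(example))  # True: mean ≈ 4.58, CV ≈ 0.13, full-score ratio = 0.625
```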

Determination of consulting experts

Experts were chosen for their familiarity with the characteristics of job competence, and were recruited using purposive sampling. To ensure geographical representation, experts were chosen from major tertiary grade A hospitals in Guangdong Province, and hospitals, universities, and research institutions in Hainan and Beijing. The selection criteria were as follows: (1) voluntary participation, (2) > 5 years of experience in the relevant fields, (3) with at least a bachelor’s degree and an intermediate or higher professional title, and (4) with comprehensive knowledge and understanding of the relevant issues. The final roster was composed of 40 experts.

Expert consultation

In June 2022, the questionnaires were distributed to the experts and completed through the online survey platform Wenjuanxing. The experts did not meet with each other and were unaware of each other’s responses. After collecting the answered questionnaires, the research group calculated the mean importance scores, coefficient of variation (CV), and full-score ratio of each indicator; they also used the expert opinions to decide whether to revise any indicators. In the first round of expert consultation in this study, the coefficient of concordance was 0.675 (χ2 = 2348.979, p < 0.001), with coefficients of variation ranging from 0.060 to 0.234. The experts agreed on the importance of the indicators, and their opinions showed good concordance; therefore, there was no need for a second round of expert consultation [22, 23]. Furthermore, based on the criteria for indicator selection, the experts did not suggest deleting any indicators, and the overall number of evaluation indicator items and the framework remained unchanged. However, two modifications to the wording of tertiary indicator names were suggested. For “Know the basic elements of the medical profession,” experts recommended changing “Know” to “Know, be familiar with, and master.” For “Reasonably accept criticism from patients, family members and colleagues,” experts suggested changing “Reasonably accept” to “Make self-adjustment and self-improvement based on.” We made the corresponding modifications as suggested, resulting in the final competency evaluation indicator system for standardized training of clinical resident physicians (including integrated postgraduates). This system comprises six primary indicators, eighteen secondary indicators, and one hundred and sixteen tertiary indicators.
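For readers who wish to reproduce the concordance statistics, the sketch below shows one way to compute Kendall’s W and the associated χ2 test from an experts-by-indicators rating matrix. It is a minimal illustration that omits the tie correction (ties are common with 5-point ratings, so a tie-corrected formula, as implemented in standard statistical packages, would give a somewhat different value), and the rating matrix is randomly generated rather than taken from the study.

```python
import numpy as np
from scipy.stats import chi2, rankdata

def kendalls_w(ratings):
    """Kendall's coefficient of concordance W (no tie correction).

    ratings: (m experts) x (n indicators) matrix of importance scores.
    Returns (W, chi-square statistic, p-value).
    """
    ratings = np.asarray(ratings, dtype=float)
    m, n = ratings.shape
    ranks = np.apply_along_axis(rankdata, 1, ratings)   # rank the indicators within each expert
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()     # squared deviations of rank sums
    w = 12.0 * s / (m ** 2 * (n ** 3 - n))
    chi_sq = m * (n - 1) * w                            # chi-square approximation, df = n - 1
    p = chi2.sf(chi_sq, df=n - 1)
    return w, chi_sq, p

# Hypothetical matrix: 40 experts rating 6 indicators on a 5-point scale
rng = np.random.default_rng(0)
scores = rng.integers(3, 6, size=(40, 6))
print(kendalls_w(scores))
```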

AHP analysis for determining the weight of the indicators at all levels

AHP was formally introduced and developed by the American operations researcher Saaty in the mid-1970s. It is a suitable method of analysis because it is a multi-criteria decision-making technique that allows both subjective and objective parameters to be considered in the decision-making process. The hierarchical process makes it possible to study the problem as a whole while paying attention to the interactions between components within the hierarchy. In this method, each decision-making problem can be arranged in a tree framework, in which the levels of the tree comprise the goal, the criteria for achieving the goal, the sub-criteria, and finally the options under study. By breaking the problem into decision-making levels, the analysis can focus on smaller sets of decisions [24, 25]. The AHP analysis was conducted as follows.

(1) Establishing a hierarchical model: Drawing from Fariba Hassani’s research [25], a hierarchical model was formulated, comprising the overall goal level, the benchmark level, and the alternative level. The overall goal level represents the decision objectives, the benchmark level serves as the intermediary, and the alternative level signifies the candidate indicators. In this study, the competency evaluation indicator system for standardized training of clinical resident physicians (including integrated postgraduates) was categorized into three levels: the goal level for the research purpose, the benchmark level for the primary and secondary evaluation indicators, and the alternative level for the tertiary evaluation indicators.

(2) Constructing judgment matrices: Judgment matrices blend qualitative and quantitative analysis. Based on the 9-level ratio scale developed by Saaty (as outlined in Table 1), every indicator within each level was compared pairwise to generate the relative importance judgment matrix for that level [26]. The Saaty scale in this study was established from the differences between the average importance scores assigned by the experts to the indicators. Specifically, assuming Zi and Zj represent the average importance scores of any two indicators being compared, a judgment matrix is constructed as follows: if Zi - Zj = 0, Zi and Zj are equally important, and the Saaty scale is set to 1; if 0.13 < Zi - Zj < 0.26, Zi is slightly more important than Zj, and the Saaty scale is set to 3; if 0.39 < Zi - Zj < 0.52, Zi is considerably more important than Zj, and the Saaty scale is set to 5; if 0.65 < Zi - Zj < 0.78, Zi is significantly more important than Zj, and the Saaty scale is set to 7; and if Zi - Zj > 0.91, Zi is extremely more important than Zj, and the Saaty scale is set to 9. When the difference falls between two of these intervals, the Saaty scale is 2, 4, 6, or 8 [25]. Following these principles, the pairwise judgment matrix for the primary indicators was created, as depicted in Table 2.

(3) Consistency check and weight calculation: Using Yaahp 10.3 software, the weights for each level were determined and a consistency check was performed. After the pairwise comparison scales for each matrix were entered, a fundamental consistency check was carried out. A fully consistent matrix has a consistency ratio of 0, and a matrix is considered satisfactorily consistent when the consistency ratio is less than 0.1 [27]; if the consistency ratio exceeds 0.1, the matrix lacks consistency. After all matrices passed the consistency check, Yaahp 10.3 was used to compute the weights of each indicator element.

Table 1 Importance Scale Level 1–9 [28]
Table 2 AHP Judgment Matrix for Primary Indicators
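The study used Yaahp 10.3 for matrix construction, weight derivation, and the consistency check; the Python sketch below is an independent illustration of the same steps under stated assumptions. The mapping of score differences to the 1-9 scale follows one reading of the boundaries described above, the mean importance scores are hypothetical placeholders, and the random consistency index values are Saaty’s standard ones.

```python
import numpy as np

# Saaty's random consistency index (RI) for matrix orders 1-9
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def saaty_scale(diff):
    """Map a difference in mean importance scores to the 1-9 scale
    (one reading of the boundaries in the text; gaps map to 2, 4, 6, 8)."""
    d = abs(diff)
    if d == 0:
        scale = 1
    elif d <= 0.13:
        scale = 2
    elif d <= 0.26:
        scale = 3
    elif d <= 0.39:
        scale = 4
    elif d <= 0.52:
        scale = 5
    elif d <= 0.65:
        scale = 6
    elif d <= 0.78:
        scale = 7
    elif d <= 0.91:
        scale = 8
    else:
        scale = 9
    return scale if diff >= 0 else 1.0 / scale

def ahp_weights(mean_scores):
    """Build the pairwise judgment matrix from mean importance scores and
    return (weights from the principal eigenvector, consistency ratio)."""
    n = len(mean_scores)
    A = np.array([[saaty_scale(mean_scores[i] - mean_scores[j]) for j in range(n)]
                  for i in range(n)], dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                  # principal eigenvalue of the positive matrix
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
    cr = ci / RI[n] if RI[n] else 0.0            # consistency ratio; < 0.1 is acceptable
    return w, cr

# Hypothetical mean importance scores for the six primary indicators
means = [4.90, 4.82, 4.74, 4.55, 4.50, 4.62]
weights, cr = ahp_weights(means)
print(np.round(weights, 3), round(cr, 3))
```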

Calculation of relevant indicators and interpretation criteria

Relevant indicators were calculated as follows. (1) Enthusiasm coefficient of experts: reflected by the questionnaire response rate, that is, the number of experts who completed the questionnaire / the total number of invited experts × 100% [29]. (2) Degree of expert authority: determined by the experts’ basis for judgment (theoretical basis, practical experience, peer opinions, and expert intuition) and their familiarity with the issue (very familiar, moderately familiar, somewhat familiar, less familiar, and unfamiliar). The corresponding assigned scores are shown in Table 3. The authority coefficient (Cr) = (familiarity coefficient + judgment coefficient)/2, with Cr > 0.7 indicating a high level of expert authority [20]. (3) Full-score ratio of indicators = the number of experts giving full scores / the total number of invited experts × 100%.

Table 3 Items Used to Calculate Expert Authority Coefficients and Criteria for Assigning Values
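As a sketch of how the authority coefficient is obtained, the snippet below combines a familiarity coefficient and a judgment coefficient as described above. The numeric value assignments are illustrative assumptions, since Table 3 is not reproduced in this text; the study used the scores given in Table 3.

```python
# Illustrative (assumed) value assignments; the study's actual values are in Table 3.
FAMILIARITY = {"very familiar": 1.0, "moderately familiar": 0.8, "somewhat familiar": 0.6,
               "less familiar": 0.4, "unfamiliar": 0.2}
JUDGMENT = {  # contribution of each basis of judgment: (large, medium, small influence)
    "theoretical basis":    (0.30, 0.20, 0.10),
    "practical experience": (0.50, 0.40, 0.30),
    "peer opinions":        (0.10, 0.10, 0.05),
    "expert intuition":     (0.10, 0.10, 0.05),
}

def authority_coefficient(familiarity, judgment_levels):
    """Cr = (familiarity coefficient + judgment coefficient) / 2."""
    cs = FAMILIARITY[familiarity]
    ca = sum(JUDGMENT[basis][{"large": 0, "medium": 1, "small": 2}[level]]
             for basis, level in judgment_levels.items())
    return (cs + ca) / 2

cr = authority_coefficient(
    "very familiar",
    {"theoretical basis": "large", "practical experience": "large",
     "peer opinions": "medium", "expert intuition": "small"},
)
print(cr)  # Cr > 0.7 indicates a high level of expert authority
```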

Statistical analysis

SPSS 20.0 was used for the statistical analysis, and Yaahp 10.3 was used for the AHP analysis to calculate the weights of the indicators at each level. Normally distributed measurement data were expressed as (x̄ ± s), whereas enumeration data were expressed as frequencies. Kendall’s W and the χ2 test were used to determine the degree of consistency of the expert opinions. The combined weights of the primary indicators were calculated based on the scores given by the experts; the combined weight of a secondary indicator = the weight of the secondary indicator × the combined weight of its primary indicator, and the combined weight of a tertiary indicator = the weight of the tertiary indicator × the combined weight of its secondary indicator. A two-sided test with a significance level of α = 0.05 was used.
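The cascading combined-weight calculation can be illustrated with a short sketch. The primary-indicator weights below are those reported in the Results, but the local secondary weights are hypothetical values chosen only to show the arithmetic; the authoritative figures are those in Table 5.

```python
# Primary-indicator weights as reported in the Results
primary = {"professional quality": 0.313, "knowledge and skills": 0.248}

# Hypothetical local weights of secondary indicators within each primary indicator
secondary = {
    "professional quality": {"professional ethics": 0.383, "professional dedication": 0.348,
                             "other secondary indicator (placeholder)": 0.269},
    "knowledge and skills": {"clinical skills": 0.492, "medical knowledge (placeholder)": 0.280,
                             "related disciplines (placeholder)": 0.228},
}

def combined_weights(primary, secondary):
    """Combined weight of a child indicator = its local weight x its parent's combined weight."""
    combined = {}
    for p_name, p_weight in primary.items():
        for s_name, s_local in secondary[p_name].items():
            combined[s_name] = round(s_local * p_weight, 3)
    return combined

print(combined_weights(primary, secondary))
# e.g. professional ethics: 0.383 x 0.313 ≈ 0.120; clinical skills: 0.492 x 0.248 ≈ 0.122
```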

Results

Basic information of experts

The 40 enrolled experts had a mean age of 42.13 ± 5.73 years and had been engaged in related work for 16.63 ± 7.30 years on average. Among them, 23 (57.5%) were male and 17 (42.5%) were female, 72.5% held a master’s degree or higher, and 77.5% held senior titles. The details are shown in Table 4.

Table 4 Basic Information of the Expert Panel (n = 40)

Expert enthusiasm and authority coefficients

In this study, the questionnaires were distributed to 40 experts, and all 40 questionnaires were recovered, with a valid response rate of 100%. The authority coefficient of the experts in this study was 0.879.

Determination of Indicator weights by AHP analysis

Weights of primary and secondary indicators

The importance scores of the primary and secondary indicators were calculated based on the experts’ importance scores of the tertiary indicators. The judgment matrices constructed for the importance evaluation of the primary and secondary indicators both passed the consistency test and could be used to calculate the indicator weights. Professional quality, knowledge and skills, patient care, communication and collaboration, teaching skills, and lifelong learning were weighted as 0.313, 0.248, 0.181, 0.083, 0.066, and 0.110, respectively (see Table 5). The top three secondary indicators in terms of combined weights were clinical skills (0.122), professional ethics (0.120), and professional dedication (0.109).

Weights of tertiary indicators

For the tertiary indicators, the mean importance scores ranged from 4.38 to 4.925, and the CVs ranged from 0.05 to 0.21. The three tertiary indicators with the highest scores were “maintains collaboration with peers and colleagues in patient treatment,” “has clinical thinking skills, makes diagnosis and treatment decisions based on analysis of evidence, and has the ability to administer suitable treatments,” and “abides by laws and discipline and refuses to seek personal gains in medical practice,” and their combined weights were 0.035, 0.028, and 0.027, respectively.

Table 5 A System for Evaluating the Job Competence of Resident Physicians (and Postgraduates) Receiving Standardized Training

Discussion

In the context of the ongoing global medical education reform, the primary focus has shifted towards job competency, with a specific emphasis on elevating the job competency of clinical physicians [30]. This study employs a combination of the Analytic Hierarchy Process (AHP) and the Delphi method to construct an evaluation indicator system for the competency of clinical resident physicians (including integrated postgraduates). Both of these methods are based on the expert group’s theoretical knowledge and practical experience [31, 32]. The AHP, as a method for decision-making, incorporates mathematical and statistical processing on top of expert subjective judgments, enhancing its logical foundation and making it more scientifically rigorous [33]. Moreover, through the inherent logic at each hierarchical level, different indicators are assigned appropriate weights, allowing the evaluation system to be focused while comprehensively reflecting the core competencies that clinical resident physicians (including integrated postgraduates) should possess from various perspectives. Compared to the purely qualitative methods used in the past, this research method results in a more objective and scientific evaluation indicator system.

The competency evaluation system constructed in this study comprises six primary indicators, eighteen secondary indicators, and one hundred and sixteen tertiary indicators. It quantifies competency in six crucial dimensions: professional qualities, knowledge and skills, patient care, communication and collaboration, teaching abilities, and lifelong learning. This quantification enhances comparability and objectivity. Furthermore, the establishment of weight coefficients for primary and secondary indicators enhances the practical applicability of these indicators in real-world scenarios. It not only provides an objective foundation for quantitatively assessing the competency of clinical resident physicians undergoing standardized training (including integrated postgraduates) but also offers a clear and intuitive competency development framework for training institutions. This framework can act as a tangible target in future training programs and clinical teaching, ultimately leading to more effective training outcomes.

In this study, an indicator-based job competence evaluation system for resident physicians (and postgraduates) receiving standardized training was preliminarily developed based on literature analysis and expert discussion. Professional titles, specialties, educational background, job positions, and years of work were used as the criteria for including consulting experts, and a total of 40 experts were ultimately enrolled [34]. Among the 40 experts, the average number of years engaged in related work was 16.50 ± 7.30, showing extensive work experience; 72.5% held a master’s degree or higher, and 77.5% held senior titles. The cohort of enrolled experts could thus be considered highly qualified. The valid response rate of the expert consultation was 100%, and the authority coefficient was 0.879 (> 0.7), indicating that the experts were highly motivated to provide advice and that the consultation results were authoritative and reliable. The Kendall’s W of the expert responses was 0.675 (χ2 = 42.516, p < 0.001), indicating that the expert opinions were highly consistent. The AHP was used to test the consistency of the indicators at all levels; the consistency ratios (CR) of the judgment matrices were all < 0.100, suggesting that the indicators were unambiguous and free of logical errors [27].

Based on the AHP analysis, the weights of the primary indicators, namely professional quality, knowledge and skills, patient care, communication and collaboration, teaching skills, and lifelong learning, were 0.313, 0.248, 0.181, 0.083, 0.066, and 0.110, respectively. As demonstrated, professional quality had the highest weight, which is consistent with the findings of the study by Liao Wumeizi [35]. Professional quality refers to the self-imposed moral restraint of physicians and is the soul of their medical career. The Content and Standards of Standardized Resident Training (2020 Revised Edition) suggests that the goal of resident training for medical institutions at all levels is to foster resident clinicians with good professional quality and expertise who are ready to provide humanitarian medical service to patients and who can independently diagnose and treat common diseases within their specialty [36]. The indicator “knowledge and skills” encompasses knowledge and skills in basic medicine, clinical medicine, and related disciplines essential for resident physicians. These two indicators complement each other and are both indispensable. As the paradigm of medical care changes and physician–patient conflict becomes more acute, patient care and communication and collaboration also become increasingly important, which explains why their weights are also high. Finally, teaching skills and lifelong learning are paths for self-improvement in the career of physicians.

The top three secondary indicators were clinical skills (0.122), professional ethics (0.120), and professional dedication (0.109). Clinical skills serve as the cornerstone of a medical professional’s core competencies. A strong command of clinical skills is an indispensable prerequisite for individuals seeking to embark on the path of standardized training as clinical resident physicians, including integrated postgraduates. Notably, clinical skills carry the heaviest weight among the secondary indicators within the competency evaluation system. This weighting mirrors the findings of Liu Zhuang and colleagues in their endeavors to construct a comprehensive competency indicator system for clinical physicians [37].

The three tertiary indicators with the highest scores were “maintains collaboration with peers and colleagues in patient treatment,” “has clinical thinking skills, makes diagnosis and treatment decisions based on analysis of evidence, and has the ability to administer suitable treatments,” and “abides by laws and discipline and refuses to seek personal gains in medical practice”; their combined weights were 0.035, 0.028, and 0.027, respectively. These results show it is important for physicians to be able to assimilate into the team, know their responsibilities, help each other, avoid conflicts, effectively implement medical decisions, and abide by the law in the practice of medicine. In the daily responsibilities of clinical physicians, consistent communication and engagement with patients, their families, higher-level hospitals or specialty facilities, and colleagues are indispensable. Having robust interpersonal communication and teamwork skills serves as the fundamental requirement for fulfilling their roles. The Accreditation Council for Graduate Medical Education has identified interpersonal skills and communication abilities as one of the six essential competencies in the training of family medicine residents in the United States. This requires family medicine residents to effectively communicate and collaborate with patients, their families, and other healthcare professionals [38]. The results also show that it is important for clinical residents (and postgraduates) to develop clinical thinking as part of the diagnosis and treatment process [39].

By comparing our findings with those of other studies conducted domestically and internationally, we found that, in terms of first-level indicators, the content is fundamentally similar, although there are some differences in how it is expressed. For example, in Canada, the first-level indicators of the competency framework are categorized into seven physician roles, such as medical expert, communicator, collaborator, and manager [7]. In contrast, other frameworks, including our indicator system, treat various fundamental abilities as first-level indicators. The clinical physician competency framework proposed by the Accreditation Council for Graduate Medical Education in the United States includes professionalism, patient care, medical knowledge, communication skills, practice-based learning, and system-based practice [16]. Building upon these foundations, our study further refines second- and third-level indicators, enriching the competency assessment system. This refinement provides an objective basis for quantifying and optimizing the evaluation of standardized training for clinical resident physicians (including integrated postgraduates).

As the Delphi method is based on experts’ subjective judgment, the screening and retention of indicators in this study may be subject to some bias. In addition, the indicator system for evaluating the job competence of clinical resident physicians (and postgraduates) in standardized training established in this study has not yet been piloted in relevant institutions; therefore, its reliability and validity need to be further verified in practice.

In summary, the indicator system for evaluating the job competence of resident physicians (and postgraduates) receiving standardized training preliminarily established in this study is scientifically grounded and of practical value, and it can serve as a basis for quantitatively assessing the job competence of trainees. It can also serve as a reference for developing curriculum systems, rotation plans, and assessment criteria for postgraduates pursuing a master’s degree in clinical medicine.

Data Availability

All data generated or analyzed during the current study are available from the corresponding author upon reasonable request.

References

  1. Qin A. A study on the establishment of post competency evaluation index system for standardized resident training physicians. North China University of Science and Technology; 2021.

  2. Wu DG, Liu SM, Yang Y, Liu L. Problems and countermeasures of integrated education of postgraduate students with professional degree in clinical medicine. Int Med Health Guid News. 2018;24(22):3.

  3. McClelland DC. Testing for competence rather than for intelligence. Am Psychol. 1973;28(1):1–14.

  4. Lee H. Implement competency-oriented medical education and improve the standardized training level of resident physicians. Chin J Med. 2018;53(1):4.

  5. Frank JR, Mungroo R, Ahmad Y, Wang M, De Rossi S, Horsley T. Toward a definition of competency-based education in medicine: a systematic review of published definitions. Med Teach. 2010;32(8):631–7.

  6. Accreditation Council for Graduate Medical Education, American Board of Medical Specialties. Toolbox of assessment methods. 2000.

  7. Frank JE. The CanMEDS 2005 Physician competency framework. [EB/OL]. [2023-10-06]. http://www.ciperj.org/imagens/canmed2005.pdf. 2005.

  8. Bandiera G, Sherbino J, Frank JR. The CanMEDS assessment tools handbook: an introductory guide to assessment methods for the CanMEDS competencies. Ottawa: The Royal College of Physicians and Surgeons of Canada; 2006. pp. 18–26.

  9. Frenk J, Chen L, Bhutta ZA, et al. Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet (London England). 2010;376(9756):1923–58.

  10. Tetzlaff JE. Assessment of competency in anesthesiology. Anesthesiology. 2007;106(4):812–25.

  11. China Consortium of Elite Teaching Hospitals for Residency Education. Consensus on core competency framework for residency education among China consortium of elite teaching hospitals for residency education. Med J PUMCH. 2022;13(01):17–23.

  12. National Health and Family Planning Commission. Notice on the Issuance of the Interim Measures for the Management of Standardized Training for Resident Physicians [EB/OL]. [2023-10-06]. http://www.gov.cn/gongbao/content/2015/content_2806023.htm

  13. Suleiman L, Bakhtary S, Manuel SP. Defining core competencies in transfusion medicine for resident physicians: a multi-specialty Delphi consensus study. Transfusion. 2021;61(3):939–47.

  14. Okoli C, Pawlowski SD. The Delphi method as a research tool: an example, design considerations and applications-ScienceDirect. Inf Manag. 2004;42(1):15–29.

  15. Santaguida P, Dolovich L, Oliver D, et al. Protocol for a Delphi consensus exercise to identify a core set of criteria for selecting health related outcome measures (HROM) to be used in primary health care. BMC Fam Pract. 2018;19(1):152.

  16. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach. 2007;29(7):648–54.

  17. GMC. Good Medical Practice [EB/OL]. [2023-10-03]. https://www.gmc-uk.org/ethical-guidance/ethical-guidance-for-doctors/good-medical-practice

  18. Sun B, Li J, Wang Q. Construction and application of the Competency Model for Clinical Physicians in China. People’s Medical Publishing House; 2015.

  19. Liu M, He W, Li K, Deng YH. Study on the combination of delphi method and analytic hierarchy process to construct the evaluation index system of discipline construction in an affiliated hospital. Chin Med Pharm. 2019;9(17):18–22.

  20. Peng Y, Zhou X, Liu X, et al. Evaluation of the effect of outpatient service improvement in tertiary hospitals of Nanchang from the perspective of patients. MHM. 2019;17(04):13–7.

  21. Lin X, Yu X, Tian C, Xiao J. Construction of quality indicators to assess the quality of neonatal intensive care unit nursing: a pilot study. Chin J Evidence-Based Pediatr. 2016.

  22. Chang W, Miao S, Ding T, et al. Study on the identification of emergency medical rescue scenarios with aircraft carriers and carrier aircrafts based on Delphi. Chin J Disaster Med. 2019;7(10):6.

  23. Chen B, Xue C, Ren J, et al. The construction of the family service needs index system(FSNIS)in public health events based on Delphi method. Fudan Univ J Med Sci. 2022;49(1):7.

  24. Shapiro AF, Koissi MC. Fuzzy logic modifications of the analytic hierarchy process. Insur Math Econ. 2017;75.

  25. Alimohammadzadeh K, Bahadori M, Hassani F. Application of Analytical Hierarchy process Approach for Service Quality evaluation in Radiology departments: a cross-sectional study. Iran J Radiol. 2016;13(1):e29424.

  26. Saaty TL. Decision making, scaling, and number crunching. Decis Ences. 1989;20(2):404–9.

  27. Yang S, Fu Z, Pan Y, et al. Development and applicability verification of a competency evaluation index system for general practice team leaders. Chin Gen Pract. 2022;25(7):874–81.

  28. Saaty TL. Analytic hierarchy process. John Wiley & Sons, Ltd.; 2013.

  29. Zhang Y, Sumiya A, Yang J. Establishment of nutrition literacy core items for Chinese people. Chin J Prev Med. 2020;54(10):6.

  30. Horak H, Englander R, Barratt D, Kraakevik J, Soni M, Tiryaki E. Entrustable professional activities: a useful concept for neurology education. Neurology. 2018;90(7):326–32.

  31. Cui K, Shen F, Han B, Liu H, Chen J. Establishment and application of an index system for prevention of coal workers’ pneumoconiosis: a Delphi and analytic hierarchy process study in four state-owned coal enterprises of China. Occup Environ Med. 2018;75(9):654–60.

  32. Vidal LA, Marle F, Bocquet JC. Using a Delphi process and the Analytic Hierarchy process (AHP) to evaluate the complexity of projects. Expert Syst Appl. 2011;38(5):5388–405.

  33. Rangone A. An analytical hierarchy process framework for comparing the overall performance of manufacturing departments. Int J Oper Prod Man. 1996;16(8):104–19.

  34. Guo Y, Yu D, Huang X, Yu C, Cheng X. Using Delphi method to establish evaluation indicator system for core competence of public health professionals. Chin J Pub Health Manage. 2016;32(6):3.

  35. Liao W. The evaluation of the competence of the training of traditional Chinese medicine resident in Guangdong Province. Guangzhou Univ Chin Med; 2020.

  36. Lu J, Li Y, Jin M, Hu J, Zeng S. Explore the innovative path of core competency training for resident physicians in the department of pathology. ETP. 2021;13:3.

  37. Liu Z. Research on construction and evaluation of competency model for clinical physicians. China Medical University; 2017.

  38. Gong X. Research on the construction of general practitioners’ competency index system in China. China Medical University; 2022.

  39. Che YJ, Pang LJ, Lv XD, Shi YN, Jiang X, Jing Y. Thoughts and methods of scientific construction in clinical thinking mode of traditional Chinese medicine. J Tradit Chin Med. 2019;2:5.

Acknowledgements

Not applicable.

Funding

This study was supported by the Medical Teaching and Educational Management Reform Research Project Foundation of Jinan University (2021YXJG017).

Author information

Contributions

Yuanzheng Fu and Luxian Zeng wrote the main manuscript text, Guoxiang Zhao and Jie Shan prepared Tables 1, 2, 3, 4 and 5. All authors reviewed the manuscript.

Corresponding author

Correspondence to Luxian Zeng.

Ethics declarations

Ethics approval and consent to participate

In accordance with the relevant guidelines and regulations, all methods in this study were performed following appropriate ethical considerations and received approval from the Ethics Committee of Guangdong Second Provincial General Hospital [2023-KY-KZ-067-01]. Informed consent was obtained from all participants included in the study.

Consent for publication

All participants provided written consent for their anonymized data to be used for research and publication purposes.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Fu, Y., Zhao, G., Shan, J. et al. Study on a job competence evaluation system for resident physicians (including integrated postgraduates) receiving standardized training. BMC Med Educ 23, 834 (2023). https://doi.org/10.1186/s12909-023-04833-w
