
Assessment of clinical competence of graduating medical students and associated factors in Ethiopia

Abstract

Background

Ethiopia has scaled up medical education to improve access to healthcare, which has presented challenges to maintaining training quality. We conducted a study to assess the clinical competence of graduating medical students and the associated factors.

Methods and materials

A pretest assessment of a quasi-experimental study was conducted in 10 medical schools with a sample size of 240 students; we randomly selected 24 students per school. Clinical competence was assessed in a 12-station objective structured clinical examination. The clinical learning environment (CLE), simulation training, and practice exposure were self-rated. Mean scores for clinical competence and for satisfaction with the CLE and simulation training were calculated, along with the proportions of students with practice exposure and those who agreed with CLE and simulation items. Independent t-tests were used to test for competence differences among subgroups. Bivariate and multiple linear regression models were fitted for the outcome variable, the competence score. A 95% confidence interval and a p-value < 0.05 were used for statistical decisions. A 75% cut-off score was used to compare competence scores.

Results

Graduating medical students had a mean competence score of 72%. Low scores were recorded for performing manual vacuum aspiration (62%), lumbar puncture (64%), and managing childbirth (66%). Female students (73%) had a significantly higher competence score than males (70%). A higher cumulative grade point average (CGPA), a positive appraisal of the CLE, and performing more clinical procedures were associated with higher competence scores. Nearly half of the students were dissatisfied with clinical practice, particularly because of the large number of students per session and problems with performance assessment. About two-thirds of the students were dissatisfied with the sufficiency of models and equipment and with the quality of feedback during simulation training. Nearly one-third of the students had never performed lumbar puncture, manual vacuum aspiration, or venipuncture.

Conclusions

Medical students had suboptimal clinical competence. A better clinical learning environment, a higher cumulative GPA, and more practice exposure were associated with higher scores. Student clinical practice and simulation training need to be improved. Strengthening school accreditation and graduates’ licensing examinations is also a way forward.


Background

Many countries across the world face serious health workforce challenges [1]. In 2020, a global shortage of 15.4 million health workers, with large skill-mix imbalances and maldistribution, was reported [2]. Sub-Saharan Africa (SSA) is disproportionately affected, with relatively few health workers and a high burden of disease [3, 4]. The density of physicians in SSA countries (< 0.3 per 1000 population in 2018) was very low compared with high-income countries (2.0–5.0) [5, 6].

Like many African countries, Ethiopia has invested in increasing the number of healthcare workers. It has been implementing the national human resources for health strategic plan, which set a goal of increasing the stock of physicians fivefold, from 5411 in 2016 to 28,121 in 2025 [7]. As a result, the number of medical schools has expanded from 5 in 2005 to 43 in 2022, including 10 private colleges [8]. The annual number of graduates has increased tenfold, reaching 1500–1600. Despite these positive strides, addressing the health workforce challenges in the country is far from finished; for instance, the density of physicians was still low, at 0.1 per 1000 people [9].

Increasing the number of training institutions can only improve population health when the quality of training is ensured. Workforce quality is an important part of the solution to the global human resource crisis. Enrolling large numbers of students without commensurate adaptation by medical schools has fueled quality concerns in the country [10,11,12]. Despite commendable efforts to expand medical education, Ethiopia has lagged behind the WHO’s recommendations in reforming and implementing medical curricula, expanding student clinical sites and simulation-based training, and strengthening accreditation [13]. There have been shortages of learning resources and experienced faculty [14, 15]. With the rapid expansion, concerns about training quality have deepened further, which may have affected student learning. A practice analysis of junior physicians also reported substantial clinical skill gaps [16].

The effect of the training quality gaps on the competence of medical students is not well studied. Therefore, we conducted a study to assess the clinical competence of graduating medical students and associated factors.

Methods and materials

Study design and period

This pretest assessment is part of a quasi-experimental study aiming to assess the impact of project interventions on the quality of medical education in Ethiopia. Given that the complexity of the medical education environment makes a randomized controlled study impractical, we used a quasi-experimental design, which is well suited to examining cause-and-effect relationships among variables [17]. The posttest of this quasi-experimental study will be conducted in 2025, and changes in clinical competence and the associated factors from the pretest level will serve as evaluation measures. This pretest study was carried out in July and August 2022.

Study setting and study population

There were 43 medical schools in Ethiopia, including 33 public and 10 private schools. Most medical schools admit high school graduates through a direct entry scheme, while some accept graduates in health and other science fields through a graduate entry scheme. Medical education lasts 6 academic years, including 1 year of internship at the end. Starting from the third academic year, medical education is provided at hospitals and other clinical settings. The study population was the 1556 undergraduate medical students who had completed or were nearly completing their internship program and were expected to graduate in 2022.

Sample size and selection criteria

Ten schools that had graduating classes around the same period were selected in consultation with the Ethiopian Medical Association and the Ministry of Health. Of the 10 selected schools, four used graduate entry schemes and two were private. About 875 medical students were expected to graduate from the 10 schools in 2022. To determine the sample size, we assumed a 95% confidence level, 80% statistical power, 1:1 optimal allocation (sample ratio of intervention to comparison), a moderate effect size of 0.5, and a design effect of two, yielding a minimum sample size of 218 graduating students. After allowing for a 10% non-response rate, the final sample size was 240.

Sampling procedures

To include 240 students from the 10 medical schools, we recruited 24 students per school. Because the study assessed competence using a 12-station objective structured clinical examination (OSCE), including 24 students per school ensured adequate competency observations and data points at each school. To develop the sampling frame, we requested the lists of graduating medical students from the deans’ offices and randomly selected 24 students per school using a lottery method, as sketched below. We provided the lists of selected students to the assessors (data collectors), who invited the students to participate in the study. Students who were unwilling to participate for any reason were thanked and not replaced.
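For illustration only, the lottery-style selection described above amounts to a simple random sample without replacement; this is a minimal sketch, not the authors’ actual tooling, and the roster file name and seed are hypothetical.

```python
# Minimal sketch of a lottery-style simple random sample of 24 students
# from one school's graduating list; the file name and seed are hypothetical.
import random

with open("graduating_students_school_A.txt") as f:
    roster = [line.strip() for line in f if line.strip()]

random.seed(2022)                      # fixed seed only to make the draw reproducible
selected = random.sample(roster, 24)   # 24 students per school, drawn without replacement
print(selected)
```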

Measurements and instruments

The key variables of interest were clinical competence, the clinical learning environment (CLE), simulation training, and practice exposure. We assessed clinical competence using a 12-station OSCE, a reliable method for assessing clinical skills [18, 19]. OSCE case scenarios, assessment rubrics, and assessor instructions were developed using global and national standards [20,21,22]. The stations focused on the core competencies required for the provision of safe medical care and included 10 manned stations: taking a focused history, conducting a precordium examination, providing patient education and counseling for diabetes mellitus, conducting Leopold’s maneuver, conducting manual vacuum aspiration (MVA) for incomplete abortion, managing childbirth, performing wound suturing, providing an emergency response for polytrauma, obtaining consent for hernia repair, and performing lumbar puncture (LP). The remaining two stations were unmanned and focused on interpreting chest X-rays and complete blood counts for tuberculosis (TB) patients, and prescribing medication for a malaria case. To ensure content validity, we reviewed the stations with both subject matter experts and medical educators. The assessment rubrics had 4 to 7 items each and followed a five-point global rating scale (GRS), where 1 meant “poor performance”, 2 “unsatisfactory performance”, 3 “satisfactory but not good performance”, 4 “good performance”, and 5 “excellent performance”. To assess the clinical learning environment, we used the validated clinical learning evaluation questionnaire (CLEQ) [23], a tool for measuring the learning climate in clinical settings for undergraduate medical students, with 18 items organized into four domains: clinical cases, motivation of students, supervision by preceptors, and organization of clinical encounters. Similarly, we developed a 10-item structured tool to assess the quality of simulation training based on guidelines and the literature [24, 25]. Students self-assessed their experiences on each item of the two questionnaires on a five-point Likert scale, where 1 meant strongly disagree, 2 disagree, 3 neutral, 4 agree, and 5 strongly agree. In addition, we developed a structured tool to determine students’ exposure to 12 clinical procedures in the past 12 months; the list of procedures was adapted from the national scope of practice and training curricula. Background characteristics of the medical students were also collected.
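The Results report station and overall competence as percentages, but the exact rule for rescaling the 1–5 GRS ratings is not stated in the text. The sketch below assumes one common convention (mean item rating divided by the maximum rating); the function names and example ratings are illustrative only.

```python
# Hedged illustration: converts per-item GRS ratings (1-5) for one OSCE station
# into a percentage score, assuming a mean-rating / max-rating * 100 convention.
import numpy as np

def station_score_pct(ratings, max_rating=5):
    """Percentage score for one OSCE station from its rubric-item ratings."""
    return np.mean(ratings) / max_rating * 100

def overall_competence_pct(station_scores):
    """Composite competence score: mean of the 12 station percentages."""
    return float(np.mean(station_scores))

# Toy example: a 5-item station rated 4, 3, 4, 5, 3 by the assessor
print(round(station_score_pct([4, 3, 4, 5, 3]), 1))   # 76.0
```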

Data collection

The OSCE was administered by 18 physicians and medical educators. Assessors received a two-day training on the data collection procedures, tools, ethical principles, data quality assurance, and the CommCare software application. They informed study participants about the purpose, procedures, and ethical principles of the study and obtained consent. Study participants completed the CLEQ, simulation training quality, and practice exposure questionnaires and were encouraged to provide accurate information and/or the most plausible response to each item. For the OSCE stations, the data collectors ensured that all essential logistics (standardized patients, models, medical equipment, medical supplies, assessor instructions, and assessment rubrics) were available. The data collectors asked the study participants to undertake the required tasks at each OSCE station within 12 minutes, directly observed their performance, and rated them exclusively using the GRS. Six supervisors closely supported the data collectors to check for errors and omissions. The OSCE assessment rubrics had a total of 64 items and an average Cronbach’s alpha of 0.79, with a range of 0.62–0.89 (Table 2).
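For readers unfamiliar with the reliability statistic quoted above, the following sketch shows how Cronbach’s alpha is computed for one station’s rubric items; the data layout (one row per student, one column per rubric item) and the toy ratings are assumptions, not the study data.

```python
# Cronbach's alpha for a set of rubric items (rows = students, columns = items).
import numpy as np

def cronbach_alpha(item_scores):
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]                          # number of rubric items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed station score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Toy example: four students rated on a 5-item rubric
ratings = [[4, 3, 4, 5, 3],
           [2, 2, 3, 3, 2],
           [5, 4, 5, 4, 4],
           [3, 3, 3, 4, 3]]
print(round(cronbach_alpha(ratings), 2))
```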

Data management and analysis

We exported the data from CommCare v. 2.53 to SPSS v. 27 for data cleaning and statistical analyses. Summary statistics were computed for all key variables to check for outliers and missing data. Cronbach’s alpha coefficients were computed to assess the internal consistency of the items within each competence area. Means, medians, standard deviations, and proportions were computed, and tables and graphs were prepared. Mean scores for the 12 competencies and the overall composite mean score were computed, as were mean satisfaction scores for the CLE and simulation training. To conduct the desired statistical tests on continuous quantitative variables, we collapsed the CLE and simulation items by transforming the Likert scale data into composite mean scores [26]. The five-point Likert scale measures of the CLE and simulation training were also grouped into two categories (strongly agree and agree as “agree”; strongly disagree, disagree, and neutral as “disagree”), and proportions were calculated to give a meaningful interpretation. Proportions were also calculated for practice exposure. Since the curriculum did not specify thresholds for the number of clinical procedures expected to be performed, the median for each procedure was used as the cutoff between high and low exposure; medians were used as the measure of central tendency because the data contained outliers. Independent-samples t-tests were used to compare male and female students, private and public schools, direct and graduate entries, students with high and low clinical exposure, and students with high and low CGPA. We also checked the necessary assumptions for regression analysis and ensured the model’s adequacy [27]. Bivariate and multiple linear regression models were then fitted. The outcome variable was the competence score; the independent variables were age, sex, school type, cumulative grade point average (CGPA), school entry scheme, and composite satisfaction scores for the four CLE domains and simulation training. All independent variables with P < 0.025 at the bivariate level were considered for inclusion in the multivariable regression analysis. A 95% confidence interval and a p-value < 0.05 were used for decisions on statistical significance. Students are expected to master the essential skills for safe, beginner-level healthcare delivery at the point of graduation; a 75% cut-off score, which is recommended in mastery learning, was therefore used to compare competence scores [28].
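The same workflow (Likert dichotomization, independent-samples t-test, and multiple linear regression) could be expressed in generic statistical code. The sketch below is a hedged translation into Python rather than the SPSS actually used; the dataset file and column names are hypothetical placeholders.

```python
# Illustrative analysis pipeline (the study itself used SPSS v. 27); the dataset
# file and variable names below are hypothetical.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

df = pd.read_csv("competence_pretest.csv")

# Collapse a five-point Likert item into agree (4-5) vs. disagree/neutral (1-3)
df["class_size_agree"] = (df["class_size_item"] >= 4).astype(int)

# Independent-samples t-test: competence score by sex
female = df.loc[df["sex"] == "female", "competence_pct"]
male = df.loc[df["sex"] == "male", "competence_pct"]
t_stat, p_value = stats.ttest_ind(female, male)

# Multiple linear regression on the competence score
model = smf.ols(
    "competence_pct ~ age + C(sex) + C(school_type) + C(entry_scheme)"
    " + C(cgpa_group) + motivation_score + supervision_score + simulation_score",
    data=df,
).fit()
print(model.summary())
```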

Data quality assurance

We adopted the standardized CLEQ data collection tool to assess the clinical learning environment. Where no standardized tools existed, the questionnaires for the OSCE, simulation training, and practice exposure were reviewed and validated by medical education experts. We recruited senior medical education experts with experience in conducting OSCEs as data collectors and provided them with a two-day training to standardize data collection. Study investigators, along with the supervisors, ensured that quality data were collected. We used an electronic data collection application to prevent data entry errors and supervised the data collection process.

Ethics

Ethical approval for the study was obtained from the Ethiopian Public Health Association and Johns Hopkins Bloomberg School of Public Health Institutional Review Board with IRB number 21116. Permission to conduct the study was also obtained from the Ministry of Health (MOH) and the deans of medical schools. Study participants provided informed oral consent, and measures were taken to protect autonomy and data confidentiality.

Results

Background characteristics

A total of 218 graduating medical students took part in this study, a response rate of 90.8%. Their mean age was 27.1 years. The majority of study participants were male (70.2%), from public schools (86.2%), and had a cumulative grade point average (CGPA) of more than 3.00 at the beginning of the internship (74.5%). Graduates from private medical schools were younger (mean age 25.7 vs. 27.4 years), included a higher proportion of female students (66.7% vs. 23.9%), and had a higher mean CGPA (3.33 vs. 3.19) than those from public medical schools (Table 1).

Table 1 Background characteristics of study participants (N = 218)

Competence scores of graduating medical students

The overall mean competence score of graduating medical students was 72%. The highest scores were observed for obtaining consent for hernia repair (81%), and interpreting chest X-rays and CBC for TB patients (78%). On the other hand, competence scores were relatively low in conducting MVA for incomplete abortion (62%), performing LP (64%), and managing childbirth (66%) (Fig. 1).

Fig. 1 Mean clinical competence scores of graduating medical students in percentage

There was no statistically significant difference in overall competence scores between students from public (71.6%) and private (71.7%) schools. However, students from public schools had significantly better scores in taking a focused history (p = 0.001), conducting a precordium examination (p = 0.002), and obtaining consent for hernia repair (p < 0.001). On the other hand, students from private medical schools had significantly better scores in patient education and counseling (p = 0.03), prescribing medication for a malaria case (p < 0.001), and wound suturing (p = 0.02) (Table 2).

Table 2 Mean competence scores of study participants in percentage by school type
Table 3 Mean competence scores of study participants in percentage by medical school entry schemes

Clinical learning environment

Medical students had an overall mean CLE satisfaction score of 75.2%. The motivation of students during the clinical practicum had the highest score (83.7%). In addition, the majority of the students knew their learning limitations (91.7%), enjoyed learning at clinical practice sites (88.5%), and considered their supervisors good role models (89.9%). However, supervision of students during the practicum had a low score (71.4%), and smaller proportions of students agreed that the way supervisors dealt with medical students was satisfactory (40.8%), that the number of students in clinical sessions was appropriate (56.0%), and that the assessment of clinical learning was aligned with objectives (53.7%) (Fig. 2).

Table 4 Mean competence scores of study participants in percentage by sex
Fig. 2 Percent of medical graduates who were satisfied with CLE items and satisfaction scores by CLE domain (N = 218)

Simulation training quality

Overall, 51% of participants were satisfied with the quality of simulation training. Specifically, 77% of respondents said the number of students at the skills development lab (SDL) was appropriate, and 61% acknowledged that supportive trainers were available. About two-thirds of the respondents were dissatisfied with the availability of models and equipment at the lab and with the feedback provided at each practice session, and did not enjoy learning at the skills lab (Fig. 3).

Table 5 Mean competence scores of study participants in percentage by cumulative GPA
Fig. 3 Percent of graduating medical students who were satisfied with the simulation training quality (N = 218)

Clinical practice exposure

Of the 12 procedures assessed, the majority of students had performed the following more than five times: nutrition assessment (95.9%), urinary catheterization (94.5%), intravenous (IV) cannulation (93.1%), giving oxygen (92.5%), and nasogastric (NG) tube insertion (91.7%). In contrast, substantial proportions of students had never performed venipuncture (34.4%), lumbar puncture (LP) (30.7%), manual vacuum aspiration (MVA) (30.3%), or assisted a normal delivery (9.6%) (Fig. 4).

Fig. 4 Percent of graduating medical students who performed each procedure five or more times and who never performed it

Factors affecting competence scores of graduating medical students

Female medical students had 2.4% higher competence scores than their male counterparts (p = 0.03). On average, medical students with a CGPA of < 3.00 had 7.1% lower competence scores than those with a CGPA of > 3.50 (p = 0.001). Similarly, students with a CGPA of 3.00–3.49 had, on average, 3.7% lower competence scores than those with a CGPA of > 3.50 (p = 0.001). For a unit increase in the satisfaction score for student motivation in the CLE, the mean competence score increased by 12.7% (p = 0.020) (Table 6).

Table 6 Bivariable and multivariable linear regression results to assess factors affecting the competence of graduating medical students
Table 7 Mean competence difference of study participants by level of practice exposure (number of conducted procedures)

Discussion

After a successful expansion of primary healthcare, Ethiopia has been strengthening secondary and tertiary care to increase its responsiveness to the population’s health needs. This progress has stimulated the rapid expansion of medical training in the country. Without commensurate attention to maintaining training quality, the expansion has compromised medical schools’ ability to meet minimum pre-service education standards [28, 29]. Understanding the real effects of rapid training expansion and the challenges it poses is a critical step toward improvement, particularly in contexts like Ethiopia where evidence is scant. To that end, we conducted this research to answer two main questions: what level of clinical competence had undergraduate medical students mastered at the point of graduation, and which factors were associated with competence development?

The results of this study showed that graduating medical students had suboptimal competence scores overall and in many competence areas compared with the 75% cut-off score, signifying gaps in the capabilities required for essential healthcare delivery. Pervasive shortages of experienced faculty and learning infrastructure, challenging practical learning in Ethiopia’s medical schools, and underdeveloped medical education regulation might be the underlying factors [7, 15, 30]. Comparable competence scores were also reported by studies conducted in Ethiopia and elsewhere [31,32,33,34]. Challenges of medical education due to shortages of critical training inputs and processes were similarly reported in Tanzania [35]. The competence gaps among the study participants made it clear that the medical graduates were not fully prepared for the responsibilities of general practitioners listed in the national scope of practice guidelines [20]. This means that new graduates’ performance, confidence, professional identity, career progression, and quality of life can be affected [36, 37], with important implications for the standards of patient care.

Effective preservice education for medical students requires high-quality clinical preceptorship and simulation training [38]. Repeated practice opportunities in clinical sites can reinforce the competencies learned and the experiences acquired [39,40,41]. To that end, ensuring an optimal number and variety of cases in clinical settings is vital [42]. However, as shown in this study, students found hands-on clinical procedures relatively more difficult, and a significant proportion of them had limited practice exposure. Moreover, our study showed that the medical students faced challenging simulation and clinical learning environments. Studies conducted in other countries also found that the psychomotor abilities of final-year medical students were not fully developed [43,44,45]. The large number of students enrolled in Ethiopia’s medical schools might negatively affect practical training in both simulation and clinical settings. Introducing medical education program accreditation and regulation has the potential to motivate schools to pursue quality [46]. Other researchers have also corroborated our reports of the adverse effects of the rapid training scale-up and student overcrowding in Ethiopia [10,11,12, 29, 30]. However, many of the study participants had favorable perceptions of the number of students during practice. This raises questions about how well the schools used clinical rotations and scheduling to offset practice-site overcrowding, and whether the schools had adequate clinical sites for student practice [47]. As per the findings of our study, students’ motivation in clinical learning was associated with competence development. Unfortunately, the existing CLE gaps, including the suboptimal availability of case varieties, motivation and supervision of students, and organization of clinical encounters, affected the quality of student practice, which may have diminished competence development [48,49,50].

Consistent with our findings, good academic performance was associated with competence attainment in other studies [51]. This implies that medical schools should ensure that well-prepared students are enrolled and that students are effectively taught, evaluated, and supported across all stages of the curriculum. Despite many programmatic reports suggesting a gender disparity disfavoring females in Ethiopia [52], this study showed that female medical students had higher competence scores than males. They also had better scores in managing TB and malaria cases and in conducting manual vacuum aspiration and lumbar puncture. Female nonphysician anesthesia students in Ethiopia similarly outperformed their male counterparts [53]; in contrast, male midwifery students performed better than females [54].

Strengths and weaknesses

Covering all the required clinical competency domains and including all types of medical schools in the country enabled us to generate high-quality evidence. We directly observed student performance using OSCE tools with acceptable reliability and high objectivity. Multiple quality indicators were evaluated along the causal chain of educational inputs, processes, and outcomes, providing a better picture of the training. To address logistical challenges, we widened the data collection period to include all schools, as the academic calendars of the medical schools varied. Shortages of OSCE logistics were resolved in collaboration with the medical schools. Since no standardized assessment rubrics were available for our purpose, experts assisted in developing and piloting rubrics based on the curricula and standards.

Conclusions

Medical students had suboptimal clinical competence, with lower scores in clinical procedures. A better CLE, a higher cumulative GPA, and more practice exposure were associated with higher competence scores. We recommend that medical schools expand student clinical sites to include primary healthcare units and private health facilities. Effective scheduling and clinical rotations are required to boost practice opportunities, and the pool of preceptors should be expanded and developed. It is also imperative to address the simulation training gaps. Strengthening licensing examinations is a further way to ensure that graduates are fit for practice. Research is needed to understand the effects of the current state of medical education on patient outcomes, and additional investigation is required to assess medical students’ ethics, leadership, communication, and collaboration skills.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author upon reasonable request.

Abbreviations

CBC:

Complete blood count

CGPA:

Cumulative grade point average

CLE:

Clinical Learning Environment

CLEQ:

Clinical Learning Environment Questionnaire

CPD:

Continuing professional development

DM:

Diabetes mellitus

GRS:

Global rating scale

IRB:

Institutional Review Board

IV:

Intravenous

LP:

Lumbar puncture

MOE:

Ministry of Education

MOH:

Ministry of Health

MVA:

Manual vacuum aspiration

NG tube:

Nasogastric tube

OSCE:

Objective structured clinical examination

SD:

Standard deviation

SDL:

Skills development laboratory

SOP:

Scope of practice

SSA:

Sub-Saharan Africa

TB:

Tuberculosis

USAID:

United States Agency for International Development

WHO:

World Health Organization

References

  1. Liu JX, Goryakin Y, Maeda A, Bruckner T, Scheffler R. Global health workforce labor market projections for 2030. In: Policy research working paper 7790. World Bank group; 2016. https://documents1.worldbank.org/curated/en/546161470834083341/pdf/WPS7790.pdf. Accessed 2 Nov 2022.

  2. Boniol M, Kunjumen T, Nair TS, Siyam A, Campbell J, Diallo K. The Global Health workforce stock and distribution in 2020 and 2030: a threat to equity and ‘universal’ health coverage? BMJ Glob Health. 2022;7:e009316. https://doi.org/10.1136/bmjgh-2022-009316.


  3. GBD 2019 Diseases and Injuries Collaborators. Global burden of 369 diseases and injuries in 204 countries and territories, 1990–2019: a systematic analysis for the global burden of disease study 2019. Lancet. 2020;396:1204–22. https://doi.org/10.1016/S0140-6736(20)30925-9.


  4. World Health Organization. Health workforce requirements for universal health coverage and the sustainable development goals. Human Resour Health Obs. 2016;17 https://apps.who.int/iris/handle/10665/250330. Accessed 5 Nov 2022.

  5. World Health Organization. Global health observatory. In: The density of physicians (per 1,000 population). 2018. https://www.who.int/data/gho/indicator-metadata-registry/imr-details/3107. Accessed 5 Nov 2022.

  6. Schluger NW, Sherman CB, Binegdie A, Gebremariam T, Kebede D, Worku A, et al. Creating a specialist physician workforce in low-resource settings: reflections and lessons learned from the east African training initiative. BMJ Glob Health. 2018;3:e001041. https://doi.org/10.1136/bmjgh-2018-001041.


  7. Ministry of Health (MOH). The national human resources for health strategic plan for Ethiopia 2016–2025. 2016. https://pdf.usaid.gov/pdf_docs/PA00TWMW.pdf. Accessed 10 Dec 2022.

  8. Ministry of Education (MOE) and Education and Training Authority (ETA). Lists of universities and colleges with medical and other health science programs in Ethiopia. 2022. unpublished databases.


  9. World Health Organization. World health statistics 2022: monitoring health for the SDGs, sustainable development goals. Geneva: World Health Organization; 2022. https://www.who.int/data/gho/publications/world-health-statistics. Accessed 13 Jan 2023.

  10. Kelly CM, Vins H, Spicer JO, Mengistu BS, Wilson DR, Derbew M, et al. The rapid scale-up of medical education in Ethiopia: medical student experiences and the role of e-learning at Addis Ababa University. PLoS One. 2019;4(9):e0221989. https://doi.org/10.1371/journal.pone.0221989.


  11. Derbew M, Animut N, Talib ZM, Mehtsun S, Hamburger EK. Ethiopian medical schools’ rapid scale-up to support the government’s goal of universal coverage. Acad Med. 2014;89(8 Suppl):S40–4. https://doi.org/10.1097/ACM.0000000000000326.


  12. Mekasha A. Brief history of medical education in Ethiopia: teaching article. Ethiopia Med J. 2020;58(1). https://emjema.org/index.php/EMJ/article/view/1461/577. Accessed 4 Nov 2022.

  13. World Health Organization. Transforming and scaling up health professionals’ education and training. World Health Organization (WHO) Guidelines; 2013. https://www.who.int/publications/i/item/transforming-and-scaling-up-health professionals%E2%80%99-education-and-training.


  14. Jhpiego. Strengthening human resources for health project 2012-2019 project accomplishments. End of the project report June 2019. https://www.jhpiego.org/wp-content/uploads/2020/06/HRH-EOP-Report_6_12_2019.pdf_f03d9f1c-bfa0-42fb-82b3-204f0c9027a5.pdf.

  15. Morgan C, Teshome M, Crocker-Buque T, Bhudai R, Signh K, et al. Medical education in difficult circumstances: analysis of the experience of clinical medical students following the new innovative medical curriculum in Aksum, rural Ethiopia. BMC Med Educ. 2018;18:119. https://doi.org/10.1186/s12909-018-1199-x.


  16. Dejene D, Yigzaw T, Mengistu S, Wolde Z, Hiruy A, Woldemariam D, et al. Practice analysis of junior doctors in Ethiopia: implications for strengthening medical education, practice, and regulation. Glob Health Res Policy. 2018;3:31. https://doi.org/10.1186/s41256-018-0086-7.


  17. Stufflebeam DL, Coryn CL. Evaluation theory, models, and applications. 2nd ed. John Wiley & Sons; 2014. https://www.wiley.com/enus/Evaluation+Theory,+Models,+and+Applications,+2nd+Edition-p-9781118074053. Accessed 6 Jan 2023.

  18. Khan KZ, Ramachandran S, Gaunt K, Pushkar P. The objective structured clinical examination (OSCE): AMEE guide no. 81. Part I: a historical and theoretical perspective. Med Teach. 2013;35(9):e1437–e1446. https://doi.org/10.3109/0142159X.2013.818634.


  19. Gormley G. Summative OSCEs in undergraduate medical education. Ulster Med J. 2011;80:3. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3605523/. Accessed 7 Jan 2023.

  20. Ministry of Health Ethiopia. Scope of practice for health professionals in Ethiopia (draft). 2021. unpublished report.


  21. Association of American Medical Colleges (AAMC). Core entrustable professional activities for entering residency. Curriculum developers’ guide; 2014. https://store.aamc.org/downloadable/download/sample/sample_id/63. Accessed 10 June 2022.

  22. Rao X, Lia J, Wu H, Li Y, Xu X, Browning CJ, et al. The development of competency assessment standards for general practitioners in China. Front. Public Health. 2020;20(8):23. https://doi.org/10.3389/fpubh.2020.00023.


  23. Al Haqwi A, Kuntze J, van der Molen HT. Development of the clinical learning evaluation questionnaire for undergraduate clinical education: factor structure, validity, and reliability study. BMC Med Educ. 2014;14:44. https://doi.org/10.1186/1472-6920-14-44.


  24. Lazzara EH, Benishek LE, Dietz AS, Salas E, Adrainsen DJ. Eight critical factors in creating and implementing a successful simulation program. Jt Comm J Qual Patient Saf. 2014;40(1):21–9. https://doi.org/10.1016/S1553-7250(14)40003-5.


  25. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB. Simulation in healthcare education: A best evidence practical guide. AMEE Guide No. 82. Med Teach. 2013;35:10. https://doi.org/10.3109/0142159X.2013.818632.


  26. Chakrabartty SN. Scoring and analysis of Likert scales: few approaches. J Knowl Manag Inf Technol. 2014;1:2 https://www.researchgate.net/profile/SnChakrabartty/publication/321268871_Scoring_and_Analysis_of_Likert_Scale_Few_Approaches/links/5d5e623392851c37637173ba/Scoring-and-Analysis-of-Likert-Scale-Few-Approaches.pdf. Accessed 15 Jan 2023.

  27. Montgomery DC, Peck EA, Vining GG. Introduction to linear regression analysis. 5th ed. Hoboken, New Jersey: John Wiley & Sons, Inc.; 2012. https://ocd.lcwu.edu.pk/cfiles/Statistics/Stat503/IntroductiontoLinearRegressionAnalysisbyDouglasC.MontgomeryElizabethA.PeckG.GeoffreyViningz-lib.org.pdf. Accessed 21 Jan 2023.

  28. Friederichs H, Marschall B, Weissenstein A. Simulation-based mastery learning in medical students: skill retention at 1-year follow up. Med Teach. 2019;41(5):539–46. https://doi.org/10.1080/0142159X.2018.1503411.


  29. World Federation for Medical Education (WFME). Basic medical education WFME global standards for quality improvement: the 2020 revision. 2020. https://wfme.org/standards/bme/. Accessed 23 Jan 2023.

  30. Girma T, Asaminew T, Matthias S, Fischer MR, Jacobs F, Desalegn S, et al. Establishing medical schools in limited resource settings. Ethiop J Health Sci. 2016;26(3):277–84. https://doi.org/10.4314/ejhs.v26i3.10.


  31. Mengistu BS, Vins H, Kelly CM, MacGee DR, Spicer JO, Derbew M, et al. Student and faculty perceptions on the rapid scale-up of medical students in Ethiopia. BMC Med Educ. 2017;17:11. https://doi.org/10.1186/s12909-016-0849-0.


  32. Tadese M, Yeshaneh A, Baye G. Determinants of good academic performance among university students in Ethiopia: a cross-sectional study. BMC Med Educ. 2022;22:395. https://doi.org/10.1186/s12909-022-03461-0.


  33. Prediger S, Fürstenberg S, Berberat PO, Kadmon M, Harendza S. Interprofessional assessment of medical students’ competencies with an instrument suitable for physicians and nurses. BMC Med Educ. 2019;19:46. https://doi.org/10.1186/s12909-019-1473-6.


  34. Lewis TP, Roder-DeWan S, Malata A, Ndiaye Y, Kruk ME. Clinical performance among recent graduates in nine low- and middle-income countries. Trop Med Int Health. 2019;24:5. https://doi.org/10.1111/tmi.13224.


  35. Miles S, Kellett J, Leinster SJ. Medical graduates’ preparedness to practice: a comparison of undergraduate medical school training. BMC Med Educ. 2017;17:33. https://doi.org/10.1186/s12909-017-0859-6.


  36. Nyamtema A, Karuguru M, Mwangomale S, Monyo AF, Malongoza E, Kinemo P. Factors affecting the production of the competent health workforce in Tanzanian health training institutions: a cross-sectional study. BMC Med Educ. 2022;22:662. https://doi.org/10.1186/s12909-022-03719-7.


  37. Carr SE, Celenza A, Pudde IB, Lake F. Relationships between the academic performance of medical students and their workplace performance as junior doctors. BMC Med Educ. 2014;14:157. https://doi.org/10.1186/1472-6920-14-157.


  38. Tallentire VR, Smith SE, Wylde K, Cameron HS. Are medical graduates ready to face the challenges of foundation training? Postgrad Med J. 2011;87:1031. https://doi.org/10.1136/pgmj.2010.115659.


  39. Johnson P, Fogarty L, Fullerton J, Bluestone J, Drake M. An integrative review and evidence-based conceptual model of the essential components of pre-service education. Hum Resour Health. 2013;11:42. https://doi.org/10.1186/1478-4491-11-42.


  40. Scicluna HA, Grimm MC, Jones PD, Pilotto LS, MacNeil HP. Improving the transition from medical school to the internship – evaluation of preparation for the internship course. BMC Med Educ. 2014;14:23 http://www.biomedcentral.com/1472-6920/14/23. Accessed 3 Feb 2023.

  41. Editorials. Medical students need experience not just competence. BMJ. 2020;371 https://doi.org/10.1136/bmj.m4298.

  42. Sandra JS, Pratt Daniel PD, Glenn R. Competency is not enough: integrating identity formation into the medical education discourse. Acad Med. 2012;87:91. https://doi.org/10.1097/ACM.0b013e3182604968.


  43. Remmen R, Scherpbier A, Van der Vleuten C, Denekens J, Derese A, Hermann I, et al. Effectiveness of basic clinical skills training programs: a cross-sectional comparison of four medical schools. Med Educ. 2001;35(2):121–8. https://doi.org/10.1111/j.1365-2923.2001.00835.x.


  44. Zelesniack E, Oubaid V, Harendza S. Final-year medical students’ competence profiles according to the modified requirement tracking questionnaire. BMC Med Educ. 2021;21:319. https://doi.org/10.1186/s12909-021-02728-2.


  45. Malau-Aduli BS, Jones K, Alele F, et al. Readiness to enter the workforce: perceptions of health professions students at a regional Australian university. BMC Med Educ. 2022;22:89. https://doi.org/10.1186/s12909-022-03120-4.


  46. Verma A, Singhal A, Verma S, Vashist S. Assessment of competencies of medical students in conducting ‘normal delivery’ using various tools. World J Anemia. 2018;2(2):47–50. https://doi.org/10.5005/jp-journals-10065-0029.


  47. World Health Organization. Transforming and scaling up health professionals’ education and training. World Health Organization guidelines 2013; 2013. 9789241506502_eng.pdf (who. int).


  48. AlHaqwi AI, Van der Molen HT, Schmidt HG, Magzub ME. Determinant of effective clinical learning: A student and teacher perspective in Saudi Arabia. Educ Health Change. 2010;23:2. https://www.researchgate.net/publication/46307432. Accessed 19 Jan 2023.

  49. Dejene D, Ayelew F, Yigzaw T, Versluis M, Stekelenburg J, Mengistu M, et al. Qualitative study of clinical education for undergraduate medical students in a resource-limited setting. 2023. unpublished data.


  50. Pienaar M, Orton AM, Botma Y. A supportive clinical learning environment for undergraduate students in health sciences: an integrative review. Nurse Educ Today. 2022;119:105572. https://doi.org/10.1016/j.nedt.2022.105572.


  51. Sellberg M, Palmgren PJ, Möller R. A cross-sectional study of clinical learning environments across four undergraduate programs using the undergraduate clinical education environment measure. BMC Med Educ. 2021;21:258. https://doi.org/10.1186/s12909-021-02687-8. Accessed 1 May 2022.

  52. Gebru HF, Verstegen D. Assessing predictors of students’ academic performance in Ethiopian new medical schools: a concurrent mixed-method study. BMC Med Educ. 2023;23:448. https://doi.org/10.1186/s12909-023-04372-4.


  53. Ministry of Education and Education Strategic Center. Ethiopian Education Development Roadmap (2018–2030) an integrated executive summary. 2018. https://planipolis.iiep.unesco.org/sites/default/files/ressources/ethiopia_education_development_roadmap_2018-2030.pdf.


  54. Asemu YM, Yigzaw T, Desta FA, Scheele F, van der Akker T. Evaluating the effect of interventions for strengthening non-physician anesthetists’ education in Ethiopia: a pre- and post-evaluation study. BMC Med Educ. 2021;21:421. https://doi.org/10.1186/s12909-021-02851-0.



Acknowledgements

We would like to acknowledge the study participants and data collectors for their dedication and time in obtaining quality data. We also acknowledge the following experts who contributed to this study at various stages. Samuel Mengistu (MD, MPH, Ph.D.) supported the design and planning of the study and data collection. Yohannes Molla (BSc, MSc, FMER) and Mintwab Gelagay (BSc, MSc, FMER) contributed to data collection and logistics preparation for the study. Assegid Samuel (BSc, MSc) and Shelemo Shawula (MD, MPH) contributed to reviewing the study protocol, interpreting the results, and the early stages of the write-up.

Funding

This work was financially supported by Jhpiego Ethiopia under its USAID Health Workforce Improvement Program (HWIP). Funding for open access is not provided by the donor.

Author information


Contributions

Daniel Dejene, the lead author, contributed to the study concept and design, statistical analysis, results interpretation, and drafting and revision of the manuscript. Firew Ayalew, Tegbar Yigzaw, and Alemseged Woretaw contributed to the study concept and design, results interpretation, and manuscript revision. Marco Versluis and Jelle Stekelenburg contributed to the study concept and design, drafting, and revision of the manuscript. All authors read and approved the final version of the manuscript.

Authors’ information

DD (MD, MPH, FMER) is a Ph.D. student at the Department of Health Sciences, Global Health, University Medical Center Groningen, University of Groningen. He is the Deputy Chief of Party for the HWIP project supporting the quality of medical education, Jhpiego Ethiopia. FA (MSc, Ph.D.) is the senior research advisor for the HWIP project, Jhpiego Ethiopia. TY (MD, Ph.D., MPH, FMER) is the Chief of Party for the HWIP project, Jhpiego Ethiopia. AW (MD, MSc) is a senior Education and Training Advisor at Jhpiego Ethiopia. MV (MD, Ph.D.) and JS are professors at the Department of Health Sciences, Global Health, University Medical Centre Groningen/University of Groningen.

Corresponding author

Correspondence to Daniel Dejene.

Ethics declarations

Ethics approval and consent to participate

Ethical approval for the study was obtained from the Ethiopian Public Health Association and the Johns Hopkins Bloomberg School of Public Health Institutional Review Board (IRB number 21116). Permission to conduct the study was also obtained from the Ministry of Health (MOH) and the deans of the training institutions. Study participants provided informed oral consent, and measures were taken to protect autonomy and data confidentiality. All collected data were anonymized, handled, and stored in accordance with the tenets of the Declaration of Helsinki.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Dejene, D., Ayalew, F., Yigzaw, T. et al. Assessment of clinical competence of graduating medical students and associated factors in Ethiopia. BMC Med Educ 24, 17 (2024). https://doi.org/10.1186/s12909-023-04939-1

