
Improving the resident assessment process: application of App-based e-training platform and lean thinking

Abstract

Background

The assessment system for standardized resident training is crucial for developing competent doctors, but it is complex and therefore difficult to manage. The COVID-19 pandemic has further aggravated the difficulty of assessment. We therefore integrated lean thinking with an App-based e-training platform to improve the assessment process through Define–Measure–Analyze–Improve–Control (DMAIC) cycles, with the aim of avoiding unnecessary activities that generate waste.

Methods

Panels and online surveys were conducted in 2021–2022 to identify the main issues affecting resident assessment and their root causes within the waste framework. An online app was developed, and activities within the process were improved through brainstorming. Follow-up online surveys were used to evaluate the issues, satisfaction, and time spent on assessment when using the app.

Results

A total of 290 clinical educators in 36 departments responded to the survey, and 153 clinical educators used the online app for assessment. Unplanned delay or cancellation was defined as the main issue. Eleven leading causes accounted for 87.5% of the issues: examiner time conflict, student time conflict, insufficient examiners, supervisor time conflict, grade statistics, insufficient exam assistants, reporting results, material archiving, unfamiliarity with the process, uncooperative patients, and feedback. The median rate of unplanned delay or cancellation fell from 5% to 0% with use of the app (P < 0.001), and satisfaction increased (P < 0.001). The median time saved by the app across the whole assessment process was 60 (interquartile range 60–120) minutes.

Conclusions

Lean thinking integrated with an App-based e-training platform could optimize the process of resident assessment. This could reduce waste and promote teaching and learning in medical education.


Introduction

All medical students have to participate in resident training to become competent doctors. In China, medical education is a multi-track, long-term process that starts at the undergraduate level, and includes further stages at the graduate, doctoral, and post-doctoral levels. In 2013, to ensure the quality and homogeneity of medical staff, and improve the national quality of the health service as a whole, seven Chinese government ministries jointly launched the Guidelines for Standardized Resident Training (SRT) [1]. These guidelines require all clinicians with a bachelor’s degree or above to receive standardized training for residents. Based on the concept of synergy between clinical practice and college education, Peking University Health Science Center developed a training model for clinical medicine students known as the four-in-one track, which has been in place since 1997. It was the first in China to combine standardized training for four types of students: clinical medicine graduate students, eight-year undergraduate medical students, students with equivalent educational qualifications and residents. All four types of students need to receive the same level of training and reach the same level of competence in medicine. To develop competent [2] physicians in clinical medicine, standardized resident training in China has paid increasing attention to the process of assessment. This has promoted residents’ capacity to apply their knowledge and skills to actual clinical practice.

The process of standardized resident training is very complex. It involves students rotating through dozens of clinical departments and a wide range of people, such as clinical educators, hospital administrators, and patients. The overall process requires considerable time and energy for exam design, question setting, organization, evaluation, feedback, and filing [3]. New SRT assessment tools have been proposed to simplify the process and improve its quality [4,5,6]. During the COVID-19 pandemic, implementing the assessment process became more difficult because of the increased pressure on both clinical educators and students. Assessment is critical for standardized resident training [7]. However, the integrity and standardization of actual practice are often compromised by the complexity of the process, clinical work stress, and the time required.

Lean thinking is a management framework that originated in car manufacturing and aims to help practitioners improve efficiency and the quality of their work. Womack first proposed lean thinking as a management concept and methodology in 1991 [8]. Lean thinking is the strategic dimension of the lean concept [9] and focuses on optimizing processes, improving efficiency, and delivering continuous improvement across the whole organization by removing waste [10]. Building on the concept of waste put forward by Ohno [11], Shingo [12] identified the seven classical types of waste in manufacturing: overproduction, inventory, over-processing, motion, defects, waiting, and transportation. Under-used talent has since been identified as an eighth waste [13]. Beyond manufacturing, lean thinking has been widely applied and shown to be useful in a variety of fields, including educational settings [14, 15]. In higher education, further work has defined sub-wastes and elaborated on the eight wastes [16]. Recent years have seen an emerging use of lean thinking in medical education [17,18,19,20]. A philosophy and methodology for identifying and eliminating the waste generated by extensive interdepartmental activities and processes could allow universities and teaching hospitals to manage SRT assessment in an optimized way.

This study focused on streamlining the process of resident assessment by using an App-based e-training platform, drawing on the philosophy of lean thinking. The study is important because it explores how to integrate lean thinking with information technologies and sheds light on the value of adopting lean thinking for resident assessment, which has significant practical implications for medical education under the impact of COVID-19. It thereby enriches the literature by providing a more efficient management framework for resident assessment in teaching hospitals.

Methods

Establishment of an optimized SRT assessment process through the DMAIC cycle

This study followed the lean Six Sigma DMAIC methodology [21] to optimize the process of resident assessment using an App-based e-training platform, a mobile app called “XUEYIKU” developed by Peking University Third Hospital (Fig. 1). DMAIC refers to the five sequential phases of process improvement: Define, Measure, Analyze, Improve, and Control [22]. The SigmaGuide DMAIC tool (http://www.sigmalogic.de) was used to help implement the DMAIC cycle; it supports collecting, processing, and integrating the necessary information and provides tools and charts for each of the five phases.

Fig. 1 Chinese user interface of “XUEYIKU” presented on a mobile device

Define

Panels gathering feedback from students and teachers about resident assessment were held every month. Complaints and opinions were filtered and translated into requirements. It was confirmed that the key need shared by both clinical students and educators was to avoid unplanned delay or cancellation (UDC) of resident assessment. To integrate and clarify the scope of this study, the core process was mapped onto a Supplier–Input–Process–Output–Customer (SIPOC) diagram (Fig. 2). A multidisciplinary study team of administrators, clinical teachers, and evaluation experts was established. The main issue, or defect, was defined as UDC, which results in unnecessary or additional work. The rate of UDC was defined as the number of UDC cases divided by the total number of planned resident assessments per semester. Based on the baseline data, the target of this study was set as reducing the incidence of UDC to zero by March 2022.
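Written as a formula (a minimal restatement of the definition above; the worked numbers are illustrative only), the defect rate is:

$$\text{UDC rate} = \frac{\text{cases of UDC}}{\text{total planned resident assessments per semester}} \times 100\%$$

For example, 3 UDC cases among 60 planned assessments would give a UDC rate of 5%, matching the baseline median later reported in the Results.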

Fig. 2 SIPOC diagram of resident assessment (the process shown is the pre-study process)

Measure

In this project, the pre-study process started with an exam and ended with the filing of examination materials (Fig. 2). Factors causing delay or cancellation were analyzed across the process using the waste framework (Table 1). In total, 18 sub-wastes were identified through Input–Analyze brainstorming, with actual examples drawn from 26 field observations and 16 team meetings held since June 2021. Online surveys developed by the study team were conducted using random sampling in 36 clinical departments to identify the negative factors influencing resident assessment. We then compared the rate of UDC, satisfaction with the whole process of resident assessment (on a five-point Likert-type scale), and time savings before and after the introduction of an online app developed using information technology (see Improve).

Table 1 Wastes occurring in resident assessment

Analyze

The collected data on factors causing UDC were evaluated graphically. Eleven main root causes were responsible for 87.5% of the issues encountered. These were examiner time conflict, student time conflict, insufficient number of examiners, supervisor time conflict, grade statistics, insufficient number of exam assistants, reporting of results, material archiving, unfamiliarity with the process, uncooperative patients, and feedback (Fig. 3).

Fig. 3 Root causes of unplanned delay or cancellation (UDC) of resident assessment in a teaching hospital. The left vertical axis shows the frequency of each root cause, listed in descending order of counts from the left of the Pareto chart; the right vertical axis shows the cumulative percentage of root causes. The crossed dotted lines indicate that the eleven leading causes accounted for 87.5% of the issues
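As a minimal sketch of the Pareto logic behind Fig. 3 (using only the grouped cause counts reported later in the Improve section, not the per-cause raw data), the cumulative percentages can be computed as follows:

```python
# Pareto-analysis sketch for UDC root causes. The grouped counts below are the
# figures reported in the Improve section (out of 688 main-cause observations);
# Fig. 3 breaks these groups down into the eleven individual causes.
grouped_causes = [
    ("time conflicts (examiner/student/supervisor)", 281),
    ("grade statistics, reporting and feedback", 175),
    ("insufficient examiners and exam assistants", 127),
    ("material archiving", 39),
    ("unfamiliarity with the process", 36),
    ("uncooperative patients", 30),
]

total = sum(count for _, count in grouped_causes)  # 688
cumulative = 0
for name, count in sorted(grouped_causes, key=lambda c: c[1], reverse=True):
    cumulative += count
    print(f"{name:48s} {count:4d}  {100 * cumulative / total:5.1f}% cumulative")
```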

Improve

To find solutions for the root causes, we reviewed and analyzed the process map in a brainstorming session to identify information technology (IT)-supported ways of improving activities (Fig. 4). We established an App-based e-training platform, guided by lean thinking, to improve resident training assessment. Specifically, we used natural language processing (NLP) technology [23,24,25,26] to extract features and convert them into a normalized data structure. The department of education provided training on how to organize the resident assessment.

Fig. 4 The improved process for resident assessment, showing the value of IT. The whole process of resident assessment has been improved under the lean philosophy; each activity supported by the App-based e-training platform is annotated with a detailed description

As we previously reported [27], the NLP algorithm in the current study is mainly used after assessment and serves the feedback stage (Fig. 4). Taking a resident's inadequate grasp of the differential diagnosis of systemic lupus erythematosus in the exam as an example, the XUEYIKU app divides the text of the teacher's evaluation into words through Chinese word segmentation and extracts the keywords “systemic lupus erythematosus” and “differential diagnosis” using NLP semantic analysis. Abbreviation substitution, synonym replacement, and text-similarity calculation, techniques widely used in information retrieval, machine translation, and text mining, may also be applied depending on the teacher's actual evaluation. The student side of the app matches training tasks with the same theme in the test question bank according to the extracted keywords and sends them to the student for intensive learning.
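A minimal sketch of this keyword-extraction and task-matching step is shown below. It assumes a generic Chinese word-segmentation library (jieba), because the paper does not name the toolkit actually used, and the evaluation text and question bank are hypothetical.

```python
# -*- coding: utf-8 -*-
# Illustrative sketch of the feedback-stage keyword matching; the XUEYIKU
# implementation is not published, so this only mirrors the idea.
import jieba            # generic Chinese word-segmentation library (assumption)
import jieba.analyse

# Teacher's free-text comment: "the resident has an inadequate grasp of the
# differential diagnosis of systemic lupus erythematosus".
evaluation = "该住院医师对系统性红斑狼疮的鉴别诊断掌握不足"

words = jieba.lcut(evaluation)                             # 1. segment into words
keywords = jieba.analyse.extract_tags(evaluation, topK=3)  # 2. TF-IDF keywords
# expected to include '系统性红斑狼疮' (systemic lupus erythematosus)
# and '鉴别诊断' (differential diagnosis)

# 3. Match keywords against themes in a hypothetical question bank and push
#    the matching training tasks to the student side of the app.
question_bank = {
    "系统性红斑狼疮": ["SLE classification criteria quiz", "SLE vs. RA differential case"],
    "鉴别诊断": ["Differential diagnosis drill: connective tissue disease"],
}
tasks = [task for kw in keywords for task in question_bank.get(kw, [])]
print(tasks)
```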

The main root causes measured related to multiple categories of the waste framework (Table 1). Time conflicts of examiners, students, and supervisors together accounted for 40.8% (281/688) of the main causes; these led to waiting, over-processing, and motion. The new App-based e-training platform supports one-click notification, a stored educator database, and automatic matching against student rotation schedules. Functions to support grade statistics, reporting, and feedback were developed to reduce waiting and addressed 25.4% (175/688) of the main causes. Voice-to-text conversion was also developed to save time and enable richer feedback.
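Conceptually, the automatic schedule-matching function intersects examiner availability with the supervisor's and the resident's rotation calendars. The sketch below is an illustrative simplification with hypothetical calendars, not the platform's actual logic.

```python
# Illustrative schedule-matching sketch (not the XUEYIKU code): find dates on
# which every required examiner, the supervisor, and the resident are all free,
# so an exam can be booked without back-and-forth phone calls.
from datetime import date

def free_slots(availability: dict[str, set[date]], participants: list[str]) -> set[date]:
    """Return the dates on which all listed participants are available."""
    slots = availability[participants[0]].copy()
    for person in participants[1:]:
        slots &= availability[person]
    return slots

availability = {  # hypothetical calendars derived from the rotation schedule
    "examiner_A": {date(2022, 3, 1), date(2022, 3, 3), date(2022, 3, 7)},
    "examiner_B": {date(2022, 3, 3), date(2022, 3, 7)},
    "supervisor": {date(2022, 3, 3), date(2022, 3, 4), date(2022, 3, 7)},
    "resident":   {date(2022, 3, 3), date(2022, 3, 8)},
}

print(sorted(free_slots(availability, list(availability))))  # -> [2022-03-03]
```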

Insufficient examiners and exam assistants together accounted for 18.5% (127/688) of the main causes, reflecting under-used talent in resident assessment. The IT-supported online platform drew on both information and technology staff and optimized the whole process to assist the organization of assessments.

Archiving accounted for 5.7% (39/688) of the main causes. The app was designed to file all tracking and assessment forms, grades, and feedback, and has therefore helped to solve the inventory problem of material archiving. A standard operating procedure (SOP) and communication skills were both highlighted in the training to help reduce the problems of process unfamiliarity (5.2%, 36/688) and uncooperative patients (4.4%, 30/688). Step-by-step activities are also embedded in the app to guide completion of the whole process, as illustrated in the sketch below.
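A minimal sketch of the step-by-step guidance idea, assuming a simple linear stage model (the stage names are paraphrased from the process described in this paper, not taken from the app itself):

```python
# Illustrative sketch: assessments must pass through the stages in order, so no
# activity can be skipped or forgotten. Stage names are paraphrased assumptions.
from enum import IntEnum

class Stage(IntEnum):
    SCHEDULED = 0
    NOTIFIED = 1
    EXAM_HELD = 2
    GRADED = 3
    FEEDBACK_SENT = 4
    ARCHIVED = 5

class Assessment:
    def __init__(self) -> None:
        self.stage = Stage.SCHEDULED

    def advance(self, target: Stage) -> None:
        """Allow moving only to the immediately following stage."""
        if target != self.stage + 1:
            raise ValueError(f"cannot jump from {self.stage.name} to {target.name}")
        self.stage = target

a = Assessment()
a.advance(Stage.NOTIFIED)    # OK: next stage in sequence
# a.advance(Stage.ARCHIVED)  # would raise ValueError: stages must be completed in order
```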

Control

The study team continuously verified the improvements and benefits of the newly developed app and organized feedback panels and discussions every 2–4 weeks. An SOP including process controls was developed, implemented, and revised in light of field observations. Control charts were used to monitor the process, and departments with high UDC rates were prompted by the educational management committee to follow the SOP.
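One conventional way to set up such monitoring is a p-chart of the UDC proportion per monitoring period; the sketch below is a generic illustration with made-up counts, not the chart configuration actually used in the study.

```python
# Illustrative p-chart limits for monitoring the UDC rate (hypothetical counts).
# A department whose observed proportion exceeds the upper control limit would
# be prompted by the educational management committee to follow the SOP.
import math

def p_chart_limits(defects: list[int], samples: list[int]) -> tuple[float, float, float]:
    """Return (centre line, lower limit, upper limit) for a 3-sigma p-chart,
    using the average subgroup size."""
    p_bar = sum(defects) / sum(samples)   # overall UDC proportion
    n_bar = sum(samples) / len(samples)   # average assessments per period
    sigma = math.sqrt(p_bar * (1 - p_bar) / n_bar)
    return p_bar, max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)

udc_cases   = [2, 1, 0, 1, 0, 0]         # UDC cases per period (made up)
assessments = [25, 24, 26, 25, 24, 25]   # planned assessments per period (made up)
print(p_chart_limits(udc_cases, assessments))
```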

Evaluation of the process established

An online survey was conducted to investigate the effect of introducing the optimized SRT assessment process. We compared the rate of UDC and satisfaction before and after the use of the App-based e-training platform using the Wilcoxon matched-pairs signed-ranks test. All data were analyzed using Stata version 15.0. All tests were two-sided, and P < 0.05 was considered statistically significant.
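The analysis itself was run in Stata 15.0; as a hedged illustration only, an equivalent paired test in Python with made-up per-educator UDC rates would look like this:

```python
# Illustrative re-run of the paired comparison (the study used Stata 15.0).
# Each pair is one educator's UDC rate before vs. after the app was introduced;
# the values below are made up for demonstration.
from scipy.stats import wilcoxon

udc_before = [0.05, 0.15, 0.00, 0.10, 0.20, 0.05, 0.00, 0.10]
udc_after  = [0.00, 0.10, 0.00, 0.00, 0.10, 0.00, 0.00, 0.05]

stat, p_value = wilcoxon(udc_before, udc_after)  # matched-pairs signed-rank test
print(f"W = {stat:.1f}, two-sided P = {p_value:.3f}")
```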

Ethical review

This study was reviewed and approved by the Peking University Third Hospital Medical Science Research Ethics Committee (No. IRB00006761-M2022063). All methods were carried out in accordance with relevant guidelines and regulations. Informed consent was obtained from all participants.

Results

The SOP for resident training assessment was built around a self-developed mobile app called “XUEYIKU”. Quality control measures covered three areas: training for organizers and examiners, field supervision by the educational management committee, and real-time feedback data obtained through the examination system. The XUEYIKU app was officially launched in June 2020. By 2022, a total of 6392 medical students and 6616 teachers from 29 departments had registered online.

A total of 290 clinical educators in 36 departments responded to the online survey. Before the introduction of the app, all participants reported causes of UDC. In total, 153 participants used the app for resident assessments. The median rate of UDC before and after the introduction of the app was 5% (0%, 15%) and 0% (0%, 10%), respectively (P < 0.001) (Table 2). Satisfaction improved significantly after the introduction of the app (P < 0.001) (Table 3). The median time saved by the app was 60 (interquartile range 60–120) minutes. The study team also observed friendlier and more positive communication between clinical educators and students after the improvement of the resident assessment process.

Table 2 Comparison of rate of UDC before and after app introduction in various departments (N = 153)
Table 3 Comparison of satisfaction before and after app introduction in various departments (N = 153)

Discussion

Optimizing the process of resident assessment to provide continuous improvements in quality

COVID-19 has greatly influenced the field of education and has highlighted a profound need for advanced technology. In teaching hospitals, educators are also clinical doctors, and face-to-face assessment remains the dominant way for clinical educators to evaluate whether residents are competent. In this study, the pre-study process for resident assessment was optimized by eliminating the root causes of UDC, all of which waste time and resources and sap enthusiasm. A total of 149 resident assessments were held from June 2021 to January 2022. A 5% UDC rate means five to six cancelled examinations per semester. One cancellation involves at least three groups of people: clinical doctors, residents or students, and supervisors from the educational management committee. It generates schedule confusion, requires rearrangement of the examination site, and may result in emotional exhaustion for the organizer, who may already be under considerable pressure from both clinical and educational work. It is therefore necessary to optimize the complex process of resident assessment to provide continuous quality improvement and support the development of medical talent. The first step in optimizing any process is to eliminate or reduce the root causes of the main problem. We found 11 root causes, which could be classified as waiting, over-processing, inventory, motion, defects, and under-used talent using the waste framework of lean philosophy. The application of an App-based e-training platform, together with supportive training and the development of an SOP, helped to reduce and eliminate these root causes. UDCs were significantly reduced, especially in departments performing surgical operations, which often have less time to spend on clinical education and had compressed some activities, such as feedback, because of limited time. The app helped to optimize the whole process and enabled clinical doctors to complete the assessment process.

Integration of lean thinking and an App-based e-training platform to improve efficacy

Lean thinking is both a methodology and a philosophy, designed to build continuous quality improvement into the culture of an organization. By eliminating factors hindering the overall process of resident assessment, the combination of lean thinking and an App-based e-training platform in this study helped to build a more user-friendly and efficient process. Our work confirms the efficacy of integrating lean thinking with an App-based e-training platform and sheds light on how to improve a complex assessment process by preventing waste.

Using the waste framework, we were able to save time through multiple activities supported by the app. For instance, the function that sends notifications to everyone involved in the examination saved the time previously spent on phone calls and text messages. The function providing grade statistics and reports saved doctors from calculating or preparing those manually and avoided potential errors. Voice-to-text conversion could save time and enable more detailed feedback. The total time saved was a median of about 60 minutes, roughly the time a doctor spends on two laparoscopic appendectomies or four hysteroscopic procedures. The time saved could instead be spent improving teaching or assessment quality. Reduced time costs, process optimization, and improved satisfaction could all contribute to a more effective assessment process and, ultimately, more competent doctors.

Fostering a teaching and learning culture

The app helped clinical educators avoid missing activities in the assessment process, because the steps have to be completed sequentially; this reduces the need to remember steps and avoids confusion. The hospital's information technology staff have also been encouraged to contribute to medical education, and a culture of full participation in teaching by clinical educators, administrators, and technicians has gradually developed. This suggests that the introduction of lean thinking can increase satisfaction and enthusiasm in the management of medical education. It supports the organization's approach to quality, and once the culture has been built, a mutually reinforcing cycle of lean practice and quality improvement for both educators and students should develop.

Limitations

To the best of our knowledge, this is the first prospective study to integrate lean thinking with an App-based e-training platform to improve the process of resident assessment. However, this study had some limitations. First, it was based on a single institution. The hospital is representative of most teaching hospitals in China, because clinical medicine at Peking University plays a leading role in the field of medical education. Nevertheless, extrapolation of the findings might be limited by other factors, including financial investment in education, the level of information technology infrastructure, and the teaching culture of other hospitals. Second, there might be recall bias, because the UDC rate was reported by clinical educators; an error log is needed to record objective data during routine administration. Additionally, the COVID-19 pandemic might have compounded the negative impact of examination cancellations and influenced the causes leading to UDC.

Conclusions

The combined use of the DMAIC cycle framework and an App-based e-training platform had a significant positive effect on optimizing the process of resident assessment. This integration reflects the successful use of lean thinking in medical education and encourages a stronger teaching environment and culture for both students and clinical educators.

Availability of data and materials

All data generated or analysed during this study are included in this published article. Use of the data collection instruments requires a license, which may be obtained upon reasonable request and with the permission of the authors (contact email: yuanwenqing@bjmu.edu.cn).

References

  1. Huang S, Chen Q, Liu Y. Medical resident training in China. Int J Med Educ. 2018;9:108–10.

  2. Wagner JP, Lewis CE, Tillou A, Agopian VG, Quach C, Donahue TR, Hines OJ. Use of Entrustable Professional Activities in the Assessment of Surgical Resident Competency. JAMA Surg. 2018;153(4):335.

  3. Boulet JR, Durning SJ. What we measure … and what we should measure in medical education. Med Educ. 2019;53(1):86–94.

  4. Rekman J, Hamstra SJ, Dudek N, Wood T, Seabrook C, Gofton W. A New Instrument for Assessing Resident Competence in Surgical Clinic: The Ottawa Clinic Assessment Tool. J Surg Educ. 2016;73(4):575–82.

  5. Glarner CE, McDonald RJ, Smith AB, Leverson GE, Peyre S, Pugh CM, Greenberg CC, Greenberg JA, Foley EF. Utilizing a novel tool for the comprehensive assessment of resident operative performance. J Surg Educ. 2013;70(6):813–20.

  6. Van Heest AE, Agel J, Ames SE, Asghar FA, Harrast JJ, Marsh JL, Patt JC, Sterling RS, Peabody TD. Resident Surgical Skills Web-Based Evaluation: A Comparison of 2 Assessment Tools. J Bone Joint Surg Am. 2019;101(5):e18.

  7. Tudevdagva U. Structure-Oriented Evaluation: An Evaluation Approach for Complex Processes and Systems. Cham: Springer International Publishing; 2020.

  8. Wolf BM. The Machine That Changed the World. J Int Bus Stud. 1991;22(3):533–8.

  9. Hines P, Holweg M, Rich N. Learning to evolve. Int J Oper Prod Man. 2004;24(10):994–1011.

  10. Chiarini A. From Total Quality Control to Lean Six Sigma: Evolution of the Most Important Management Systems for the Excellence. Milano: Springer Milan; 2012.

  11. Ohno T. Toyota Production System: beyond Large-Scale Production. Cambridge, Massachusetts: Productivity Press; 1988.

  12. Shingo S. A Study of the Toyota Production System from an Industrial Engineering Viewpoint. Cambridge, Massachusetts: Productivity Press; 1989.

  13. Brito M, Ramos AL, Carneiro P, Gonçalves MA. The eighth waste: Non-utilized talent. In: Lean Manufacturing: Implementation, Opportunities and Challenges. NY, U.S.A.: Nova Science Publisher; 2019. p. 151–63.

  14. Kakouris A, Sfakianaki E, Tsioufis M. Lean thinking in lean times for education. Ann Oper Res. 2022;316(1):657–97.

  15. Klein LL, Tonetto MS, Avila LV, Moreira R. Management of lean waste in a public higher education institution. J Clean Prod. 2021;286:125386.

  16. Kazancoglu Y, Ozkan-Ozen YD. Lean in higher education. Qual Assur Educ. 2019;27(1):82–102.

  17. John N, Snider H, Edgerton L, Whalin L. Incorporation of lean methodology into pharmacy residency programs. Am J Health-Syst Ph. 2017;74(6):438–44.

  18. Aij KH, Simons FE, Widdershoven GAM, Visse M. Experiences of leaders in the implementation of Lean in a teaching hospital—barriers and facilitators in clinical practices: a qualitative study. BMJ Open. 2013;3(10):e3605.

  19. Kim CS, Lukela MP, Parekh VI, Mangrulkar RS, Del Valle J, Spahlinger DA, Billi JE. Teaching Internal Medicine Residents Quality Improvement and Patient Safety: A Lean Thinking Approach. Am J Med Qual. 2010;25(3):211–7.

  20. Shah NK, Emerick TD. Lean Six Sigma Methodology and the Future of Quality Improvement Education in Anesthesiology. Anesth Analg. 2021;133(3):811–5.

  21. Hutwelker R. Six Sigma Green Belt Certification Project: Identification, Implementation and Evaluation. Cham: Springer International Publishing AG; 2019.

  22. Jamil N, Gholami H, Saman MZM, Streimikiene D, Sharif S, Zakuan N. DMAIC-based approach to sustainable value stream mapping: towards a sustainable manufacturing system. Econ Res-Ekonomska Istraživanja. 2020;33(1):331–60.

  23. Chary M, Parikh S, Manini AF, Boyer EW, Radeos M. A Review of Natural Language Processing in Medical Education. West J Emerg Med. 2019;20(1):78–86.

  24. Hirschberg J, Manning CD. Advances in natural language processing. Science. 2015;349(6245):261–6.

  25. Kreimeyer K, Foster M, Pandey A, Arya N, Halford G, Jones SF, Forshee R, Walderhaug M, Botsis T. Natural language processing systems for capturing and standardizing unstructured clinical information: A systematic review. J Biomed Inform. 2017;73:14–29.

  26. Sarker A, Klein AZ, Mee J, Harik P, Gonzalez-Hernandez G. An interpretable natural language processing system for written medical examination assessment. J Biomed Inform. 2019;98:103268.

  27. Wang M, Sun Z, Jia M, Wang Y, Wang H, Zhu X, Chen L, Ji H. Intelligent virtual case learning system based on real medical records and natural language processing. BMC Med Inform Decis Mak. 2022;22(1):60. https://doi.org/10.1186/s12911-022-01797-7.


Acknowledgements

We thank the Information Management and Big Data Center of Peking University Third Hospital for its support.

Funding

This work was supported by the Beijing Quality Improvement Project of Standardized Training for Resident Physicians in 2021 (ZP 2021045) and by the Education Management Project of the Post-graduation Medical Education Group/Continuing Medical Education Group, Medical Education Branch of the Chinese Medical Association, in 2021 (21BY008).

Author information

Contributions

WQY: Conceptualization, Methodology, Data analysis, Investigation, Writing original draft, Writing review & editing, Visualization, Project administration. ZQL: Conceptualization, Methodology, Investigation, Writing review & editing, Visualization, Project administration. SXG: Conceptualization, Methodology, Data analysis, Investigation, Writing review & editing, Project administration, Funding acquisition. NS: Supervision, Methodology, Writing review & editing, Resources, Funding acquisition. JLH: Resources, Conceptualization. HLC: Resources, Conceptualization. SL: Resources. The author(s) read and approved the final manuscript.

Corresponding authors

Correspondence to Shixian Gu or Ning Shen.

Ethics declarations

Ethics approval and consent to participate

This study was reviewed and approved by the Peking University Third Hospital Medical Science Research Ethics Committee (No. IRB00006761-M2022063). All methods were carried out in accordance with relevant guidelines and regulations. Informed consent was obtained from all subjects.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Yuan, W., Li, Z., Han, J. et al. Improving the resident assessment process: application of App-based e-training platform and lean thinking. BMC Med Educ 23, 134 (2023). https://doi.org/10.1186/s12909-023-04118-2
