Development, implementation, and evaluation of entrustable professional activities (EPAs) for medical radiation technologists in Taiwan: a nationwide experience

Abstract

Background

Competency-based medical education (CBME) is an outcomes-oriented approach focused on developing competencies that translate into clinical practice. Entrustable professional activities (EPAs) bridge competency assessment and clinical performance by delineating essential day-to-day activities that can be entrusted to trainees. EPAs have been widely adopted internationally, but not yet implemented for medical radiation professionals in Taiwan.

Materials and methods

A nationwide consensus process engaged 97 experts in radiation technology education representing diagnostic radiography, radiation therapy, and nuclear medicine. Preliminary EPAs were developed through focus group discussion and a modified Delphi method. The validity of these EPAs was evaluated using the QUEPA and EQual tools.

Results

Through iterative consensus building, six core EPAs with 18 component observable practice activities (OPAs) in total were developed, encompassing routines specific to each radiation technology specialty. QUEPA and EQual questionnaire data verified that these EPAs were valid and of high quality for clinical teaching and evaluation.

Conclusion

The consensus development of tailored EPAs enables rigorous competency assessment during medical radiation technology education in Taiwan. Further expansion of EPAs and training of clinical staff could potentially enhance care quality by producing competent professionals.

Introduction

Competency-based medical education (CBME) is an outcome-oriented instructional approach focused on developing and measuring specific competencies that translate into standard professional practices in daily clinical routines [1, 2]. The cornerstone of CBME is competency-based assessment and improvement of trainees’ performance through reliable curricular implementation [3]. CBME has gained widespread advocacy in recent years after emerging in North America in the early 2000s, as the paradigm has shifted from a time-based model to a learner-centered, outcome-driven framework [4]. Numerous educational oversight bodies have instituted competency frameworks, including the Accreditation Council for Graduate Medical Education’s (ACGME) Core Competencies in 2002 [5] and the CanMEDS Competency Framework by the Royal College of Physicians and Surgeons of Canada (RCPSC) in 2005 [6]. The ACGME delineated six general competencies: patient care, medical knowledge, interpersonal and communication skills, professionalism, practice-based learning and improvement, and systems-based practice [7]. CBME enables postgraduate clinicians to develop multifaceted abilities, including applied knowledge, attitudes, and skills for clinical reasoning and interprofessional collaboration. Thus, integration of CBME into current medical curricula is imperative.

The concept of entrustable professional activities (EPAs) was first proposed by Ten Cate in 2005 as an implementation strategy for competency-based medical education (CBME), combining patient care with trainee development [8]. EPAs are defined as “units of professional practice consisting of responsibilities and tasks that supervisors can entrust to trainees once they have demonstrated satisfactory competence” [9]. In practice, EPAs often comprise a series of tasks that focus the evaluation and performance management process [10], serving as a bridge between clinical practice and competency assessment. EPAs have since been widely adopted in medical education because they delineate essential day-to-day clinical practices. To ensure adequate training outcomes, EPA-based assessments evaluate the specific knowledge, skills, and attitudes trainees need for entrustment with core activities [11].

Adoption of EPAs allows educators to detect trainees’ deficiencies and needs in medical education, leading to their widespread integration globally [12]. EPAs have also been implemented across numerous residency and physician training programs, including family medicine [13], orthopedic surgery [10], internal medicine [14], emergency medicine [15], anesthesiology [16], pediatrics [17], and psychiatry [18]. More recently, EPA-based education has gained traction among other medical professionals such as nurses [12], pharmacists [19], and physical therapists [20]. In Taiwan, undergraduate students acquire knowledge and skills in medical imaging and radiological science through college education; postgraduate curricula, however, depend on specific health care settings, including diagnostic radiography, radiation oncology, and nuclear medicine. Based on recent literature searches, there are no reports of EPA implementation in postgraduate training programs for diagnostic radiographers and radiation therapists. Our study objective was therefore to establish EPAs tailored for radiology departments by achieving consensus among clinical educators, with the EPAs designed for integration and supervision within training curricula. By clarifying the principles and key elements of EPAs and providing the proper tools, our project enables clinical teachers to flexibly assess trainees’ readiness to act unsupervised, with the ultimate goal of enhancing health services through competent staff trained using the EPA approach.

Materials and methods

We conducted a nationwide consensus process to develop entrustable professional activities (EPAs) for radiological professionals in Taiwan. The Taiwan Association of Medical Radiation Technologists (TAMRT) collaborated with the National Taiwan University Hospital (NTUH) to engage experts from radiology departments across the country. A total of 97 educators were invited, representing diagnostic radiographers, radiation therapists, and nuclear medicine technologists from various clinical institutions. These experts had extensive teaching and curriculum design experience in postgraduate training programs and were involved in developing the preliminary EPAs through focus group discussion and modified Delphi consensus procedures. The preliminary EPAs were then evaluated through surveys and finalized via further expert consensus meetings to establish agreement on the developed EPAs and their use in clinical teaching and assessment. Ultimately, the goal was to improve programmatic assessment and care quality for medical radiation technology trainees in Taiwan.

Supervision levels and OPAs as components of EPAs

EPAs are observable, measurable, work-based units of activity, and many entrustment scales have been reported in related studies [8]. The crucial question for an EPA is: “Do I trust this trainee to accomplish this clinical routine?” Accordingly, performance on each EPA is converted into a level of supervision, such as Level 1 to Level 5, where Level 1 is observation only and Level 5 is providing supervision to other learners (Table 1).

Table 1 Entrustment levels of EPAs

In our model we distinguished three levels of specification for EPAs. The actual EPAs, limited in number, were each specified in three observable practice activities (OPAs), and each OPA was further described in three to five specifications detailing the activities for which radiation technologists are to be qualified. This adds OPAs between the EPA title and its specifications, as recommended in AMEE Guide 140 [21].

Observable practice activities (OPAs) [22] were introduced by Warm et al. in 2014 as a way to assess entrustment with small, specific tasks at any time during EPA development [23]. Given the nature of radiation technologists’ clinical tasks and practice models, workplace observations are typically divided into stages such as before, during, and after execution of the actual procedure. OPAs were deemed useful for examining these smaller elements, enabling workplace observation of different phases within the same EPA. OPAs could therefore contribute to the final entrustment decision by aiding the evaluation of each comprehensive EPA. In this study, the expert panels were asked to define every EPA and OPA by consensus procedure.

Consensus-building procedure

A consensus-building procedure utilizing focus group discussion (FGD) and a modified Delphi method was employed to develop topics and content for each EPA. These approaches are generally recommended for consensus-building [24] and aim to achieve agreement and convergence of ideas on a given issue through iterative rounds of inquiry and feedback. FGD is an interactive discussion format that allows all participants the opportunity to express their perspectives for consideration by the group [25]. The following steps were undertaken to develop consensus:

Preliminary EPA development

Prior research has demonstrated that developing valid EPAs requires engagement of participants with expertise in the relevant clinical domain and assessment methodology [26]. We recruited a panel of 97 experts across various clinical organizations to design radiological EPAs tailored to key specialty areas, including diagnostic radiographers (DRs), radiation therapists (RTTs), and nuclear medicine technologists (NMTs). The panel experts surveyed relevant literature, including CBME resources [27], existing EPA frameworks in other medical professions [28], and analyzed current training programs. Drawing from this background, they generated a list of critical clinical routines needing entrustment. Following established EPA guidelines [8], the panel formulated titles and descriptions for each proposed EPA. Through this consensus process, a total of 6 preliminary EPAs encompassing 18 observable practice activities (OPAs) were formulated for further discussion and validation.

Review and refinement

After developing the initial EPAs, we distributed them to the expert panel members to review whether they encompassed the essential clinical skills and adequately covered the necessary attributes. The panel experts were instructed not merely to read but to closely review the draft EPAs to ensure they contained the requisite attitudes, knowledge, and skills expected of clinical radiological staff. The experts provided feedback by choosing to amend, delete, or retain each proposed EPA, and could also suggest modifications or additions to the EPAs or OPAs. All feedback was compiled and discussed at the subsequent expert panel meeting. This review process enabled refinement of the EPAs based on the panel’s critiques and recommendations.

Final EPA consensus meeting

The focus group discussion (FGD) and modified Delphi method were utilized to conduct the consensus process under the oversight of the Joint Commission of Taiwan (JCT) and TAMRT. A final expert panel meeting was convened in October 2020, including invited DRs, RTTs, and NMTs educators to participate in the EPA consensus building. Prior to the meeting, the preliminary EPAs were distributed to the experts to review the contents and provide input.

During the FGD process, the moderator (C-W Y), a physician educator experienced in consensus methodology and CBME, explained the principles and execution of FGD to achieve a shared mental model among the participants, and then applied a standard consensus process to every proposal while moderating the consensus meeting. The moderator reviewed each EPA component generated through FGD and asked for any suggestions for change (i.e., adding, removing, and amending proposals). Each proposal for change had to be seconded by an additional expert before being further discussed, voted on, and documented. Modifications to the drafts were incorporated when they received over 80% of expert votes via an instant response system (IRS), to maintain high consensus.

After the FGD process, the modified Delphi method was applied to explore the relevance of each EPA, OPA, and specification. The EPAs, OPAs, and specifications proposed during the FGD process were further debated and graded in a real-time IRS vote to examine the overall level of agreement on the relevance of each item. During this confirmatory process, experts reviewed the description of each EPA, OPA, and specification and rated its relevance to the training of radiation technologists. To ensure a high level of concordance among the experts, each item had to meet a quartile deviation of ≤ 0.6 and an average score of ≥ 4 to be included in the final decision. Items that did not meet the concordance criteria were left for further debate and review in the next Delphi round.
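The inclusion rule above can be sketched computationally. The following is a minimal illustration, not the study's actual analysis code; it assumes the common definition of quartile deviation as half the interquartile range, (Q3 − Q1) / 2, applied to the experts' 5-point relevance ratings.

```python
import statistics

def meets_concordance(ratings, max_qd=0.6, min_mean=4.0):
    """Check the study's Delphi inclusion criteria for one item:
    quartile deviation (Q3 - Q1) / 2 <= 0.6 and mean rating >= 4."""
    q1, _, q3 = statistics.quantiles(ratings, n=4)  # quartiles of the ratings
    quartile_deviation = (q3 - q1) / 2
    return quartile_deviation <= max_qd and statistics.mean(ratings) >= min_mean

# Hypothetical 5-point relevance ratings from an expert panel
print(meets_concordance([5, 4, 5, 4, 4, 5, 4, 5]))  # high agreement -> True
print(meets_concordance([5, 2, 4, 3, 5, 1, 4, 2]))  # dispersed ratings -> False
```

An item failing either check would be returned to the next Delphi round, as described above.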

EPA quality assessment

There are assessment tools to evaluate the quality of EPAs, including the Quality of Entrustable Professional Activities tool (QUEPA) [29] and the EPA Quality tool EQual [30]. We utilized these two instruments with 5-point Likert scales to assess the quality of the six developed EPAs specific to each radiology discipline in a faculty development activity. QUEPA and EQual were employed to gauge how well the EPAs met evaluable criteria and benchmarks. Additionally, the EQual questionnaire provided insight into the attendees’ level of understanding. Prior research has established an average cutoff score of 4.07 for EPA quality [31, 32]. Thus, any EPA domain in the EQual questionnaire with a mean score below 4.07 would be considered insufficient and likely necessitate revision. This systematic EPA quality assessment ensured the EPAs were robust and meaningful for evaluating clinical competencies.

Data analysis

The data from the FGD and Delphi consensus processes were collected and analyzed. Questionnaires of QUEPA and EQual were administered to the participants during the faculty development activities. The participants rated their level of agreement on a 5-point Likert scale (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, and 5 = strongly agree). For each survey, we calculated the mean score and standard deviation (M ± SD) for the ratings.
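As a minimal sketch of this summary step (not the study's actual analysis code), the following computes the mean ± SD for one item's Likert ratings and flags it against the published 4.07 EQual cutoff; the ratings shown are hypothetical.

```python
import statistics

EQUAL_CUTOFF = 4.07  # published quality threshold for EQual scores [31, 32]

def summarize_item(ratings, cutoff=EQUAL_CUTOFF):
    """Summarize 5-point Likert ratings as mean and sample SD,
    flagging items whose mean falls below the quality cutoff."""
    mean = statistics.mean(ratings)
    sd = statistics.stdev(ratings)  # sample standard deviation
    return {"mean": round(mean, 2), "sd": round(sd, 2),
            "needs_revision": mean < cutoff}

# Hypothetical EQual ratings for one EPA domain from eight faculty members
print(summarize_item([5, 4, 5, 4, 5, 5, 4, 4]))
# -> {'mean': 4.5, 'sd': 0.53, 'needs_revision': False}
```

The same M ± SD summary applies to the QUEPA responses, which used an identical 5-point agreement scale.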

Results

The consensus meeting was conducted in October 2020 by convening a national medical education expert panel comprising 97 members (Table 2). The panel included representation from the Departments of Medical Imaging, Radiation Oncology, and Nuclear Medicine. During the FGD process, a total of 48 proposals for change (4 additions, 39 amendments, and 5 removals) were discussed, voted on to reach consensus, and documented. In the first Delphi round, all items met the criteria for concordance and were included in the final decisions. Each department planned and reached consensus on EPAs tailored to its distinct professional domain; in total, 6 EPAs, 18 OPAs, and 85 specifications were decided. The six resulting EPAs are presented in Table 3.

Table 2 Participant demographics of the consensus meeting
Table 3 Resulting EPAs, OPAs and their specifications

Following the consensus meeting, we conducted a faculty development activity. During this event, 192 clinical educators (119 from the Medical Imaging Department, 35 from the Radiation Oncology Department, and 38 from the Nuclear Medicine Department) evaluated the EPAs using the EQual and QUEPA questionnaires to determine whether the six EPAs developed through expert consensus [33] demonstrated validity. During the session, faculty members from each department assessed the two EPAs within their respective fields.

The overall QUEPA score data are presented in Table 4. A total of 360 responses were collected, comprising 222 from the Medical Imaging Department, 67 from the Radiation Oncology Department, and 71 from the Nuclear Medicine Department. These data provided quantitative evidence that the established EPAs had validity for training medical radiation professionals.

Table 4 QUEPA score for EPAs of different sub-specialties

As for the EQual questionnaire, 364 responses were obtained, comprising 224 from the Medical Imaging Department, 64 from the Radiation Oncology Department, and 76 from the Nuclear Medicine Department. In the Medical Imaging Department, the overall average score was 4.54 ± 0.63 for the general diagnostic imaging EPAs and 4.48 ± 0.65 for the CT imaging EPAs. In the Radiation Oncology Department, the overall average score was 4.43 ± 0.63 for the CT simulation EPAs and 4.56 ± 0.54 for the external beam radiotherapy EPAs. Finally, in the Nuclear Medicine Department, the overall average score was 4.53 ± 0.57 for the positron emission tomography (PET) imaging EPAs and 4.69 ± 0.49 for the single-photon emission computed tomography (SPECT) imaging EPAs. All item average scores exceeded 4.07 (Table 5). This indicates that the final EPAs developed through consensus are of high quality and can serve as evaluation guidelines for postgraduate clinical radiation technologist training nationally.

Table 5 EQual scores for EPAs of different sub-specialties

Discussion

The paradigm of medical education is shifting towards competency-based medical education (CBME). Unlike traditional models, CBME is learner-centric and integrates diverse assessments [34, 35]. This philosophical shift has informed pre- and post-graduate curricula internationally [36, 37]. In Taiwan, postgraduate programs have progressively adopted competency-driven approaches with clinical observation. The complexity of clinical environments necessitates that trainees across disciplines develop the requisite knowledge, skills, and attitudes to appropriately manage situations through immersive daily practice and assimilation of professional competencies and responsibilities. Concurrently, clinical educators must observe learners and evaluate their performance and progress multidimensionally. Core competency blueprints [38] and milestone mapping [39, 40] are integral to actualizing CBME. However, focused trainee evaluation requires unified criteria and objectives. EPAs offer an objective tool to appraise competence, enabling professions to delineate and select their field’s most critical, representative clinical skills for guided development.

Integrating EPAs into postgraduate medical radiation technology education presents difficulties. Prior studies have largely focused on medicine, including general surgery [41], pediatric cardiology, and dentistry [42, 43], and seldom on allied health fields such as pharmacy [44] or nursing [45]. This study pioneers EPA implementation as a clinical training assessment across the medical radiation technology specialties: diagnostic radiography, radiation therapy, and nuclear medicine.

Medical radiation technologists’ scopes of work vary across medical institutions, shaped by the clinical department, the scale of the institution, and differences in execution between units. Given the time pressure, the complexity of the training content, and the differences between specialties, developing or changing the current evaluation method for trainees requires considerable time and manpower. Therefore, at the beginning of the project, we invited course leaders from medical units of all levels across the country to attend courses and discussion meetings to familiarize them with the core ideas of CBME and the EPA evaluation method. We then used the FGD and modified Delphi method to conduct several consensus processes [25, 46], including paper-based data discussions and face-to-face meetings; these two methods provide anonymous and non-hierarchical discussion patterns. Finally, we reached consensus on six core EPA tasks, which each medical unit can use as the scope and teaching content for evaluating new medical radiation technologists.

EPAs focus on routine clinical behaviors or processes performed every day, or on high-risk, error-prone clinical activities. When an assessor has doubts about the completeness of a trainee’s task, or lacks confidence in a certain clinical skill, this can be reflected in the entrustment-supervision (ES) level. Through EPA evaluation, the course planner can also determine whether a trainee needs an extended training period or an adjusted course to achieve the ability being evaluated. EPAs provide a more intuitive assessment of routine medical behaviors. Ten Cate recommends dividing supervision into five levels [8]: observation (Level 1), direct supervision (Level 2), indirect supervision with direct help available on call at any time (Level 3), no need for supervision (Level 4), and supervising others (Level 5). Considering the complexity and variability of each radiology profession, we applied the Chen-modified ES scale, which divides Level 2 (direct supervision) into joint completion (Level 2a) and timely assistance (Level 2b), and Level 3 (indirect supervision) into confirmation of all items (Level 3a), confirmation of key items (Level 3b), and no confirmation needed (Level 3c) [47]. In this way, clinical teachers can more accurately assign trainees the corresponding trust level.
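The Chen-modified scale above can be represented as a simple lookup for recording entrustment decisions. This is an illustrative sketch only; the level labels paraphrase the text and are not official wording, and the record structure is hypothetical.

```python
# Chen-modified entrustment-supervision (ES) scale as described in the text [47];
# labels are paraphrased summaries, not the official scale wording.
ES_SCALE = {
    "1":  "Observation only",
    "2a": "Direct supervision: joint completion with supervisor",
    "2b": "Direct supervision: timely assistance from supervisor",
    "3a": "Indirect supervision: all items confirmed",
    "3b": "Indirect supervision: key items confirmed",
    "3c": "Indirect supervision: no items need confirmation",
    "4":  "Unsupervised practice",
    "5":  "Supervises others",
}

def record_entrustment(trainee, epa, level):
    """Return a simple assessment record; rejects unknown ES levels."""
    if level not in ES_SCALE:
        raise ValueError(f"Unknown ES level: {level}")
    return {"trainee": trainee, "epa": epa, "level": level,
            "description": ES_SCALE[level]}

# Hypothetical entrustment decision for one trainee on one EPA
print(record_entrustment("trainee-01", "CT imaging", "3b"))
```

Splitting Levels 2 and 3 into sub-levels gives clinical teachers finer gradations without changing the overall 1-to-5 structure of the scale.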

This study still has many possible applications to explore and extend. As noted, the scale and scope of work of medical radiation technologists vary greatly across medical units in Taiwan. Although we invited the course directors and course leaders of key medical units at all levels across the country, including medical centers, regional hospitals, and district hospitals, their number is far below the total number of medical institutions nationwide. Through several discussion meetings we agreed on six EPA tasks, but the evaluation content and perspectives embodied in these EPAs may not suit medical units at every level. Each unit therefore needs to report back to the responsible medical radiation technology organization so the content can be revised. In addition, the continuing education of clinical medical radiation technologists after graduation may span multiple domains. Our team has initially defined six EPA tasks, but many further EPAs remain to be developed and promoted through consensus processes, and the existing EPAs require careful ongoing planning; this is a direction for future work.

The Accreditation Council for Graduate Medical Education (ACGME) requires clinical competency committees (CCCs) to review trainees’ clinical performance over a defined period. Team consensus is a necessary process in EPA development to determine trainees’ clinical training performance and status [48, 49]. CCCs can determine training levels from trainees’ overall performance, provide feedback to course directors so that follow-up adjustments can be made, and make summative entrustment decisions about EPAs. We will also move towards the CCC model in the future, so that the training of medical radiation technologists can be more personalized and complete.
We are confident in the results of this study because it represents a substantial change, with far-reaching impact, in the training model for medical radiation technologists in Taiwan. Many EPA-related research directions remain to be explored, and our team will continue to develop evaluation tools for the field of medical radiation in Taiwan.

Data availability

The datasets generated and analyzed during the current study are not publicly available in order to ensure the privacy of participants. However, they are available from the corresponding author upon reasonable request.

References

  1. Richardson D, Kinnear B, Hauer KE, Turner TL, Warm EJ, Hall AK, Ross S, Thoma B, Van Melle E, Collaborators I. Growth mindset in competency-based medical education. Med Teach. 2021;43(7):751–7.

    Article  Google Scholar 

  2. Butani L, Plant J, Barone MA, Dallaghan GLB. Entrustable Professional Activity-based assessments in Undergraduate Medical Education: a Survey of Pediatric Educators. Acad Pediatr. 2021;21(5):907–11.

    Article  Google Scholar 

  3. Harris P, Snell L, Talbot M, Harden RM. Competency-based medical education: implications for undergraduate programs. Med Teach. 2010;32(8):646–50.

    Article  Google Scholar 

  4. ten Cate O. Entrustability of professional activities and competency-based training. Med Educ. 2005;39(12):1176–7.

    Article  Google Scholar 

  5. Joyner BD. An historical review of graduate medical education and a protocol of Accreditation Council for Graduate Medical Education compliance. J Urol. 2004;172(1):34–9.

    Article  Google Scholar 

  6. Rourke J, Frank JR. Implementing the CanMEDS physician roles in rural specialist education: the multi-specialty community training network. Educ Health (Abingdon). 2005;18(3):368–78.

    Article  Google Scholar 

  7. Kavic MS. Competency and the six core competencies. JSLS. 2002;6(2):95–7.

    Google Scholar 

  8. Ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013;5(1):157–8.

    Article  Google Scholar 

  9. Shorey S, Lau TC, Lau ST, Ang E. Entrustable professional activities in health care education: a scoping review. Med Educ. 2019;53(8):766–77.

    Article  Google Scholar 

  10. Watson A, Leroux T, Ogilvie-Harris D, Nousiainen M, Ferguson PC, Murnahan L, Dwyer T. Entrustable Professional activities in Orthopaedics. JB JS Open Access 2021, 6(2).

  11. Mulder H, Ten Cate O, Daalder R, Berkvens J. Building a competency-based workplace curriculum around entrustable professional activities: the case of physician assistant training. Med Teach. 2010;32(10):e453–459.

    Article  Google Scholar 

  12. Al-Moteri M. Entrustable professional activities in nursing: a concept analysis. Int J Nurs Sci. 2020;7(3):277–84.

    Google Scholar 

  13. Schultz K, Griffiths J, Lacasse M. The application of Entrustable Professional activities to inform competency decisions in a Family Medicine Residency Program. Acad Med. 2015;90(7):888–97.

    Article  Google Scholar 

  14. Caverzagie KJ, Cooney TG, Hemmer PA, Berkowitz L. The development of entrustable professional activities for internal medicine residency training: a report from the Education Redesign Committee of the Alliance for Academic Internal Medicine. Acad Med. 2015;90(4):479–84.

    Article  Google Scholar 

  15. Hart D, Franzen D, Beeson M, Bhat R, Kulkarni M, Thibodeau L, Weizberg M, Promes S. Integration of Entrustable Professional activities with the milestones for Emergency Medicine residents. West J Emerg Med. 2019;20(1):35–42.

    Article  Google Scholar 

  16. Woodworth GE, Marty AP, Tanaka PP, Ambardekar AP, Chen F, Duncan MJ, Fromer IR, Hallman MR, Klesius LL, Ladlie BL, et al. Development and Pilot Testing of Entrustable Professional activities for US Anesthesiology Residency Training. Anesth Analg. 2021;132(6):1579–91.

    Article  Google Scholar 

  17. Pitts S, Schwartz A, Carraccio CL, Herman BE, Mahan JD, Sauer CG, Dammann CEL, Aye T, Myers AL, Weiss PG, et al. Fellow entrustment for the Common Pediatric Subspecialty Entrustable Professional activities Across subspecialties. Acad Pediatr; 2021.

  18. Boyce P, Spratt C, Davies M, McEvoy P. Using entrustable professional activities to guide curriculum development in psychiatry training. BMC Med Educ. 2011;11:96.

    Article  Google Scholar 

  19. Marshall LL, Kinsey J, Nykamp D, Momary K. Evaluating practice readiness of Advanced Pharmacy Practice Experience Students using the Core Entrustable Professional activities. Am J Pharm Educ. 2020;84(10):ajpe7853.

    Article  Google Scholar 

  20. Zainuldin R, Tan HY. Development of entrustable professional activities for a physiotherapy undergraduate programme in Singapore. Physiotherapy. 2021;112:64–71.

    Article  Google Scholar 

  21. Ten Cate O, Taylor DR. The recommended description of an entrustable professional activity: AMEE Guide No. 140. Med Teach. 2021;43(10):1106–14.

    Article  Google Scholar 

  22. Anscher MS, Chang MG, Moghanaki D, Rosu M, Mikkelsen RB, Holdford D, Skinner V, Grob BM, Sanyal A, Wang A, et al. Lovastatin may reduce the risk of erectile dysfunction following radiation therapy for prostate cancer. Acta Oncol. 2016;55(12):1500–2.

    Article  Google Scholar 

  23. Warm EJ, Mathis BR, Held JD, Pai S, Tolentino J, Ashbrook L, Lee CK, Lee D, Wood S, Fichtenbaum CJ, et al. Entrustment and mapping of observable practice activities for resident assessment. J Gen Intern Med. 2014;29(8):1177–82.

    Article  Google Scholar 

  24. Jones J, Hunter D. Consensus methods for medical and health services research. BMJ. 1995;311(7001):376–80.

    Article  Google Scholar 

  25. Kitzinger J. Qualitative research: introducing focus groups. BMJ. 1995;311:299–302.

    Article  Google Scholar 

  26. Ten Cate O, Chen HC, Hoff RG, Peters H, Bok H, van der Schaaf M. Curriculum development for the workplace using Entrustable Professional activities (EPAs): AMEE Guide No. 99. Med Teach. 2015;37(11):983–1002.

    Article  Google Scholar 

  27. Ahn D. Current trend of accreditation within medical education. J Educ Eval Health Prof. 2020;17:30.

    Article  Google Scholar 

  28. White K, Qualtieri J, Courville EL, Beck RC, Alobeid B, Czuchlewski DR, Teruya-Feldstein J, Soma LA, Prakash S, Gratzinger D. Entrustable Professional activities in Hematopathology Pathology Fellowship Training: Consensus design and proposal. Acad Pathol. 2021;8:2374289521990823.

    Article  Google Scholar 

  29. Post JA, Wittich CM, Thomas KG, Dupras DM, Halvorsen AJ, Mandrekar JN, Oxentenko AS, Beckman TJ. Rating the quality of Entrustable Professional activities: Content Validation and associations with the clinical context. J Gen Intern Med. 2016;31(5):518–23.

    Article  Google Scholar 

  30. Meyer EG, Taylor DR, Uijtdehaage S, Durning SJ. EQual Rubric Evaluation of the Association of American Medical Colleges’ Core Entrustable Professional activities for entering Residency. Acad Med. 2020;95(11):1755–62.

    Article  Google Scholar 

  31. Elmes AT, Tekian A, Jarrett JB. The need for Quality Assessment of Entrustable Professional activities in Pharmacy Education. Am J Pharm Educ. 2023;87(2):ajpe9039.

    Article  Google Scholar 

  32. Taylor DR, Park YS, Egan R, Chan MK, Karpinski J, Touchie C, Snell LS, Tekian A. EQual, a Novel Rubric to Evaluate Entrustable Professional Activities for Quality and Structure. Acad Med 2017, 92(11S Association of American Medical Colleges Learn Serve Lead: Proceedings of the 56th Annual Research in Medical Education Sessions):S110-S117.

  33. Wen SYT, Tu CY, Chen CY, Lin CH, Yamg YB, Huang BH, Yang KM. Establishing Entrustable Professional activities of Taiwan Radiation technologist. J Healthc Qual. 2022;16(3):64–70.

    Google Scholar 

  34. Hamza DM, Ross S, Oandasan I. Process and outcome evaluation of a CBME intervention guided by program theory. J Eval Clin Pract. 2020;26(4):1096–104.

    Article  Google Scholar 

  35. Ten Cate O. Competency-based Postgraduate Medical Education: past, Present and Future. GMS J Med Educ. 2017;34(5):Doc69.

    Google Scholar 

  36. Holzhausen Y, Maaz A, Renz A, Bosch J, Peters H. Development of Entrustable Professional activities for entry into residency at the Charite Berlin. GMS J Med Educ. 2019;36(1):Doc5.

    Google Scholar 

  37. Rabski JE, Saha A, Cusimano MD. Setting standards of performance expected in neurosurgery residency: a study on entrustable professional activities in competency-based medical education. Am J Surg. 2021;221(2):388–93.

    Article  Google Scholar 

  38. Van Melle E, Frank JR, Holmboe ES, Dagnone D, Stockley D, Sherbino J; International Competency-based Medical Education Collaborators. A core components framework for evaluating implementation of competency-based medical education programs. Acad Med. 2019;94(7):1002–9.

  39. Schiller PT, Phillips AW, Straus CM. Radiology education in medical school and residency: the views and needs of program directors. Acad Radiol. 2018;25(10):1333–43.

  40. Isaak RS, Chen F, Martinelli SM, Arora H, Zvara DA, Hobbs G, Stiegler MP. Validity of simulation-based assessment for Accreditation Council for Graduate Medical Education milestone achievement. Simul Healthc. 2018;13(3):201–10.

  41. Wagner JP, Lewis CE, Tillou A, Agopian VG, Quach C, Donahue TR, Hines OJ. Use of entrustable professional activities in the assessment of surgical resident competency. JAMA Surg. 2018;153(4):335–43.

  42. Werho DK, DeWitt AG, Owens ST, McBride ME, van Schaik S, Roth SJ. Establishing entrustable professional activities in pediatric cardiac critical care. Pediatr Crit Care Med. 2022;23(1):54–9.

  43. Cully JL, Schwartz S. The argument for entrustable professional activities in pediatric dentistry. Pediatr Dent. 2019;41(6):427–8.

  44. Lockman K, Lowry MF, DiScala S, Lovell AG, Uritsky TJ, Kematick BS, Schmidt M, Wetshtein AM, Scullion B, Herndon CM, et al. Development of entrustable professional activities for specialist hospice and palliative care pharmacists. J Pain Symptom Manage. 2022;64(1):37–48.

  45. Chiang YH, Yu HC, Chung HC, Chen JW. Implementing an entrustable professional activities programmatic assessments for nurse practitioner training in emergency care: a pilot study. Nurse Educ Today. 2022;115:105409.

  46. Francischetti I, Holzhausen Y, Peters H. Entrustable professional activities for junior Brazilian medical students in community medicine. BMC Med Educ. 2022;22(1):737.

  47. Chen HC, van den Broek WE, ten Cate O. The case for use of entrustable professional activities in undergraduate medical education. Acad Med. 2015;90(4):431–6.

  48. Duitsman ME, Fluit C, van Alfen-van der Velden J, de Visser M, Ten Kate-Booij M, Dolmans D, Jaarsma D, de Graaf J. Design and evaluation of a clinical competency committee. Perspect Med Educ. 2019;8(1):1–8.

  49. Ekpenyong A, Padmore JS, Hauer KE. The purpose, structure, and process of clinical competency committees: guidance for members and program directors. J Grad Med Educ. 2021;13(2 Suppl):45–50.

Acknowledgements

The authors thank the Joint Commission of Taiwan and Taiwan Association of Medical Radiation Technologists (TAMRT) for their support. Special thanks to all participants of this study.

Funding

The study was supported by grants from the Ministry of Science and Technology of Taiwan (grant number: MOST 111-2628-H-002-011-MY3), and National Taiwan University Hospital (grant number: NTUH 107-S3871, NTUH 109-S4671).

Author information

Contributions

All authors contributed substantially to this work. CYT and KMH drafted and edited the manuscript. CHC and WJL carried out the data collection, analysis, and interpretation. CHL performed further editing and revision. CWY conceived and supervised the project and contributed to its revision.

Corresponding author

Correspondence to Chih-Wei Yang.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the Research Ethics Committee of National Taiwan University Hospital (NTUH-REC No. 201907120RINB). Written informed consent was given by the participants. All methods were carried out in accordance with relevant guidelines and regulations.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Tu, CY., Huang, KM., Cheng, CH. et al. Development, implementation, and evaluation of entrustable professional activities (EPAs) for medical radiation technologists in Taiwan: a nationwide experience. BMC Med Educ 24, 95 (2024). https://doi.org/10.1186/s12909-024-05088-9


Keywords