
Surgery goes EPA (Entrustable Professional Activity) – how a strikingly easy to use app revolutionizes assessments of clinical skills in surgical training

Abstract

Objective

Entrustable Professional Activities (EPAs) are increasingly being used in competency-based medical education approaches. A general lack of time in clinical settings, however, prevents supervisors from providing their trainees with adequate feedback. Because the willingness to take on additional administrative tasks is low among both trainees and educators, the authors developed a radically user-friendly mobile application based on the EPA concept called “Surg-prEPAred”.

Design

Surg-prEPAred is designed to collect micro-assessment data and build competency profiles for surgical residents according to their curriculum. The goal of Surg-prEPAred is to facilitate the performance and documentation of workplace-based assessments. From the aggregated data, the App generates a personalized competency profile for every trainee. During a pilot run of 4 months, followed by ongoing use of the application for a total duration of 9 months (August 2019 to April 2020), 32 residents and 33 consultants used the application daily as a rating tool. Every rating covered the knowledge, skills and professional attitudes of the trainee. Before the initiation of the App and after the 9-month trial period, trainees and supervisors were sent questionnaires to evaluate the user-friendliness and effectiveness of the App.

Results

Five hundred and ten App-based assessments were generated. Out of 40 pre-defined EPAs, 36 were assessed. Fifteen trainees and 16 supervisors returned the questionnaires and rated the Surg-prEPAred App as very valuable, effective and feasible for evaluating trainees in a clinical setting, providing residents with an individual competence portfolio as a basis for precision medical education.

Conclusions

The authors' expectation is that the Surg-prEPAred App will contribute to an improvement in the quality of medical education and thus to the quality of patient care and safety. The future goal is for the App to become an integral part of the official Swiss surgical curriculum, accepted by the Swiss professional surgical society.


Background

“Entrustable Professional Activities” (EPAs) were first introduced in 2005 and are increasingly being used in competency-based medical education approaches [1, 2]. An EPA is defined as a unit of professional practice that can be fully entrusted to a trainee as soon as he or she has demonstrated the capability required to perform the activity without supervision [3]. Depending on the chosen specialty, medical educators can pre-define a set of EPAs to be mastered, each consisting of the competencies needed for the task at hand. Monitoring and documenting the learning progress of each trainee has proven to be a challenge [4,5,6,7,8]. In an increasingly demanding clinical setting, with a general lack of time and therefore of motivation to evaluate trainees, an easy-to-use tool is required to foster a more efficient and meaningful feedback culture among trainees and educators. We believe that mobile applications increase the number of feedback episodes, support entrustment decision-making and visualize a trainee’s level of expertise more adequately. To our knowledge, only two reports on combining the EPA framework with a mobile platform can be found in the literature [9, 10]. Whereas the first of these focuses on second-year psychiatry residents, we describe our mobile app Surg-prEPAred, which we designed for our surgical residency curriculum at the Kantonsspital Luzern, the largest non-university hospital in Switzerland. This paper describes both the implementation and the results of a 4-month pilot run, followed by ongoing use of the application over a total of 9 months (August 2019 to April 2020).

Intention and background information

Many established forms of workplace-based assessment, such as the “Direct Observation of Procedural Skills” (DOPS) and the “Mini-Clinical Evaluation Exercise” (Mini-CEX), are well-founded from a theoretical teaching perspective but difficult to implement in the clinical setting [11]. They are criticized as being too much of a checklist approach to medical education, while leaving out important competencies that are relevant yet difficult to measure [6, 12,13,14,15]. Numerous studies have shown that assessments and feedback are rarely done on a regular basis and often lack advice on how the trainee can improve her/his clinical expertise [12, 16]. Very often the competencies of a trainee are also fragmented into subgroups, such as communication skills, knowledge, manual skills, etc., without attention to the bigger picture. The concept of Entrustable Professional Activities is therefore better suited to everyday clinical life. The focus of EPAs lies on the holistic rating of a specific and observable clinical task [1]. The level of competency is expressed as the level of required supervision, a scale that educators use implicitly every day but have so far failed to document. Each clinical task is suitable for learning, and its assessment can serve as a teaching tool with little extra effort. Understanding that the willingness to take on additional administrative tasks is very low in both trainees and educators, we developed a radically user-friendly mobile application based on the EPA concept called “Surg-prEPAred”. The app was developed using a design-thinking approach. To ensure a user-centered design, the original concept by one of the authors (APM) was improved through several cycles of prototyping, testing and refining involving all stakeholders. With grant money from the University of Zurich’s “Competitive Teaching Grant” and a grant from the Swiss Institute for Postgraduate and Further Education in Medicine (SIWF), a first functional prototype was developed by an external software company in 2019.

In fall 2020, APM founded a company (precisionED Ltd) to rebuild the App from scratch and to provide a sustainable, high-quality assessment system. precisionED holds all intellectual property rights and guarantees state-of-the-art protection of all data by complying with GDPR standards.

Starting in July 2019, we motivated all surgical staff of the Cantonal Hospital of Lucerne, trainees as well as consultants acting as educators, to download the App and use it on pre-defined EPAs during their daily work. Taking part in the pilot study was not mandatory. The only technical tool required was a smartphone. Each rating was designed to take less than 2 minutes per case. In alignment with the EPA concept, every rating covered the knowledge, skills and professional attitudes of the trainee. Trainee and educator were able to use the App together during their clinical work: the trainee would select the task and show a generated QR code on her/his smartphone for the educator to scan with the supervisor version of the App. After trainee and educator had each rated the trainee’s performance independently, specific feedback and the documentation of a learning goal could optionally be added as part of the rating process within the prEPAred App.
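The paper does not publish the App's internal data model; purely as an illustration, the following TypeScript sketch shows one way the trainee-initiated QR handshake and the independent dual rating described above could be represented. All type names, fields and scales are hypothetical assumptions, not the actual Surg-prEPAred implementation.

```typescript
// Hypothetical sketch of the rating handshake described above.
// None of these types or field names come from the Surg-prEPAred code base;
// they only illustrate the trainee-initiated, dual-rating workflow.

type SupervisionLevel =
  | "observation_only"
  | "direct_supervision"
  | "indirect_supervision"
  | "distant_supervision";

interface RatingRequest {
  requestId: string;   // encoded in the QR code shown by the trainee
  traineeId: string;
  epaId: string;       // one of the pre-defined EPAs
  createdAt: string;   // ISO timestamp
}

interface Rating {
  raterRole: "trainee" | "supervisor";
  supervisionLevel: SupervisionLevel;  // entrustment decision
  knowledge: number;                   // e.g. scores per EPA dimension
  skills: number;
  professionalAttitude: number;
}

interface CompletedAssessment {
  request: RatingRequest;
  traineeSelfRating: Rating;   // rated independently ...
  supervisorRating: Rating;    // ... by both parties
  feedback?: string;           // optional free-text feedback
  learningGoal?: string;       // optional agreed learning objective
}

// The trainee initiates; the supervisor scans the QR payload and rates.
function buildQrPayload(req: RatingRequest): string {
  return JSON.stringify(req); // in practice this would be signed and short-lived
}
```

The key design point reflected here is that the trainee initiates the request and both parties submit their ratings independently before any feedback or learning goal is attached.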

In order to better assess possible improvements, a survey on the status quo of feedback quality was sent to all users before the pilot study and at its end. The pre- and post-feedback surveys, as well as the usability surveys, were derived from the feedback literature and from common usability questionnaires [17]. They were pilot-tested by the authors, trainees and supervisors for clarity and feasibility (the questionnaires are provided in Additional file 1). By accumulating the workplace-based assessment data collected during the 4-month pilot phase, an individual, color-coded competency profile was generated for each trainee.

Figure 1 shows the intuitive user interface for both supervisors on the left and trainees on the right, whereas Fig. 2 illustrates an example for a trainee competency profile on the left and the self-assessment tool on the right.

Fig. 1 User interface of supervisor (left) and trainee (right)

Fig. 2 Example of a trainee competency profile and self-assessment in visceral surgery

Aims of the new assessment tool

The main goal of the Surg-prEPAred App was to facilitate the performance and documentation of workplace-based assessments of short clinical tasks (EPAs). From the aggregated data, the App generates a personalized competency profile for every trainee.

This transparency is meant to help identify the gaps and strengths of a trainee and allows supervisors in a busy clinical setting to tailor their supervision and teaching more efficiently to the trainee's needs. The Surg-prEPAred App is a feasible and effective assessment tool that can replace more time-consuming and less competency-oriented assessment instruments. The automatically generated competency profile belongs to the trainee and shows her/his individual strengths and performance gaps as well as meaningful short-term learning objectives. In addition, the App offers each trainee the opportunity for self-assessment, an essential feedback mechanism [18]. Each trainee has maximum control over her/his competency profile: the trainee can grant the educator access to it and can take it along when rotating through different sub-specialties or when changing teaching institutions. In this way, learning outcomes can be adjusted to a person's individual needs and redundancies can be avoided. With access to a trainee's personal competency profile, an educator can customize educational goals and adjust the level of supervision, from direct to distant or no supervision, depending on the level of competency. We expect this to improve educators' motivation, make better use of resources and further increase patient safety.
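How the App aggregates individual ratings into the color-coded profile is not specified in detail here. The following minimal sketch, under the assumption of a simple per-EPA average of supervisor entrustment levels mapped to traffic-light colors, illustrates the general idea; the scoring, thresholds and names are invented for this illustration and are not taken from the App.

```typescript
// Hypothetical aggregation: average the supervisor's entrustment level per EPA
// and map it to a traffic-light color for the trainee's competency profile.
// Scores, thresholds and field names are illustrative only.

const LEVEL_SCORE = {
  observation_only: 1,
  direct_supervision: 2,
  indirect_supervision: 3,
  distant_supervision: 4,
} as const;

interface AssessmentSummary {
  epaId: string;
  supervisionLevel: keyof typeof LEVEL_SCORE;
}

type ProfileColor = "red" | "yellow" | "green";

interface EpaProfileEntry {
  meanLevel: number;   // average entrustment score for this EPA
  color: ProfileColor;
  n: number;           // number of assessments contributing to the entry
}

function competencyProfile(assessments: AssessmentSummary[]): Map<string, EpaProfileEntry> {
  // Group entrustment scores by EPA.
  const byEpa = new Map<string, number[]>();
  for (const a of assessments) {
    const scores = byEpa.get(a.epaId) ?? [];
    scores.push(LEVEL_SCORE[a.supervisionLevel]);
    byEpa.set(a.epaId, scores);
  }
  // Average per EPA and assign an (arbitrary) traffic-light threshold.
  const profile = new Map<string, EpaProfileEntry>();
  for (const [epaId, scores] of byEpa) {
    const meanLevel = scores.reduce((s, x) => s + x, 0) / scores.length;
    const color: ProfileColor = meanLevel >= 3.5 ? "green" : meanLevel >= 2.5 ? "yellow" : "red";
    profile.set(epaId, { meanLevel, color, n: scores.length });
  }
  return profile;
}
```

A real profile view would likely also contrast the trainee's self-assessment with the supervisor's rating, since both are captured with every data set.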

Implementation and results

Initially, the application was introduced at the department of orthopedics and traumatology as well as the department of general and visceral surgery of the Cantonal Hospital of Lucerne. Fifty of the participating residents were registered as trainees, of whom 23 were inexperienced (0–2 years of postgraduate training), 17 intermediate (2–4 years of training) and 10 experienced (more than 4 years of clinical experience). Of the 40 consultants and senior consultants registered as educators, 1 was considered an inexperienced supervisor (less than 1 year after completing residency), 18 intermediate (a few years of experience) and 21 experienced (supervisors for many years). During the trial period between August 2019 and April 2020, a total of 510 App-based assessments were generated, about 60 per month. Of the 40 pre-defined relevant EPAs, 36 were assessed.

In comparison, significantly fewer paper-based competency-based assessments were performed over the same number of months before the launch of the App. Since not all institutions keep statistics on paper-based assessments, the exact number is difficult to determine; an estimated 150 paper-based assessments are completed over a comparable period. It is very likely that the minimum number of assessments required per year in Switzerland (four) is not met by most institutions. According to international consensus meetings, frequent short assessments are preferable to longer assessments at irregular intervals [17]. The Surg-prEPAred App thus proved to be an easily applicable and efficient tool for assessing trainees in the actual clinical setting: with 510 App-based assessments compared with an estimated 150 paper-based ones (510/150 ≈ 3.4), the number of assessments rose to roughly 340% of the previous volume.

Analysis of user-friendliness through pre- and post-trial questionnaires

Before the initiation of the App and after the 9-month trial period, trainees and supervisors were sent questionnaires to evaluate the user-friendliness and effectiveness of the App. A total of 15 trainees and 16 supervisors returned the questionnaires, with the following results:

Trainees (n = 15):

  • 93% want to use the App often

  • 93% say the App is easy to use

  • 100% claim they learned quickly how to use it

  • 50% think assessments by App are better than the old paper-based assessments (50% being neutral towards both)

  • 93% think the usability of the App is good to excellent

Of the trainees who did not use the App, 29% said they had been too busy, 12% had reservations about data security, 29% were on an external rotation, and 18% said they did not have the necessary technical equipment.

The free-text fields mainly showed that excitement about and approval of the App were the dominant attitudes, that more detailed feedback is considered important, and that a culture change in feedback conversations is imminent.

Supervisors (n = 16):

  • 93% want to use the App often

  • 93% say the App is easy to use

  • 87% claim they learned quickly how to use it

  • 66% think assessments by App are better than the old paper-based assessments (27% being neutral towards both)

  • 93% think the usability of the App is good to excellent

Of the supervisors who did not use the App, one claimed to have been too busy, another said no trainee asked to be assessed, and a third is a supervisor in the orthopedic department, which had not yet been integrated into the project.

The free-text fields contained constructive feedback, for example that feedback should be made obligatory for each EPA assessment. Some supervisors also voiced reservations about the low complexity of the assessments and wondered whether they are detailed enough to inform decisions about competence levels and the necessary level of supervision.

Effect on feedback quality

Pre- and post-trial questionnaires showed a 20% increase in feedback given in addition to the assessment, with immediate feedback being given 20% more often than before. The specificity of the feedback increased by 11%, and 31% agreed that the feedback included an action plan for how to improve in the future. It is usually the supervisors who dominate the feedback conversations. The proportion of feedback rated as high quality increased by 24%, and 17% of the trainees were more satisfied with the feedback given at the end of the 9-month trial period than before the use of the App.

Other key figures evaluated

Duration of rating

The average duration per rating, covering the complexity of the EPA and the level of supervision but excluding feedback, was 4 minutes and 19 seconds for trainees and 1 minute and 42 seconds for supervisors. The average time per feedback, including the definition of new learning goals, was 2 minutes and 27 seconds. A complete rating including feedback thus took 6 minutes and 46 seconds. 93% of all ratings were completed in full. These numbers, illustrated in Figs. 3 and 4, show that the time required for a rating via the App is very small. Since it is the trainee who initiates the rating, the effort for the supervisor is minimal and stays below two minutes if the feedback part is omitted. The App can therefore be used readily in the daily clinical setting, which is also reflected in the fact that only 9% of all initiated ratings were left unfinished.

Fig. 3 Average duration of ratings for trainees

Fig. 4 Average duration of ratings for supervisors
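As a purely illustrative sketch of how such key figures could be derived from the App's assessment logs (the actual analysis pipeline is not described here, and all field names are invented for this example):

```typescript
// Hypothetical post-hoc analysis of assessment logs: mean rating durations,
// feedback rate and completion rate, as reported in the figures above.
// Field names are invented for this sketch.

interface AssessmentLog {
  traineeRatingSeconds: number;
  supervisorRatingSeconds: number;
  feedbackSeconds?: number;   // present only if feedback was given
  completed: boolean;
}

const mean = (xs: number[]): number =>
  xs.length === 0 ? 0 : xs.reduce((sum, x) => sum + x, 0) / xs.length;

function keyFigures(logs: AssessmentLog[]) {
  const withFeedback = logs.filter((l) => l.feedbackSeconds !== undefined);
  return {
    meanTraineeSeconds: mean(logs.map((l) => l.traineeRatingSeconds)),
    meanSupervisorSeconds: mean(logs.map((l) => l.supervisorRatingSeconds)),
    meanFeedbackSeconds: mean(withFeedback.map((l) => l.feedbackSeconds ?? 0)),
    feedbackRate: withFeedback.length / Math.max(logs.length, 1),
    completionRate: logs.filter((l) => l.completed).length / Math.max(logs.length, 1),
  };
}
```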

Usage of feedback option

Feedback was given in 40% of all ratings. The answer “No time for feedback” was chosen in 40% of all returned questionnaires. In 6% of ratings, learning objectives were defined but no feedback was given (see Fig. 5). The Surg-prEPAred App is designed to support a conjoint evaluation by trainee and supervisor of the complexity of the EPA and the level of supervision needed. This data alone is highly valuable for both parties, since it shows clearly how each of them judges the situation: every data set is a combination of self- and external assessment. The feedback option was deliberately made optional, so it is very encouraging that in 50% of all ratings either feedback was given or learning goals were defined anyway.

Fig. 5 Percentage of feedback options

Discussion and further work

So far, our experience with the App has been positive. After the first go-live, however, a number of improvements were identified for future versions of the App. There will be additional levels of supervision in accordance with the ten Cate scale [1]. The App can in principle be used in any medical specialty, since EPAs can be defined individually according to the specialty's curriculum requirements. Each trainee can build up a mobile, individual and sustainable competency profile to be used at subsequent training institutions, without losing documented competencies when changing positions. Usability will be improved by new graphics for each trainee's EPA profile, including new legends and management options for learning tools yet to be defined. In order to generate even more feedback, new feedback options such as including pictures or videos are being tested. In the future, a PDF file of the profile will be generated that can be submitted to educational committees, since EPAs are expected to become an integral and mandatory part of medical education. We have used the original English version of the application so far; German, French and Italian translations of the interface are currently being created. Based on the very good first experience with the Surg-prEPAred App in our teaching hospital, the next goal is to implement both the EPAs and the App in specialty training at a national level. The Swiss College of Surgeons intends to implement its new EPA-based core curriculum with the Surg-prEPAred App. Since the use of mobile technology for documenting WBAs is also part of the newest consensus statements on assessment in medical education [19, 20], several specialties in Switzerland are now using prEPAred to gain experience in working with EPAs. Several studies on this topic are under way.

Conclusion

The Surg-prEPAred App is a very valuable, effective and feasible tool for evaluating trainees in a busy clinical setting. The feedback is given in “real time” and is thus more specific and meaningful. Thanks to an individual competence portfolio, each resident is always aware of her/his level of training and of the next steps in surgical training. Supervisors can now customize their supervision and teaching according to the competence level of their residents.

Secure transfer and storage ensure the protection of all data. Thanks to Surg-prEPAred, we are one step closer to “Precision Medical Education”. Our expectation is that the Surg-prEPAred App will eventually also contribute to an improvement in the quality of medical education and thus to the quality of patient care and safety.

Availability of data and materials

All data used and analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

DOPS: Direct Observation of Procedural Skills

EPAs: Entrustable Professional Activities

GDPR: General Data Protection Regulation

Mini-CEX: Mini-Clinical Evaluation Exercise

SIWF: Swiss Institute for Postgraduate and Further Education in Medicine

WBA: Workplace-based assessment

References

  1. Ten Cate O. A primer on entrustable professional activities. FEM. 2017;20(3):95–102.

  2. Ten Cate O. Entrustable professional activities and competency-based training. Med Educ. 2005;39:1176–7.

  3. Ten Cate O, et al. Curriculum development for the workplace using Entrustable Professional Activities (EPAs): AMEE Guide No. 99. Med Teach. 2015;37:983–1002.

  4. Brooks MA. Medical education and the tyranny of competency. Perspect Biol Med. 2009;52:90–102.

  5. Grant J. The incapacitating effects of competence: a critique. Adv Health Sci Educ Theory Pract. 1999;4:271–7.

  6. Talbot M. Monkey see, monkey do: a critique of the competency model in graduate medical education. Med Educ. 2004;38:587–92.

  7. St-Onge C, Lachiver EV, Langevin S, Boileau E, Bernier F, Thomas A. Lessons from the implementation of developmental progress assessment: a scoping review. Med Educ. 2020;54(10):878–87.

  8. Proske A, Link BC, Beeres F, Nebelung S, Füchtmeier B, Knobe M. Weiterbildung unter der Lupe (Teil 2) – Wie bereiten sich Weiterbildungsassistenten auf Notfalloperationen vor? Chirurg. 2021;92:62–9.

  9. Young JQ, McClure M. Fast, easy, and good: assessing entrustable professional activities in psychiatric residents with a mobile app. Acad Med. 2020;95(10):1546–9.

  10. George BC, Bohnen JD, Williams RG, Meyerson SL, Schuller MC, Clark MJ, et al. Readiness of US general surgery residents for independent practice. Ann Surg. 2017;266(4):582–94.

  11. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review. JAMA. 2009;302(12):1316–26.

  12. Gray TG, Hood G, Farrell T. The results of a survey highlighting issues with feedback on medical training in the United Kingdom and how a smartphone app could provide a solution. BMC Res Notes. 2015;8:653.

  13. Glass JM. Competency-based training is a framework for incompetence. BMJ. 2014;348:g2909.

  14. Huddle TS, Heudebert GR. Taking apart the art: the risk of anatomizing clinical competence. Acad Med. 2007;82:536–41.

  15. Lurie SJ, Mooney CJ, Lyness JM. Measurement of the general competencies of the Accreditation Council for Graduate Medical Education: a systematic review. Acad Med. 2009;84:301–9.

  16. Blankush JM, Shah B, et al. What are the associations between the quantity of faculty evaluations and residents' perception of quality feedback? Ann Med Surg. 2017;16:40–3.

  17. Ramani S, Krackov SK. Twelve tips for giving feedback effectively in the clinical environment. Med Teach. 2012;34(10):787–91.

  18. Subha R, et al. Uncovering the unknown: a grounded theory study exploring the impact of self-awareness on the culture of feedback in residency education. Med Teach. 2017;39(10):1065–73.

  19. Boursicot K, Kemp S, Wilkinson T, Findyartini A, Canning C, Cilliers F, et al. Performance assessment: consensus statement and recommendations from the 2020 Ottawa conference. Med Teach. 2021;43(1):58–67.

  20. Heeneman S, de Jong LH, Dawson LJ, Wilkinson TJ, Ryan A, Tait GR, et al. Ottawa 2020 consensus statement for programmatic assessment – 1. Agreement on the principles. Med Teach. 2021;43(10):1139–48.


Acknowledgements

Not applicable.

Funding

APM received grants to develop “prEPAred” from the University of Zurich (“Competitive Teaching Grant”) and from the Swiss Institute for Postgraduate and Further Education in Medicine (SIWF).

Author information


Contributions

ND, JMG, HF and APM contributed to the study conception and design. Data collection and analysis were performed by ND, JMG, HF and APM. The first draft of the manuscript was written by ND. All authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Nadine Diwersi.

Ethics declarations

Ethics approval and consent to participate

The study was granted an exemption by swissethics (Project-ID 2019–00242). All methods were carried out in accordance with relevant guidelines and regulations. During registration, all users of prEPAred have to actively agree to the “Terms of Use”, including the “prEPAred Privacy Notice”, which, among other topics, states the following: “We may share aggregated or pseudonymized information (i.e., no identification of you will be possible) for research and statistical purposes with (…) scientists (…). We will only allow scientists of reputable institutions with a clear track record access to information; all user information is pseudonymized or aggregated such that you cannot be identified directly or indirectly.” Details can be found under www.prepared.app/legal.

Transmission of each data point between the App and the server is encrypted. Data are stored on a high-security Swiss server. For study purposes, all data sets were evaluated in anonymized form.

Consent for publication

ND and HF consent to the publication of their images as used in Fig. 1 (user interface of educator (left) and trainee (right)).

Competing interests

The authors declare to have no competing interests as defined by BMC, or other interests that might be perceived to influence the results and/or discussion reported in this paper.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Diwersi, N., Gass, JM., Fischer, H. et al. Surgery goes EPA (Entrustable Professional Activity) – how a strikingly easy to use app revolutionizes assessments of clinical skills in surgical training. BMC Med Educ 22, 559 (2022). https://doi.org/10.1186/s12909-022-03622-1
