Adaptations in clinical examinations of medical students in response to the COVID-19 pandemic: a systematic review

Abstract

Introduction

Clinical examinations (assessments) are integral to ensuring that medical students can treat patients safely and effectively. The COVID-19 pandemic disrupted traditional formats of clinical examinations, prompting Medical Schools to adapt their approaches so that these examinations could still be delivered during the pandemic. This systematic review aims to identify the approaches that Medical Schools, internationally, adopted in adapting their clinical examinations of medical students in response to the COVID-19 pandemic.

Methods

Three databases and four key medical education journals were systematically searched up to 22 October 2021; a grey literature search was also undertaken. Two reviewers independently screened records at the title and abstract stage and at the full text stage against predefined eligibility criteria. Discrepancies were resolved by discussion, with involvement of senior authors where needed. Risk of bias assessment was performed using an adapted version of a pre-existing risk of bias assessment tool for medical education developments. Results were summarised in a narrative synthesis.

Results

A total of 36 studies were included, which documented the approaches of 48 Medical Schools in 17 countries. Approaches were categorised into in-person clinical examinations (14 studies) or online clinical examinations (22 studies). Authors of studies reporting in-person clinical examinations described deploying enhanced infection control measures along with modified patient participation. Authors of studies reporting online clinical examinations described using online software to create online examination circuits. All authors reported that adapted examinations were feasible, that scores were comparable to those of previous years’ student cohorts, and that participant feedback was positive. Risk of bias assessment highlighted heterogeneity in the reporting of the clinical examinations.

Conclusions

This review identified two broad approaches to adapting clinical examinations in the pandemic: in-person and online. Authors reported that it was feasible to conduct clinical examinations in the pandemic when medical educators were given sufficient time and resources to carefully plan and introduce suitable adaptations. However, the risk of bias assessment identified few studies with high reporting quality, which highlights the need for a common framework for reporting medical education developments to enhance reproducibility across wider contexts. Our review provides medical educators with the opportunity to reflect on past practices and facilitates the design and planning of future examinations.

Background

Clinical examinations, or assessments, are integral to ensuring medical students are competent to progress to higher levels of training or a medical qualification [1, 2]. The most widely used form of clinical examination is the Objective Structured Clinical Examination (OSCE) [3,4,5]. OSCEs involve candidates rotating around a circuit of stations where each station has a different task ranging from procedural skills to history taking.

The COVID-19 pandemic and the subsequent implementation of social distancing rules disrupted traditional formats of clinical examinations [6, 7]. Typically, these examinations involve numerous participants, including candidates, examiners and patients, performing tasks such as physical examinations in a confined venue. These formats were no longer appropriate for delivery in the early months of the COVID-19 pandemic, so Medical Schools had to swiftly adapt their approaches to clinical examinations in order to conform with local and national COVID-19 restrictions and to ensure the safety of all participants [8, 9].

Three systematic reviews have investigated medical education developments due to the pandemic [10,11,12], of which two considered developments in assessment [11, 12]. Gordon et al. [12] subsequently conducted an updated scoping review [13] and identified a growing body of literature on adaptations to clinical examinations due to the COVID-19 pandemic, concluding that a systematic review focussed on assessment was now needed to capture and summarise developments in this field. Medical educators need an up-to-date review of the policy and practice changes instituted in order to learn from the experiences of the last two years, inform future designs for clinical examinations, and determine what could and should remain post-pandemic to facilitate efficient, safe and effective assessment practice. We aimed to address this need by undertaking a systematic review that addressed the following three main questions:

  • How were clinical examinations adapted in response to the COVID-19 pandemic? (i.e., description or ‘what was done?’)

  • What were the successes and challenges of designing and implementing clinical examinations? (i.e., evaluation or ‘what went well and what didn’t?’)

  • What are the recommendations for future practices informed by lessons learnt by the study authors? (i.e., implications or ‘what’s next?’)

Methods

Our systematic review was conducted from January to October 2021. Prior to commencing, a study protocol was uploaded to the Open Science Framework (OSF) registry (https://doi.org/10.17605/OSF.IO/R64NZ) [14]. We conducted this review in accordance with the STructured apprOach to the Reporting In healthcare education of Evidence Synthesis (STORIES) statement [15] and the Best Evidence Medical Education (BEME) review guidance [16].

Search strategy

Our original search took place in February 2021. However, because of the topical nature of this review and the rate at which new literature was emerging, we conducted a final updating search on 22 October 2021 using the same search strategy as described below.

We searched three electronic databases: MEDLINE, EMBASE and ERIC (Education Resources Information Centre). We piloted our search strategy on MEDLINE. Our final search strategy consisted of three axes combined with Boolean operators: (one representing the COVID-19 pandemic) AND (one representing medical education) AND (one representing clinical examination). Each axis consisted of keywords with their truncated forms and Medical Subject Headings (MeSH)/subject headings or descriptors specific to the database. We cross-checked our database searches against our searches of indexed journals to confirm that the database searches captured the papers identified from our journal searches.
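To illustrate the shape of this strategy (the terms shown here are indicative only, not our full registered search), a simplified Ovid MEDLINE rendering of the three axes might read:

1. exp Coronavirus Infections/ or covid*.ti,ab. or "sars-cov-2".ti,ab.
2. exp Education, Medical/ or "medical student*".ti,ab. or "medical school*".ti,ab.
3. osce*.ti,ab. or "objective structured clinical examination*".ti,ab. or "clinical exam*".ti,ab.
4. 1 and 2 and 3

Equivalent subject headings or descriptors were substituted when searching EMBASE and ERIC.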

In addition to our database search, we hand-searched the online publications of four key medical education journals (The Clinical Teacher, Medical Education, Medical Teacher and MedEdPublish) using the same Boolean combination of axes described above.

We also included Google Scholar in our search as it captures international texts and other forms of non-peer-reviewed material. Since Google Scholar yields large volumes of irrelevant results, we decided prior to the search to stop screening once no results had passed title and abstract screening for two consecutive pages of results (20 hits) [17].

We searched for grey literature on the AMEE (An International Association for Medical Education) website [18], including its COVID-19 page of webinars [19] and the virtual conference abstract book from September 2020 [20]; the Association for the Study of Medical Education (ASME) website [21] and the ASME Bite-Size YouTube playlist [22] (a platform created for ASME members and the wider medical education community to discuss challenges faced in the COVID-19 pandemic and to disseminate good practice) [21, 22]; and the MedEdPORTAL website [23], including its virtual learning resources during COVID-19 page [24].

Results were exported to EndNote reference management software [25] and subsequently into Rayyan systematic review software [26].

Study eligibility

The SPIDER model [27], a search strategy tool for qualitative research, was used to refine our review question and to determine the study eligibility criteria.

We defined a structured clinical examination as an examination in a simulated clinical environment in which candidates perform pre-designed tasks, are assessed by appointed examiners, and multiple candidates are examined in turn. This contrasts with a work-based assessment, which we define as individual students or small groups of students being assessed on placement in a hospital or clinical environment by a clinician.

Inclusion and exclusion criteria were piloted at the title and abstract screening stage (first 100 results) and full text screening stage (first 10 results).

The following inclusion criteria were applied:

  • Studies describing how one or more Medical Schools adapted their clinical examination of medical students in response to the COVID-19 pandemic

  • Studies describing adaptations to any type of non-work-based, structured clinical examination (e.g., OSCE)

  • Studies including medical students in any year of study

  • Studies available as pre-publications and non-peer reviewed material in addition to published articles in peer reviewed scientific journals

  • Studies from any location and in any language

The following exclusion criteria were applied:

  • Studies describing any non-clinical, knowledge-based examinations (e.g., recall written examinations)

  • Work-based assessments

  • Studies including examination candidates with any form of provisional medical registration

  • Opinion pieces where authors do not include a description of the adaptations to the design and implementation of their own deployed clinical examination

Two independent reviewers (SC and ET) screened titles, abstracts and full text articles against the inclusion and exclusion criteria. Rayyan [26] was used to record the two reviewers’ independent screening decisions, and conflicts were discussed with the additional involvement of senior authors if agreement was not reached. Inter-rater reliability was calculated at each screening stage; prior to commencing, we set a threshold of 0.61 for Cohen’s kappa, below which a senior reviewer would also conduct the screening and compare results with those of the other two reviewers. Results were reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement [28].
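For reference, Cohen’s kappa expresses the agreement between two raters corrected for the agreement expected by chance:

\kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is the observed proportion of agreement and p_e is the proportion of agreement expected by chance. Our threshold of 0.61 corresponds to the lower bound of ‘substantial’ agreement on the widely used Landis and Koch scale.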

Data extraction

We used Excel to create a data extraction form. Initially, one reviewer (SC) piloted the data extraction form on five studies. Two independent reviewers (SC and ET) then performed data extraction on a random sample of 15% of the included studies to check for alignment, after which SC performed data extraction on the remaining 85%. Broadly, data were extracted in three categories: study characteristics, what was reportedly done, and study author evaluations.

Risk of bias

Currently, there is no consensus method for assessing the quality of studies included in medical education systematic reviews [12, 29,30,31]. Authors have postulated that this is due to the complex nature of medical education developments [12]. Therefore, we adapted a pre-existing tool [12] previously used to assess reporting bias in this context [31,32,33].

Our final version of the tool rated underpinning bias, setting bias, resource bias and evaluation bias as high quality, unclear quality, or low quality (Table 1). The tool appraises how study authors reported the implementation and conduct of the revised clinical examination; while it is an indicator of study quality, it does not directly assess the quality of the intervention itself, and it relies on the subjective judgement of the reviewer because there are no defined quantitative thresholds. We therefore use the term ‘risk’ of bias, and we included all papers in the narrative synthesis without weighting studies by their quality score.

Table 1 Risk of bias assessment (adapted from Gordon et al. [12])

Two reviewers (SC and ET) performed the risk of bias assessment on a random sample of 15% of the studies to check for alignment and identify major discrepancies, followed by SC assessing the remaining 85%. Again, inter-rater reliability was calculated using Cohen’s kappa, with the same threshold as above.

Narrative synthesis

Because of the descriptive nature of the studies, we planned to summarise results in a narrative synthesis using guidance from Popay et al. [34]. This approach recommends using tables to identify common practices in the phenomena of interest and to group studies accordingly; this led to grouping studies into ‘online examinations’ or ‘in-person examinations’. Further sub-categorisation was determined in the same way using the data extraction form headings. Our narrative synthesis broadly aimed to describe ‘what was done’ and ‘study author reflections’, in line with our review’s aims, supplemented with visual aids such as photos and tables to report the general characteristics of the studies as well as risk of bias scores.

Results

Study selection

Our search strategy identified a total of 6,972 hits which, following de-duplication, resulted in 5,628 unique records (Fig. 1). Following screening at the title and abstract stage, 255 articles were obtained for screening at the full text stage. The primary reasons for exclusion at full text stage were: opinion pieces in which the authors were not reporting direct experiences of adapting or delivering an examination; studies describing medical education initiatives but not specifically clinical examinations; studies describing other types of examination (e.g. theory based); and examinations in the wrong population (not medical students). Subsequently, 36 studies were included in the narrative synthesis.

Fig. 1 PRISMA flow diagram [28]

Between the two reviewers, Cohen’s kappa was 0.79 at the title and abstract stage and 0.92 at the full text stage, representing substantial and almost perfect agreement, respectively.

Study characteristics

General study characteristics

Some studies reported findings from multiple Medical Schools, and some Medical Schools published multiple studies; the 36 included studies therefore reported the approaches adopted by 48 Medical Schools. Twenty-two studies were published in 2020 and fourteen in 2021, with an overall date range from March 2020 to September 2021. Studies reported findings from 17 countries across six continents. The largest numbers of studies came from Asia (39%) and Europe (36%). One study required translation from Spanish into English by an externally sourced translator before inclusion.

Twenty-seven studies focused primarily on structured clinical examinations, while nine described clinical examinations as part of a wider focus on medical education developments in the pandemic as a whole.

Types of clinical examinations

Thirty-four studies described adaptations to OSCEs; one described adaptation of the M3 structured clinical examination (the German state licensing examination taken after six years of Medical School); and one described the development of the Virtual Clinical Encounter Examination (VICEE; a structured examination designed predominantly to assess non-psychomotor clinical skills). Of the thirty-four studies describing OSCEs, thirteen described end of year OSCEs; eleven described end of rotation/clerkship OSCEs; five described supplementary OSCEs; and five did not specify the type of OSCE.

Reported outcomes

Of the 36 studies, only ten reported quantitative outcomes comparing candidate scores in the present year with those of previous years, and 24 reported either formal (e.g., survey) or informal feedback from stakeholders and participants who took part in the clinical examinations. Furthermore, six studies reported the number of COVID-19 cases among participants following in-person clinical examinations.

Risk of bias

Only three studies scored highly in all four domains of the risk of bias assessment tool (Table 2) [35,36,37]. Two of these studies were by the same author group [35, 36], and the descriptions of the examinations in all three studies were thorough enough to allow replication. No studies scored low quality in all four domains; the two lowest rated studies scored low quality in two domains and unclear quality in the other two [38, 40]. Of these, one study was a letter to the editor, meaning it did not undergo the same rigorous review as a research paper [38]; the other did not focus on clinical examinations as its primary aim, rather it described adaptations to medical education as a whole [40].

Table 2 Study characteristics and risk of bias assessment

Summary of the clinical examinations

Studies reported either in-person or online clinical examinations, amended or newly developed in response to the pandemic; fourteen studies described in-person examinations and 22 described online examinations (Table 2). We assumed that when authors described standardised or simulated patients, they were referring to individuals trained to act as patients; however, in our reporting of results we have retained the descriptive terms that the authors used in their studies.

In-person examinations

What was done?

Infection control measures

All studies describing in-person examinations reported enhanced infection control measures deployed to minimise the risk of transmission of COVID-19 (Table 3). These measures included reduced participant numbers, the use of personal protective equipment (PPE) by all participants, and even remote examiners observing stations via videoconferencing systems (Fig. 2 shows several infection control measures in use at Duke-NUS Medical School, Singapore).

Table 3 Infection control measures described by study authors
Fig. 2 An image from an OSCE with infection control measures (off-site examiner using video-conferencing; PPE) at Duke-NUS Medical School [72]. (Taken with permission from the study author. Participants and simulated patients in the photo gave written consent.)

Patient participation

Study authors described modifying their usual approach to patient participation in response to the pandemic. Two studies described replacing patients with mannequins and simulators [38, 45], and four described hybrid stations that combined a professional encounter with patients/simulated patients and a physical examination or practical procedure demonstrated on a mannequin or task trainer [36, 40, 42, 49]. Several studies explained how they increased the use of simulated patients in stations: one described how simulated patients were trained to mimic clinical conditions by eliciting clinical signs (examples are given in Fig. 3, from the National University of Singapore) [49, 73], while two others described using visual aids to support simulated patients in their roles [44, 45], for example make-up artistry and wigs used to transform actors into elderly patients [44]. Three studies described how they safely included patients with specific medical conditions, allocating each a nurse to assist them [35, 46, 50].

Fig. 3 Examples of simulated patient cases at the National University of Singapore [73]. (Taken with permission from the study author)

Station content

Two studies described history-taking stations conducted via videoconferencing, with patients off-site while students remained on-site [36, 38]. Additionally, two studies described replacing activities used in traditional stations [41, 46]: one replaced fundoscopic examination with the interpretation of retinal photographs [46], and another replaced patient encounters with questioning by an examiner [41].

Study author evaluations

Reported outcomes

Five studies reported pass/fail rates comparable to those seen in previous years’ cohorts [39, 45, 46, 48, 50]. Four studies reported positive feedback from participants and stakeholders regarding the clinical examination [35, 42, 44, 45]; this included participants describing the examination as “smooth and successful” [42] and external examiners commending the defensibility of the examination [35]. No cases of COVID-19 transmission were reported among participants by the studies that investigated this outcome during their respective follow-up periods [35, 36, 46, 49].

Challenges (including recommendations)

Several challenges were described. Some studies reported that participants were anxious about being exposed to COVID-19 [36, 40, 46, 49]; one study noted that some patients declined invitations to participate because of this fear [46]; several authors recommended holding regular briefings and check-ups to reassure participants in future years [45, 49]; and one study recommended offering compensation to patients should they contract COVID-19 after the examination [49]. Some authors also described difficulties in examination planning: two studies described how their plans were repeatedly disrupted by the changing local COVID-19 situation in Singapore, such as changing lockdown rules [42, 49], and other authors described difficulties in recruiting sufficient numbers of clinical staff because of their deployment to clinical duties in the pandemic [35, 46]. The validity of the examination was also described as a challenge; two studies reported that the exclusion of severely ill patients meant that the range and severity of clinical conditions could not be fully represented [36, 49]. One study also reported that the planned infection control measures incurred a high cost [46].

Successes (including recommendations)

Several studies noted that their teamwork and planning had been a success and recommended significant investment in the planning of future examinations [35, 36, 42, 49]. One study additionally recommended conducting a thorough risk assessment and tailoring risk mitigation strategies accordingly, as well as ensuring there is sufficient PPE available for all attendees [36]. Two studies reported successful use of technology in the examination, including digital scoring, recording of student performance, and videoconferencing in history-taking stations, and recommended the use of technology in future examinations [36, 38].

Online examinations

What was done?

Circuit structure

Authors described attempting to make online examinations resemble in-person examinations by creating different online rooms as part of a circuit, using the functionality of videoconferencing software; the online resources and software used are shown in Table 4. In Zoom, this was achieved with the ‘breakout room’ function [45, 51, 55, 56, 60, 63, 66, 69, 71]: candidates started in a main Zoom room where they were briefed or given pre-encounter notes [62,63,64,65,66], then joined individual breakout rooms where examiners and patients were pre-positioned. Once the candidate-patient encounter was complete, candidates completed their post-encounter notes and were returned to the main room, and the process was repeated for the next case [60, 63]. Other studies reported using similar functions in Microsoft Teams [37, 52, 62, 68, 71], where candidates remained in a ‘channel’ while simulated patients and examiners rotated between candidates (Fig. 4 shows the OSCE structure on Teams at the University of Buckingham) [52, 71, 74].

Table 4 Technological software described and their uses
Fig. 4 A diagram to illustrate the OSCE cycle on Microsoft Teams at the University of Buckingham [74]. (Taken with permission from the study author)
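As a schematic illustration only (not code from any included study), the rotation logic underlying such a circuit, with fixed candidate rooms and examiner/simulated-patient teams rotating between them each round, can be sketched in a few lines of Python; all names are hypothetical:

from collections import deque

# In the Teams format described above, each candidate stays in one 'channel'
# while the station teams rotate between channels each round.
candidates = ["Candidate 1", "Candidate 2", "Candidate 3"]
stations = deque(["History taking", "Data interpretation", "Telehealth consultation"])

for round_number in range(1, len(stations) + 1):
    print(f"Round {round_number}")
    for candidate, station in zip(candidates, stations):
        # the examiner/simulated-patient team for this station joins the room
        print(f"  '{station}' team joins {candidate}'s room")
    stations.rotate(1)  # every team moves on to the next candidate's room

After len(stations) rounds, every candidate has been seen by every station team, mirroring the circuit of an in-person OSCE.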

Patient participation

There was considerable variation between studies in the approach to patient participation in stations. Four studies described holding examinations without real or simulated patients [37, 53, 57, 66]; examiners asked candidates questions based on sequential images or laboratory results shared with the candidate using the ‘share-screen’ function of Zoom [53] or Microsoft Teams [37, 62], and candidates were also questioned on what they would ask patients in their history to formulate a diagnosis [37, 53]. Two studies incorporated virtual patients [56, 60]; authors in one study used specialist software to create a 3D virtual patient in a consultation room (Fig. 5, from Centro Universitario Christus) [75]. Candidates could move 360° around the patient, and a virtual script was created with responses activated according to the option the candidate selected, for example “order exams” with a list of drop-down options [56]. Four studies reported the use of standardised patients [55, 56, 63, 69] and six included simulated patients [52, 62, 64, 66, 68, 71]. In one study, consultations between simulated patients, examiners and students were structured as a three-way telehealth consultation with a focus on history-taking skills [66]. Another study described examiners acting as standardised patients [69].

Fig. 5 3D prototype of a virtual patient created at Centro Universitario Christus (Campus Parque Ecológico) [75]. (Taken with permission from the study author)

Station content

Authors described modifying practical skills and physical examination stations, though only one study explained how practical skills were examined [62]. In that study, candidates were sent equipment in advance, including suturing equipment, catheters, simulated injections and written documents such as notes and drug charts, for subsequent use in individual stations; for example, candidates were asked to suture a banana or vaccinate an orange on camera during the videoconference [62]. Three studies described modifications to physical examinations: in two, candidates narrated physical examinations and verbalised manoeuvres whilst standardised patients verbalised the findings [56, 61]; in the third, candidates were asked to undertake only a neurological examination because, the study authors stated, it relies heavily on inspection [62]. Two studies described adjusting traditional in-person scoring rubrics to make them suitable for an online clinical examination by modifying physical examination expectations [56, 65] and adding elements of a telemedicine OSCE [65]. One study stated that its new standardised patient checklists, communication scoring tools and faculty observation rubrics would be maintained in the future [63].

Study author evaluations

Reported outcomes

Overall, study authors reported positive feedback from participants and stakeholders regarding online examinations [37, 54, 56, 58, 60, 62, 66, 68,69,70,71]. Seven studies also described how stakeholders greatly valued the training, internet and bandwidth checks, and briefings given to them before the examination [56, 59, 61, 64, 66, 68, 71]. Six studies reported that candidate scores were comparable to those of previous years’ student cohorts [58, 63, 65,66,67,68], while one reported that candidate scores were lower than in previous years [37].

Challenges (including recommendations)

A commonly noted limitation of the examinations was the inability to assess physical examination skills [37, 53, 56, 58, 64, 66, 68] and practical procedural skills [37, 52, 53, 68, 71]. Several studies also reported difficulties with internet connectivity: four reported minor technological issues [60, 61, 67, 69], while five reported no technological issues [53, 57, 62, 68, 71]. Contingency plans for internet connectivity issues included hosting sessions prior to the examination to check the compatibility of stakeholders’ computers with the hosting platform and to ensure participants had sufficient internet bandwidth [53, 66]. Additionally, several studies reported recording student performances for retrospective marking in case the internet connection dropped in the middle of a station and examiners were unable to mark in real time [37, 51, 53, 71]. Trial runs of OSCEs were also held and were recommended for future examinations to identify potential problems [37, 52, 62, 65, 66, 68, 71]. Finally, authors found that using a hosting platform familiar to candidates from previous teaching reduced the chance of technological or compatibility problems arising [51, 62, 66, 71].

Successes (including recommendations)

Improvements in examination planning were frequently described in the studies, and all recommended taking a more active and detailed approach to planning in the future [52, 62, 68, 70, 71]. Several studies recommended ensuring that adequate numbers of staff (including a ‘super-host’/controller host) be made available [62] and that staff have a more diverse set of skills, including IT skills [37, 70, 71]. Several studies also noted the need to prepare sufficient resources: one recommended distributing documents prior to the examination, including instructions for candidates (contacts for troubleshooting, videoconferencing instructions, and camera and microphone set-up) [70]. Another recommended continuing to print copies of all relevant documents (e.g. marking grids) so that examiners and hosts could concentrate on the candidate on screen [62]; this study also recommended sending candidates a list of equipment before the examination (e.g., suturing equipment), but not limiting the list to the specific examination stations so that students could not predict what would come up. Several studies emphasised the importance of adequate training and communication for all participants [66, 68, 70]; one noted that ‘over-communication’ is very important, especially for candidates [66]. Two studies recommended using Zoom because of its functionality [54, 70], and in general study authors reported that examinations had been successful and effective at discriminating between candidates, on the basis that candidate scores were comparable to those of previous years’ cohorts [53, 56, 59, 62, 66, 68, 71].

Future plans

Six studies indicated future plans to hold online OSCEs [51, 61,62,63, 67, 71]; two reported imminent plans to continue online OSCEs during the pandemic [61, 66] and five indicated plans to use online OSCEs beyond the pandemic [37, 51, 62, 63, 71], noting that they could be a beneficial tool for students on remote placements [62] or that they saved time and resources compared with in-person examinations [37, 63]. One study reported wanting to return to traditional face-to-face OSCEs when the pandemic permits [53], and two studies noted that while virtual clinical examinations held promise in the pandemic [60, 69], it was important to recognise their limitations, especially the potential to widen the technological deprivation gap between students [69]. Seven studies referred to telehealth in their conclusions, where telehealth is defined as the “delivery of health care services, where patients and providers are separated by distance” [76]; authors noted that online OSCE stations could be a useful tool for examining telehealth skills [56, 62, 63, 65]. The authors who created the 3D prototype patient (Fig. 5) also indicated that more research is needed into the functionality of this prototype for examining medical students [56].

Discussion

Summary of results

To the best of our knowledge, this systematic review represents the first synthesis of the approaches adopted by Medical Schools to undertake clinical examinations in the context of the restrictions imposed by the COVID-19 pandemic. Our review found two main approaches to conducting the examinations: adaptations to in-person examinations or a switch to online examinations.

Study authors describing in-person clinical examinations recounted deploying stringent infection control measures in conjunction with modified station content and patient participation to reduce the risk of transmission of COVID-19. Common adaptations included replacing real patients with simulated patients and utilising mannequins or task trainers for the assessment of practical skills or physical examination skills. None of the studies that recorded COVID-19 cases after the examination reported any transmission among examination participants, though many study authors reflected that addressing participants’ fear of catching COVID-19 was a challenge. Commonly articulated successes and recommendations included good teamwork in the planning of the examinations.

Study authors describing online clinical examinations reported devising OSCE circuits on online software and modifying station content to enable delivery online. Zoom and Microsoft Teams were the most common hosting platforms, with online OSCE station rooms constructed using ‘breakout rooms’ or ‘channels’ respectively. Studies frequently reported replacing candidate-patient interactions with examiner questioning, in which candidates were asked to verbalise physical examinations and manoeuvres in lieu of actually performing them. Indeed, all study authors noted the inability to assess physical examination and practical procedural skills as a major limitation of online delivery, as well as its heavy dependence on a stable internet connection for all participants. Future recommendations included hosting online briefings to check internet bandwidth and computer compatibility prior to the examination (which was also listed as a common success). Study authors also noted that, with sufficient planning and development, online examinations could be effective at examining clinical skills, and indicated that in future online clinical examinations could be very useful for examining students on remote placements and could provide an authentic approach to assessing telehealth skills.

In both approaches to adapting clinical examinations, candidate scores were reported to be comparable to those of previous years’ student cohorts, and feedback from participants and stakeholders was generally positive.

Quality of the evidence base

Unlike research with outcome measures, most studies included in our review focused on sharing practices; we therefore elected to assess the risk of reporting bias in line with Gordon et al. [12]. It is understandable that amid a pandemic there is demand for practice developments and research to be disseminated swiftly, which means that authors might not undertake outcome evaluation due to pressures of time and the demands of the pandemic response. Nonetheless, it is imperative that authors uphold rigour in the reporting of medical education developments. Our review considered that high quality reporting should describe: the reasons for the adaptation of the clinical examination; the setting of the examination; the resources used; and an evaluation in the form of study author reflections or research outcomes. Omission of any of these key details makes a reported adaptation less reproducible across different contexts. Unfortunately, few papers in our review met these reporting criteria fully, and our risk of bias assessment highlights the heterogeneity in the reporting of medical education developments. Part of the explanation could be that several studies were published as short reports; however, this only emphasises the necessity for systematicity in the reporting of medical education developments.

Comparison with existing literature

Though no prior systematic review has specifically considered the adaptations to clinical examinations required by the pandemic, two systematic reviews examining medical education developments more broadly have been published, by Gordon et al. [12, 13] and Dedeilia et al. [11]. Both reviews included sub-sections on assessment: Dedeilia et al. [11] described teleconferencing to assess clinical skills, and Gordon et al. [12] briefly outlined the use of online OSCEs and of in-person OSCEs with additional infection control measures. However, neither review provided more than a brief description of the adaptations. Four studies from our review overlapped with those identified by Gordon et al. [12], whose searches were undertaken in May 2020 whereas ours ran to October 2021; numerous medical education papers were published at a fast rate in the intervening period, so we were able to identify and incorporate a number of additional papers in our review. In common with our findings, Gordon et al. [12] also noted that included studies did not report developments in sufficient detail and could be regarded as low quality in terms of reporting bias in the context of medical education developments. They recommended that future study authors use the questions ‘what?’, ‘so what?’, ‘now what?’ (also known as Borton’s model of reflection) when reporting developments [77].

Strengths and limitations

The strengths of our review include the systematicity and methodological rigour we employed throughout. We performed comprehensive literature searches spanning key bibliographic databases, key journals, grey literature and relevant websites. This meant we encompassed a range of sources and retrieved international studies, including those published in languages other than English. Additionally, we found substantial agreement between the two reviewers at both stages of screening. We piloted the eligibility criteria and the data extraction form prior to commencing each stage of our review and adapted a pre-existing risk of bias tool for medical education developments to make it suitable for use in our review. Furthermore, two reviewers independently assessed a sample of studies for risk of bias, in addition to undertaking extensive discussions with senior authors to determine suitable thresholds for quality ratings in each of the four risk of bias assessment domains. Finally, given the currency of our review topic, we updated all searches in October 2021.

There were some limitations to our study. It was challenging to develop a search strategy sufficiently sensitive to capture key papers, yet specific enough to make screening the retrieved papers feasible in a timely manner. This may have resulted in missed studies and was compounded by the observation that medical education publications were indexed inconsistently in the databases we searched. The approaches of just 48 Medical Schools were included in our review; however, all Medical Schools internationally would have had to adapt their approaches to clinical examinations in the pandemic. Numerous factors could have influenced whether Medical Schools sought to publish adaptations to their practices, such as local COVID-19 restriction timescales, familiarity with the research and publication process, and staff capacity; exploring this, however, is not within the scope of this review. Of the 48 Medical Schools included, a significant proportion featured in studies originating in Asia, meaning our review may not fully represent the international range of approaches that Medical Schools adopted to facilitate clinical examinations. Finally, our risk of bias assessment was subjective, and occasionally it remained unclear whether studies should be rated low, unclear or high quality in each of the four domains.

Future recommendations

Firstly, regardless of the approaches adopted, study authors all concluded that it was feasible to adapt existing clinical examinations and deliver effective assessments despite the restrictions placed on Medical Schools by the pandemic. Commonly stated enabling factors were a sufficient workforce and other resources, and the investment of time in planning and carrying out these examinations (which may take longer than traditional clinical examinations). We would therefore urge that medical educators be given adequate time, facilities and resources to plan and deliver the necessary changes.

Several study authors describing a move to online clinical examinations concluded that these approaches were unable to assess key domains such as physical examination skills and practical procedural skills; adaptations to in-person clinical examinations with stringent infection control measures may therefore be preferred where possible. However, online delivery of clinical examinations may provide a useful tool for examining telehealth skills. The advent of ubiquitous internet usage, expedited by the pandemic, means that telehealth has become a prominent mode of care delivery in many settings. Telehealth demands a distinctive skill-set compared with conventional face-to-face consultations, in which medical students will need to gain skills and demonstrate competence in the future [78, 79]. Some recent studies describe the development of a telehealth curriculum [80, 81], and we encourage further primary research to establish the optimum approach for delivering a telehealth curriculum and examining the telehealth skills of medical students.

In addition, we would recommend that Medical Schools and regulatory bodies work together to ensure that adequate contingency plans are prepared for future disruptions to critical clinical assessments, for example by having a blueprint for an online clinical examination ready for immediate use. Moreover, we urge medical educators to consider whether their assessment strategy is resilient to future disruptions, for example by reducing reliance on final year high-stakes clinical examinations and placing more focus on programmatic assessment and work-based assessments. Medical educators could consider a model such as Miller’s pyramid, a framework which emphasises the need for different assessment modalities for different expected outcomes in medical education [82]. This would enable Medical Schools to draw on a body of evidence regarding the competence and performance of individual students if it were not possible to proceed with planned clinical examinations in the future.

Because of the disparities in both the indexing and the reporting of the clinical examinations described in the studies included in our review, we recommend that medical education research and publication continue to work towards increased systematicity. Examples of highly systematic work that readers might study include BEME guides [16, 83]. Increased systematicity could include agreed common terminology for medical education research that can be used when indexing studies. Additionally, we would recommend that either Gordon et al.’s [12] ‘what?’, ‘so what?’, ‘now what?’ questions be considered when reporting developments in medical education, or that a framework for reporting these developments be created afresh. We appreciate that reporting changes in education practices differs from the standard reporting of primary research; however, greater systematicity in reporting will improve the ability to synthesise and disseminate new practices and enhance the reproducibility of study findings across wider contexts.

Conclusions

We conducted a systematic review to identify the approaches that Medical Schools, internationally, used when conducting clinical examinations of medical students in the COVID-19 pandemic. We identified two broad approaches: adaptation of in-person examinations and a switch to online clinical examinations. Study authors reporting both types of adaptation concluded that conducting clinical examinations was feasible, but that it required a significant investment of planning, time and resources. Of note, a major limitation of online examinations was the inability to examine physical examination skills or practical procedural skills, though other advantages of online approaches are a potential area for further research. We hope our review will be used as a resource by the international medical education community to understand what adaptations have been made in response to the pandemic, to help design future clinical examinations in their own institutions that meet local needs, to facilitate reflection on past practices, and to determine what should remain post-pandemic.

Availability of data and materials

All data generated or analysed during this study are included in this published article and its supplementary information files.

Abbreviations

AMEE:

An International Association for Medical Education

ASME:

The Association for the Study of Medical Education

BEME:

Best Evidence Medical Education

COVID-19:

Coronavirus Disease 2019

ERIC:

Education Resources Information Centre

MeSH:

Medical Subject Headings

OSCE:

Objective Structured Clinical Examination

PPE:

Personal Protective Equipment

PRISMA:

Preferred Reporting Items for Systematic Reviews and Meta-Analyses

STORIES:

STructured apprOach to the Reporting In healthcare education of Evidence Synthesis

VICEE:

Virtual Clinical Encounter Examination

References

  1. General Medical Council. Assessment in undergraduate medical education. 2011. https://www.gmc-uk.org/-/media/documents/Assessment_in_undergraduate_medical_education___guidance_0815.pdf_56439668.pdf. Accessed 21 Dec 2020.

  2. General Medical Council. Outcomes for Graduates 2018. 2018. https://www.gmc-uk.org/-/media/documents/outcomes-for-graduates-2020_pdf-84622587.pdf. Accessed 21 Dec 2021.

  3. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J. 1975. https://doi.org/10.1136/bmj.1.5955.447.

  4. Harden RM, Lilley P, Patricio M. The definitive guide to the OSCE: The objective structured clinical examination as a performance assessment. 1st ed. Edinburgh: Elsevier Health Sciences; 2015.

  5. Khan KZ, Ramachandran S, Gaunt K, Pushkar P. The objective structured clinical examination (OSCE): AMEE guide no. 81. Part I: An historical and theoretical perspective. Med Teach. 2013. https://doi.org/10.3109/0142159X.2013.818634.

  6. World Health Organisation. WHO director-general’s opening remarks at the media briefing on COVID-19 – 11 March 2020. https://www.who.int/director-general/speeches/detail/who-director-general-s-opening-remarks-at-the-media-briefing-on-covid-19---11-march-2020. Accessed 28 Dec 2020.

  7. World Health Organisation. Coronavirus disease (COVID-19) advice for the public. https://www.who.int/emergencies/diseases/novel-coronavirus-2019/advice-for-public. Accessed 28 Dec 2020.

  8. General Medical Council. Joint statement: early provisional registration for final year medical students. https://www.gmc-uk.org/news/news-archive/early-provisional-registration-for-final-year-medical-students. Accessed 28 Dec 2020.

  9. Lapolla P, Mingoli A. COVID-19 changes medical education in Italy: will other countries follow? Postgrad Med J. 2020. https://doi.org/10.1136/postgradmedj-2020-137876.

  10. Wilcha RJ. Effectiveness of virtual medical teaching during the COVID-19 crisis: A systematic review. JMIR Med Educ. 2020. https://doi.org/10.2196/20963.

  11. Dedeilia A, Sotiropoulos MG, Hanrahan JG, Janga D, Dedeilias P, Sideris M. Medical and surgical education challenges and innovations in the COVID-19 era: A systematic review. In Vivo. 2020. https://doi.org/10.21873/invivo.11950.

  12. Gordon M, Patricio M, Horne L, Muston A, Alston SR, Pammi M, et al. Developments in medical education in response to the COVID-19 pandemic: A rapid BEME systematic review: BEME guide no. 63. Med Teach. 2020;42(11):1202–15.

  13. Daniel M, Gordon M, Patricio M, Hider A, Pawlik C, Bhagdev, et al. An update on developments in medical education in response to the COVID-19 pandemic: A BEME scoping review: BEME Guide No. 64. Med Teach. 2021. https://doi.org/10.1080/0142159X.2020.1864310.

  14. Cartledge S, Ward D, Stack RJ, Terry E. Adaptations of clinical examinations of medical students in response to the COVID-19 pandemic: A systematic review protocol. 2021. https://doi.org/10.17605/OSF.IO/R64NZ.

  15. Gordon M, Gibbs T. STORIES statement: Publication standards for healthcare education synthesis. BMC Med. 2014. https://doi.org/10.1186/s12916-014-0143-0.

  16. Best Evidence Medical Education. The BEME collaboration. https://www.bemecollaboration.org/. Accessed 12 Apr 2021.

  17. Elseman J. Google scholar searching: 10 ways to avoid retrieving millions of hits. https://library.law.yale.edu/news/google-scholar-searching-10-ways-avoid-retrieving-millions-hits. Accessed 1 Feb 2021.

  18. Association for Medical Education in Europe. AMEE resource centre (ARC). https://amee.org/home. Accessed 13 Feb 2021.

  19. Association for Medical Education in Europe. Covid-19. https://amee.org/covid-19. Accessed 13 Feb 2021.

  20. Association for Medical Education in Europe. amee 2020 the virtual conference: Abstract book. 2020. https://amee.org/getattachment/Conferences/AMEE-Past-Conferences/AMEE-2020/AMEE-2020-Virtual-Abstract-Book-FINAL-resize.pdf. Accessed 13 Feb 2021.

  21. The Association for the Study of Medical Education. Welcome to the association for the study of medical education. https://www.asme.org.uk/. Accessed 13 Feb 2021.

  22. YouTube. ASMEBITESIZE. https://www.youtube.com/watch?v=C-jT7U7J7-0&list=PLwl2JZruRn7m_9XbM7thHG4Bdx539wkhg. Accessed 13 Feb 2021.

  23. MedEdPortal. Featured Publications. https://www.mededportal.org/. Accessed 13 Feb 2021.

  24. MedEdPortal. Virtual learning resources during COVID-19. https://www.mededportal.org/virtual. Accessed 13 Feb 2021.

  25. The EndNote Team [computer program]. EndNote X9. Philadelphia: Clarivate Analytics; 2013.

  26. Ouzzani M, Hammady H, Fedorowicz Z, Elmagarmid A. Rayyan — a web and mobile app for systematic reviews. Syst Rev. 2016. https://doi.org/10.1186/s13643-016-0384-4.

  27. Cooke A, Smith DM, Booth A. Beyond PICO: the SPIDER tool for qualitative evidence. Qual Health Res. 2012. https://doi.org/10.1177/1049732312452938.

  28. Moher D, Liberati A, Tetzlaff J, Altman DG, The PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med. 2009. https://doi.org/10.1371/journal.pmed.1000097.

  29. Buckley S, Coleman J, Davison I, Khan KS, Zamora J, Malick S, et al. The educational effects of portfolios on undergraduate student learning: a best evidence medical education (BEME) systematic review: BEME guide no. 11. Med Teach. 2009. https://doi.org/10.1080/01421590902889897.

  30. Critical Appraisal Skills Programme (CASP). CASP qualitative checklist. 2014. https://casp-uk.net/wp-content/uploads/2018/01/CASP-Systematic-Review-Checklist_2018.pdf. Accessed 13 Feb 2021.

  31. Gordon M, Hill E, Stojan J, Daniel M. Educational interventions to improve handover in health care: an updated systematic review. Acad Med. 2018. https://doi.org/10.1097/ACM.0000000000002236.

  32. Gordon M, Farnan J, Grafton-Clarke C, Ahmed R, Gurbutt D, McLachlan J, et al. Non-technical skills assessments in undergraduate medical education: a focused BEME systematic review: BEME Guide No. 54. Med Teach. 2019. https://doi.org/10.1080/0142159X.2018.1562166.

  33. Reed D, Price E, Windish D, Wright S, Gozu A, Hsu E, et al. Challenges in systematic reviews of educational intervention studies. Ann Intern Med. 2005. https://doi.org/10.7326/0003-4819-142-12_part_2-200506211-00008.

  34. Popay J, Roberts H, Sowden A, Petticrew M, Arai L, Rodgers M, et al. Guidance on the conduct of narrative synthesis in systematic reviews. Lancaster University. 2006. https://www.lancaster.ac.uk/media/lancaster-university/content-assets/documents/fhm/dhr/chir/NSsynthesisguidanceVersion1-April2006.pdf. Accessed 14 Dec 2020.

  35. Boursicot K, Kemp S, Ong TH, Wijaya L, Goh SH, Freeman K, et al. Conducting a high-stakes OSCE in a COVID-19 environment. MedEdPublish. 2020. https://doi.org/10.15694/mep.2020.000054.1.

  36. Canning CA, Freeman KJ, Curran I, Boursicot K. Managing the COVID-19 risk: the practicalities of delivering high stakes OSCEs during a pandemic. MedEdPublish. 2020. https://doi.org/10.15694/mep.2020.000173.1.

  37. Shorbagi S, Sulaiman N, Hasswan A, Kaouas M. Evaluating the feasibility and effectiveness of e-OSCE in the COVID-19 era. Research Square. 2021. https://doi.org/10.21203/rs.3.rs-506145/v1.

  38. Bastanhagh E, Safari S, Mafinejad MK. Letter to Editor: Recommendations for safer management of holding OSCE during COVID-19 outbreak. Med J Islam Repub Iran. 2020. https://doi.org/10.47176/mjiri.34.156.

  39. Wiedenmann C, Wacker K, Böhringer D, Maier P, Reinhard T. Online examination course instead of classroom teaching: adaptation of medical student teaching during the COVID-19 pandemic. Ophthalmologe. 2021. https://doi.org/10.1007/s00347-021-01372-x.

  40. Samarasekera DD, Li D, Goh M, Yeo SP, Ngiam NSP, Aw MM, et al. Response and lessons learnt managing the COVID-19 crisis by school of medicine, National University of Singapore. MedEdPublish. 2020. https://doi.org/10.15694/mep.2020.000092.1.

  41. Adeleke OA, Cawe B, Yogeswaran P. Opportunity for change: Undergraduate training in family medicine. S Afr Fam Pract. 2020. https://doi.org/10.4102/safp.v62i1.5225.

  42. Ashokka B, Ong SY, Tay KH, Loh NHW, Gee CF, Samarasekera DD. Coordinated responses of academic medical centres to pandemics: Sustaining medical education during COVID-19. Med Teach. 2020. https://doi.org/10.1080/0142159X.2020.1757634.

  43. Association for Medical Education in Europe. COVID-19 Assessment, professionalism and progression. Part 1- Best practise in online assessment [vimeo]. 2020. https://vimeo.com/423211603. Accessed 13 Feb 2021.

  44. Bauer D, Germano M, Stierlin J, Brem B, Stöckli Y, Schnabel KP. Delivering a geriatric OSCE station in times of Covid-19 using makeup artistry. GMS J Med Educ. 2020. https://doi.org/10.3205/zma001382.

  45. Fritsche V, Siol AF, Schnabel KP, Bauer D, Schubert J, Stoevesandt D, et al. Use of simulation patients in the third section of the medical examination. GMS J Med Educ. 2020. https://doi.org/10.3205/zma001383.

  46. Lee CH, Ng PY, Pang SYY, Lam DCL, Lau CS. Successfully conducting an objective structured clinical examination with real patients during the COVID-19 pandemic. Hong Kong Med J. 2021. https://doi.org/10.12809/hkmj208839.

  47. Lee V, Lau J, Wong F, Mok E, Tang KY, Choi H, et al. Safe conduct of professional examinations during the COVID-19 pandemic in Hong Kong: A descriptive study. Lancet. 2020. https://doi.org/10.2139/ssrn.3616160.

  48. Lengerke TV, Afshar K, Just I, Lange K. Classroom teaching with simulated patients during COVID-19: the communication skills course in the second year of the model medical curriculum HannibaL. GMS J Med Educ. 2020. https://doi.org/10.3205/zma001374.

  49. Ngiam N, Yasol G, Goh DL-M. Clinical examinations for medical students during the COVID-19 outbreak: a simulated patient programme perspective. BMJ Simul Technol Enhanc Learn. 2020. https://doi.org/10.1136/bmjstel-2020-000693.

  50. Nourkami-Tutdibi N, Hofer M, Zemlin M, Abdul-Khaliq H, Tutdibi E. Teaching must go on: flexibility and advantages of peer assisted learning during the COVID-19 pandemic for undergraduate medical ultrasound education – perspective from the “sonoBYstudents” ultrasound group. GMS J Med Educ. 2021. https://doi.org/10.3205/zma001401.

  51. Abraham JR, Foulds JL, Lee JA, Sonnenberg. vOSCEs 2.0: Operationalising a universal low-cost virtual OSCE. Med Educ. 2021. https://doi.org/10.1111/medu.14492.

  52. Blythe J, Patel NSA, Spiring W, Easton G, Evans D, Meskevicius-Sadler E, et al. Undertaking a high stakes virtual OSCE (“VOSCE”) during Covid-19. BMC Med Educ. 2021. https://doi.org/10.1186/s12909-021-02660-5.

  53. Boyle JG, Colquhoun I, Noonan Z, McDowall S, Walters MR, Leach JP. Viva la VOSCE? BMC Med Educ. 2020. https://doi.org/10.1186/s12909-020-02444-3.

  54. Brown R, Brew-Girard E, Souza SD. Remote Mock OSCE (ReMO): The “new normal”? BJPsych Open. 2021. https://doi.org/10.1192/bjo.2021.368.

  55. Conti I, Gilkinson C. Preparing students for psychiatry OSCEs in the COVID-19 pandemic. How can PsychSocs help? BJPsych Open. 2021. https://doi.org/10.1192/bjo.2021.101.

  56. Craig C, Kasana N, Modi A. Virtual OSCE delivery: The way of the future? Med Educ. 2020. https://doi.org/10.1111/medu.14286.

  57. Faria AL, Perdigão ACB, Marçal E, Kubrusly M, Peixoto RAC, Peixoto Junior AA. OSCE 3D: a virtual clinical skills assessment tool for coronavirus pandemic times. Rev Bras Educ Med. 2021. https://doi.org/10.1590/1981-5271v45.2-20200460.ING.

  58. Farrell SE, Junkin AR, Hayden EM. Assessing clinical skills via telehealth objective standardized clinical examination: feasibility, acceptability, comparability, and educational value. Telemed J E Health. 2021. https://doi.org/10.1089/tmj.2021.0094.

  59. García-Seoane J, Ramos-Rincón JM, Lara-Muñoz P, on behalf of the CCS-ECOE working group of the CNDFME. Changes in the objective structured clinical examination (OSCE) of University Schools of Medicine during COVID-19: Experience with a computer-based case simulation OSCE (CCS-OSCE). Rev Clin Esp. 2021. https://doi.org/10.1016/j.rce.2021.01.004.

  60. Hamdy H, Sreedharan J, Rotgans JI, Zary N, Bahous SA, et al. Virtual Clinical Encounter Examination (VICEE): A novel approach for assessing medical students’ non-psychomotor clinical competency. Med Teach. 2021. https://doi.org/10.1080/0142159X.2021.1935828.

  61. Hannon P, Lappe K, Griffin C, Roussel D, Colbert-Getz J. An objective structured clinical examination: From examination room to Zoom breakout room. Med Educ. 2020. https://doi.org/10.1111/medu.14241.

  62. Hopwood J, Myers G, Sturrock A. Twelve tips for conducting a virtual OSCE. Med Teach. 2020. https://doi.org/10.1080/0142159X.2020.1830961.

  63. Lara S, Foster CW, Hawks M, Montgomery M. Remote assessment of clinical skills during COVID-19: A virtual, high-stakes, summative pediatric objective structured clinical examination. Acad Pediatr. 2020. https://doi.org/10.1016/j.acap.2020.05.029.

  64. Major S, Sawan L, Vognsen J, Jabre M. COVID-19 pandemic prompts the development of a Web-OSCE using Zoom teleconferencing to resume medical students’ clinical skills training at Weill Cornell Medicine-Qatar. BMJ Simul Technol Enhanc Learn. 2020. https://doi.org/10.1136/bmjstel-2020-000629.

  65. Martinez L, Holley A, Brown S, Abid A. Addressing the rapidly increasing need for telemedicine education for future physicians. PRiMER. 2020. https://doi.org/10.22454/PRiMER.2020.275245.

  66. Ryan A, Carson A, Reid K, Smallwood D, Judd T, et al. Fully online OSCEs: A large cohort case study. MedEdPublish. 2020. https://doi.org/10.15694/mep.2020.000214.1.

  67. Setiawan E, Sugeng B, Luailiyah A, Makarim FR, Trisnadi S. Evaluating knowledge and skill in surgery clerkship during the COVID-19 pandemic: A single-center experience in Indonesia. Ann Med Surg. 2021. https://doi.org/10.1016/j.amsu.2021.102685.

  68. Shaban S, Tariq I, Elzubeir M, Alsuwaidi AR, Basheer A, Magzoub M. Conducting online OSCEs aided by a novel time management web-based system. BMC Med Educ. 2021. https://doi.org/10.1186/s12909-021-02945-9.

  69. Shaiba LA, Alnamnakani MA, Temsah MH, Alamro N, Alsohime F, Alrabiaah A, et al. Medical Faculty’s and Students’ Perceptions toward Pediatric Electronic OSCE during the COVID-19 Pandemic in Saudi Arabia. Healthcare (Basel). 2021. https://doi.org/10.3390/healthcare9080950.

  70. Shehata MH, Kumar AP, Arekat MR, Alsenbesy M, Ansari AMA, Atwa H, et al. A toolbox for conducting an online OSCE. Clin Teach. 2020. https://doi.org/10.1111/tct.13285.

  71. Stewart C, Kumaravel B, O’Dowd J, McKeown A, Harris J. The rOSCE: A remote clinical examination during COVID lockdown and beyond. MedEdPublish. 2021. https://doi.org/10.15694/mep.2021.000011.1.

  72. Canning CA, Freeman KJ, Curran I, Boursicot K. Managing the COVID-19 risk: the practicalities of delivering high stakes OSCEs during a pandemic. MedEdPublish. 2020. https://doi.org/10.15694/mep.2020.000173.1 (Figure 1, Configuration of simulated patient, candidate and cameras to facilitate off-site examination; p. 4. Permission given to use image from study author).

  73. Ngiam N, Yasol G, Goh DL-M. Clinical examinations for medical students during the COVID-19 outbreak: a simulated patient programme perspective. BMJ Simul Technol Enhanc Learn. 2020. https://doi.org/10.1136/bmjstel-2020-000693 (Table 1, Examples of modification of examination stations; p. 2. Permission given to use image from study author).

  74. Stewart C, Kumaravel B, O’Dowd J, McKeown A, Harris J. The rOSCE: A remote clinical examination during COVID lockdown and beyond. MedEdPublish. 2021. https://doi.org/10.15694/mep.2021.000011.1 (Figure 3, A diagram to illustrate the OSCE cycle; p. 4. Permission given to use image from study author).

  75. Faria AL, Perdigão ACB, Marçal E, Kubrusly M, Peixoto RAC, Peixoto Junior AA. OSCE 3D: a virtual clinical skills assessment tool for coronavirus pandemic times. Rev Bras Educ Med. 2021. https://doi.org/10.1590/1981-5271v45.2-20200460.ING (Figure 1, Visual aspects of the virtual environment simulated by the OSCE 3D prototype; p. 13. Permission given to use image from study author).

  76. World Health Organization. Global health observatory data: Telehealth. 2021. https://www.who.int/gho/goe/telehealth/en/. Accessed 29 May 2021.

  77. Borton T. Reach, touch and teach. London: Hutchinson; 1970.

  78. Europe Economics. Regulatory approaches to telemedicine. London: Europe Economics; 2021.

  79. Frankl SE, Joshi A, Onorato S, Jawahir GL, Pelletier SR, Dalrymple JL, et al. Preparing future doctors for telemedicine: an asynchronous curriculum for medical students implemented during the COVID-19 pandemic. Acad Med. 2021. https://doi.org/10.1097/ACM.0000000000004260.

  80. Muntz MD, Jose F, Ferguson C, Ark TK, Kalet A. Telehealth and medical student education in the time of COVID-19 and beyond. Acad Med. 2021. https://doi.org/10.1097/ACM.0000000000004014.

  81. Wamsley M, Cornejo L, Kyrzhanovskaya I, Lin BW, Sullivan J, Yoder J, et al. Best practices for integrating medical students into telehealth visits. J Med Internet Res. 2021. https://doi.org/10.2196/27877.

  82. Cate OT, Carraccio C, Damodaran A, Gofton W, Hamstra SJ, Hart DE. Entrustment decision making: extending Miller’s pyramid. Acad Med. 2021. https://doi.org/10.1097/ACM.0000000000003800.

  83. Hammick M, Dornan T, Steinert Y. Conducting a best evidence systematic review. Part 1: From idea to data coding. BEME Guide No. 13. Med Teach. 2010. https://doi.org/10.3109/01421590903414245.

Acknowledgements

Not applicable.

Funding

No funding was received for this review.

Author information

Contributions

SC is the primary author and carried out the search, screening, analysis, and write-up. ET carried out co-screening and edited the manuscript. DW and RS are co-authors and provided senior advice throughout the review and edited the final manuscript. All authors read and approved the final manuscript.

Authors’ information

SC and ET are fifth-year medical students at the University of Birmingham. This review was written as part of SC's dissertation for a BMedSci degree in Public Health and Population Sciences at the University of Birmingham. DW is a Reader in Public Health and Medical Education at the University of Birmingham. RS is a Senior Lecturer in Assessment in the Institute of Clinical Sciences in the College of Medical and Dental Sciences at the University of Birmingham.

Corresponding author

Correspondence to Sapphire Cartledge.

Ethics declarations

Ethical approval and consent to participate

Not applicable.

Consent for publication

Informed consent has been obtained from all participants for publication of identifying information/images in an online open-access publication.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Cartledge, S., Ward, D., Stack, R. et al. Adaptations in clinical examinations of medical students in response to the COVID-19 pandemic: a systematic review. BMC Med Educ 22, 607 (2022). https://doi.org/10.1186/s12909-022-03662-7

Keywords