Administration of the American Board of Anesthesiology’s virtual APPLIED Examination: successes, challenges, and lessons learned

Abstract

In response to the COVID-19 pandemic, the American Board of Anesthesiology transitioned from in-person to virtual administration of its APPLIED Examination, assessing more than 3000 candidates for certification purposes remotely in 2021. Four hundred examiners were involved in delivering and scoring Standardized Oral Examinations (SOEs) and Objective Structured Clinical Examinations (OSCEs). More than 80% of candidates started their exams on time and stayed connected throughout the exam without any problems. Only 74 (2.5%) SOE and 45 (1.5%) OSCE candidates required rescheduling due to technical difficulties. Of those who experienced “significant issues”, concerns with OSCE technical stations (interpretation of monitors and interpretation of echocardiograms) were reported most frequently (6% of candidates). In contrast, 23% of examiners “sometimes” lost connectivity during their multiple exam sessions, on a continuum from minor inconvenience to inability to continue. 84% of SOE candidates and 89% of OSCE candidates described “smooth” interactions with examiners and standardized patients/standardized clinicians, respectively. However, only 71% of SOE candidates and 75% of OSCE candidates considered themselves to be able to demonstrate their knowledge and skills without obstacles. When compared with their in-person experiences, approximately 40% of SOE examiners considered virtual evaluation to be more difficult than in-person evaluation and believed the remote format negatively affected their development as an examiner. The virtual format was considered to be less secure by 56% and 40% of SOE and OSCE examiners, respectively. The retirement of exam materials used virtually due to concern for compromise had implications for subsequent exam development. The return to in-person exams in 2022 was prompted by multiple factors, especially concerns regarding standardization and security. 
The technology is not yet perfect, especially for testing in-person communication skills and displaying dynamic exam materials. Nevertheless, the American Board of Anesthesiology’s experience demonstrated the feasibility of conducting large-scale, high-stakes oral and performance exams in a virtual format and highlighted the adaptability and dedication of candidates, examiners, and administering board staff.

Introduction

The third and final step in the American Board of Anesthesiology’s (ABA; Raleigh, NC) staged examination process for initial certification [1,2,3,4] is the APPLIED Examination, which consists of two components, a Standardized Oral Examination (SOE) [3] and an Objective Structured Clinical Examination (OSCE) [4,5,6]. The SOE includes two 35-minute sessions, during which candidates answer examiners’ guided questions about the scientific rationale underlying clinical management decisions [3]. The OSCE includes five Communication & Professionalism stations based on clinical scenarios, in which candidates interact with actors playing standardized patients or standardized clinicians, and two Technical Skills stations, during which candidates are asked to interpret monitors or echocardiograms, or to apply ultrasonography [4]. The APPLIED Examination is usually administered at a dedicated assessment center in Raleigh, North Carolina, to approximately 2000 candidates per year. Candidates must pass both components to pass the APPLIED Examination; they may retake any failed component(s). In 2020, disruption associated with the COVID-19 pandemic led to the cancellation of the on-site examinations, requiring the development and implementation of a remote, internet-based form of both components of the examination – the ABA Virtual APPLIED Examination (VAE).

The processes used to develop the ABA VAE and deliver it using the Zoom platform (Zoom Video Communications, San Jose, CA) have been described in detail [7]. We have also reported the psychometric performance of the VAE by comparing the virtual formats of the SOE and the OSCE with their in-person equivalents [8, 9]. Candidate performance and examiner grading severity were comparable between the in-person and virtual formats for both the SOE and the OSCE, supporting the reliability and validity of the virtual examinations, although OSCE scenarios delivered virtually were more difficult than those delivered in person. In this paper, we detail the challenges, successes, and failures of the operational logistics and administration of this high-stakes, career-defining physician assessment during the pandemic. In addition to a narrative review, we provide survey-generated performance data for our communication strategies, technology infrastructure, and staffing models. Further, we present candidate and examiner perceptions of the examination infrastructure and their exam experiences. The practicalities of virtually delivering the ABA’s certifying examination during the pandemic documented in this paper will provide insight for other assessment organizations that may need to deploy a large-scale, high-stakes, virtual performance exam in the future.

Candidates and examiners

Candidates had completed Accreditation Council for Graduate Medical Education (Chicago, IL)-accredited anesthesiology residency training and had passed the ABA BASIC [1] and ADVANCED [2] examinations. The majority of VAE candidates completed their residency in 2019 or 2020. Those who graduated in 2019 and had their APPLIED Examinations canceled in 2020 were examined during VAE pilot testing in December 2020 or during one of nine weeks between February and April 2021 (VAE Window 1). Residency graduates from 2020 were examined between July and November 2021 (VAE Window 2). Examiners were volunteer ABA board-certified anesthesiologists who were clinically active and participating in the Maintenance of Certification in Anesthesiology program [10, 11]. Examiners for in-person and virtual SOEs and OSCEs were drawn from the same examiner pool.

Surveys

Evaluation of VAE candidate and examiner experiences was planned prior to VAE implementation; survey results are described in the relevant sections of this narrative review. The surveys were determined by the WCG Institutional Review Board (Puyallup, WA) to be exempt from review.

All candidates and examiners who participated in either or both components of the VAE were invited by email to respond to anonymous online surveys, using SurveyMonkey (San Mateo, CA). Candidates and examiners who had both in-person and virtual experiences were invited to complete separate online surveys that specifically queried how their virtual SOE or OSCE experiences compared with their previous in-person examinations, using QuestionPro (Beaverton, OR). Most virtual candidates were first-time takers of the APPLIED Examination and so could not compare in-person and virtual experiences, but VAE Window 1 included some candidates who had previously failed the in-person SOE, or the in-person OSCE, or both. Candidates were invited to their survey(s) within days of examination administration and could respond until exam results were released. Examiners received survey invitations after concluding their assigned examination weeks. Full or partial completion of the survey was taken as an indication of consent for participation.

Both candidates and examiners were asked about their perceptions of communications from the ABA before the examinations, their experiences with the technology infrastructure required to conduct the exams, and their reflections on the process of taking or conducting/scoring the exams. In addition to Likert-scale questions, respondents could provide free-text comments in response to multiple open-ended questions. The candidate and examiner comparison surveys focused on how the virtual delivery of the SOE or OSCE affected their preparation effort, the perceived professionalism of others, the interaction between candidates and examiners, and their ability to demonstrate or evaluate the qualities that the SOE or OSCE is designed to assess.

Of the 3059 candidates who took the VAE, 1452 (47%) responded to the VAE survey. The vast majority of these candidates (95%) had taken both virtual SOE and virtual OSCE, almost all on the same day. Of the 228 candidates who had previously failed an in-person SOE, 113 (50%) responded. Of the 56 candidates who had previously failed an in-person OSCE, 28 (50%) responded. Among 317 examiners who had conducted both the in-person and virtual SOEs and were invited to the SOE examiner comparison survey, 201 (63%) completed the survey. Among 254 examiners who scored both the in-person and virtual OSCEs and were invited to the OSCE examiner comparison survey, 170 (67%) completed the survey.

VAE pilot

Eighty-eight (88) candidates voluntarily participated in a pilot administration of the VAE over two days in December 2020, and all took both the SOE and the OSCE. Fifty-two (52) SOE examiners examined a median of 8 candidates each (range 1–8), and 43 examiners scored a median of 14 OSCE candidate-stations (range 3–42). Those candidates who passed both SOE and OSCE became ABA certified; those who failed one or both components were offered an opportunity to retake the exam in early 2021. As previously described, the pilot went sufficiently well that the full-scale VAE proceeded, but many opportunities for improvement were identified and addressed before the first operational administration in February 2021 [7]. Pilot administration survey results are reported in Supplemental Tables 1 and 2.

Based on survey feedback from the pilot, changes to the SOE process included the provision of a “one minute remaining” warning to examiners and candidates, a virtual transition room for candidates after their interactions with examiners concluded and before they were disconnected from Zoom, and a smoother exam end for examiners (rather than abruptly ending their Zoom session). For the OSCE, modifications were made to improve the on-screen scenario display, to make transitions between stations smoother, and to explain the technical station process more clearly. Lastly, pre-examination information and preparatory materials for operational virtual exam weeks were revised and sent to candidates and examiners sooner than for the pilot.

VAE logistics

The operational administration of the ABA VAE began on Feb. 1, 2021, and continued through Nov. 18, 2021. In this period, 3059 candidates were examined, including 2916 taking both the SOE and OSCE, 95 taking the SOE only (83 for Part 2 in the traditional exam system and 12 for the SOE only in the staged exam system), and 48 taking the OSCE only. The total number of candidates examined in 2021 was 1.7 times the candidate volume examined in a typical year (Table 1). This cleared the backlog of candidates whose exams were postponed because of the pandemic (and who chose to take the exam in 2021), while examining those who became eligible to take their exams in 2021. Four hundred (400) examiners participated in the VAE, with 340 administering and scoring SOEs, 279 scoring OSCEs, and 219 participating in both.

Table 1 Virtual APPLIED Exam delivery results and irregularities

During Window 1 (February – April 2021), candidates were examined on 3 or 4 weekdays per week between 7:30 am and 7:30 pm Eastern Time over 6 exam periods per day. For Window 2 (July – November 2021), candidates were examined Monday through Thursday between 8:30 am and 4:30 pm Eastern Time, using the pre-pandemic in-person schedule of 4 periods per day and 4 days per week. The additional periods 5 and 6 scheduled during Window 1 were to accommodate candidates in the Mountain and Pacific time zones and candidates who needed to reschedule on the same day due to technical issues. These periods were under-utilized in Window 1 and were thus not scheduled during Window 2.

Five ABA APPLIED Examination staff engaged in exam scheduling and supporting exam delivery, supplemented by 19 temporary staff hired for the role of exam facilitators. The organization of examination support staff has been described previously [7]. Twenty-eight (28) professional actors played the roles of standardized patients and/or standardized clinicians. Examinations were scheduled across 5 time zones, with 15 virtual SOE and 14 virtual OSCE “rooms” running simultaneously. In addition, we accommodated a small number of candidates in other time zones who were deployed overseas on active-duty military service.

Irregular events

Contingency plans were in place for many – but not all – eventualities. For example, during Week 3 there was a relatively brief power outage at the assessment center, from which the VAE was coordinated, causing the postponement of 12 SOEs and 10 OSCEs. Early in the administration, a few candidates were examined by a single examiner in one of their SOE sessions because the other examiner experienced technical difficulties and there was not enough time to bring in a replacement examiner. These candidates were later scored asynchronously by a second examiner based on examination recordings (Table 1). In each case of “irregularity”, ABA Directors discussed the details with APPLIED Exam staff to understand the nature of the deviation from the norm and decided whether an SOE session or OSCE station should be allowed to proceed, be invalidated, or be rescheduled, with the intention of favoring candidates to a reasonable extent. For example, if a technical issue (e.g., audio problems because of poor internet connectivity or suboptimal microphone recording) caused an individual OSCE station to be unscoreable, the candidate was awarded the highest possible score for that station as long as their six other OSCE stations were scored under “normal” conditions. In situations where communication between a candidate and examiners during an SOE session was disrupted beyond an acceptable threshold, the candidate was given the opportunity to re-test on the same or a subsequent day.

Standards and pass rates

Scoring of the SOE and the OSCE uses the many-facet Rasch model; the techniques have been described previously [3, 4, 12]. We have also described the psychometric performance of the virtual SOE and virtual OSCE [8, 9]. After the first cohort of candidates completed the VAE in February 2021, a standard-setting exercise took place for the virtual OSCE because of its structural change from the in-person OSCE (e.g., the ultrasound station was excluded from the virtual OSCE), and the resulting standard was used throughout 2021. For the virtual SOE, the existing standard was reviewed and retained, because no substantial change other than the transition to the virtual format itself warranted a new standard. Maintenance of the existing standard was subsequently validated by the stable performance of the virtual SOE candidates. Operational pass rates for both components of the VAE were similar to those seen in previous and subsequent years for in-person exams (Fig. 1).
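For context, the many-facet Rasch model extends the basic Rasch model by adding an examiner-severity facet, so that candidate ability estimates can be adjusted for how harshly or leniently individual examiners grade. A common rating-scale formulation (a general sketch of the model; the ABA's operational facet structure is specified in the cited references and may include additional facets) is:

```latex
% Many-facet Rasch model (rating-scale form):
% P_{nijk} = probability that candidate n, rated by examiner j on item i,
%            receives rating category k rather than category k-1
\log\!\left(\frac{P_{nijk}}{P_{nij(k-1)}}\right) = B_n - D_i - C_j - F_k
% B_n : ability of candidate n
% D_i : difficulty of item (case or scenario) i
% C_j : severity of examiner j
% F_k : threshold of rating category k relative to category k-1
```

Because all facets are estimated jointly on a common logit scale, candidate measures are adjusted for examiner severity and item difficulty, which is what permits comparisons of candidate performance and grading severity across the in-person and virtual formats [8, 9].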

Fig. 1

2021 virtual Standardized Oral Examination (SOE) and Objective Structured Clinical Examination (OSCE) pass rates in comparison to those of in-person APPLIED Exams. Note: the 2020 pass rates include a one-week March 2020 in-person exam and a December 2020 two-day virtual pilot exam

Communications from the ABA

A multi-pronged approach was taken to inform candidates and examiners of the details of the VAE, including live webinars with question-and-answer sessions, websites with downloadable materials including procedure manuals and infographics, SOE- and OSCE-specific briefing videos, email communications from a dedicated APPLIED Exam e-mail address, and a telephone helpline [7]. The timeliness of delivery of materials improved for the operational VAE compared with the pilot administration, and both candidates and examiners found each type of material provided to be useful (Table 2). However, despite the early availability of preparatory materials, 3% of 2021 virtual candidates still perceived that materials and examination information were late and “left them scrambling at the last minute” (compared with 8% in the pilot, Supplemental Table 2). Anecdotally, ABA e-mails blocked by institutional firewalls or directed to spam folders accounted for many of these difficulties, despite advice to candidates to add the ABA’s dedicated e-mail address to their safe sender lists. The detailed candidate- and examiner-specific procedure manuals were perceived as the most useful preparatory materials.

Table 2 Candidate and examiner perceived utility of the ABA-provided exam materials

Technology infrastructure

The Zoom platform was used to administer the VAE; an up-to-date Google Chrome web browser was required to generate the videoconference links and to display exam-related material [7]. Both candidates and examiners were required to conduct a “system check” before the day of the examination. The overwhelming majority of systems passed on the first attempt or after minor adjustments. Fewer than 1% of participants (candidates and examiners) had to change either their computer system or their intended physical location for the exam because of the system check (Table 3).

Table 3 Candidate and examiner experiences with technology

The majority of candidates (82%) and examiners (87%) “always” or “often” started their exams at the scheduled time; still, a small percentage of candidates (6%) and examiners (3%) “never” started their exams on time (Table 3). Some delays were related to technical difficulties on the first day of the first examination administration week in February 2021, when a coding problem resulted in failure to generate some Zoom meeting links for examiners. Others occurred intermittently over the course of the VAE administration and were almost entirely related to internet connectivity or participant audiovisual problems.

Internet connectivity was generally reliable — 84% of candidates and 77% of examiners never lost connectivity during the exam. For those who reported “sometimes” losing connectivity during the exam (14% of candidates and 23% of examiners), their loss of connectivity was likely on a continuum ranging from audio and/or video interruption lasting a few seconds to a complete inability to continue the exam (Table 3). For 74 (2.5%) SOE candidates and 45 (1.5%) OSCE candidates, connectivity problems were severe enough that their exams had to be rescheduled (Table 1). A higher percentage of examiners reported connectivity issues, perhaps explained by the fact that examiners had more chances for exam interruption than candidates – typically conducting SOEs for multiple candidates. It is worth noting that 10% of candidates were aware of technical issues on the part of their examiners (each candidate is examined by a total of four SOE examiners). On the other hand, about one-third of candidates and examiners reported that Zoom exceeded their expectations, with examiners (36%) being more likely to be unexpectedly impressed than candidates (28%; Table 3), perhaps reflecting a generational divide in the familiarity with audio-visual interfaces [13].

Candidates’ more granular evaluations revealed that they experienced “significant issues” most frequently at the OSCE technical stations (interpretation of monitors and interpretation of echocardiograms) at 6%, followed by OSCE stem display (4%) and OSCE stem preparation time (4%; Table 3). Despite explicit instructions in the candidate procedure manual and the availability of the exam facilitator administering the technical stations to troubleshoot, this small proportion of candidates had difficulty navigating the combination of Zoom and internet browser windows in the technical stations. This may have contributed to some reports of insufficient OSCE station preparation time. As previously mentioned, in circumstances of obvious technical disruption, OSCE stations were rated in the candidate’s favor.

Perceptions of the virtual SOE

The vast majority of candidates indicated agreement with statements supporting the professionalism of the examiners (93%) and “smooth” interactions with examiners (84%; Table 4). About 70% of SOE candidates agreed that the virtual SOE allowed them to demonstrate their knowledge and skills without obstacles and that it effectively measured their ability to analyze clinical situations, adapt to changing clinical scenarios, make appropriate clinical judgments, and organize and present information; the remaining 30% were neutral, disagreed, or strongly disagreed with these statements. Notably, surveys were answered before exam results were released: two-thirds of candidates reported a positive experience of the virtual SOE, approximately 30% were neutral, and 4% reported a negative experience.

Table 4 Candidate and examiner experiences of the virtual Standardized Oral Examination (SOE)

Although cases and guided questions were of similar difficulty between the in-person and virtual SOEs, examiners tended to perceive the virtual format as a more difficult experience for both candidates (38% considered virtual more difficult vs. 25% easier) and examiners (52% considered virtual more difficult vs. 11% easier). 45% of SOE examiners believed that their interactions with virtual candidates were not as effective as in person, and 39% found it more challenging to evaluate candidates virtually. In addition, 56%, 47%, and 41% of examiners, respectively, cited a less secure exam, “Zoom fatigue”, and reduced examiner development as additional negative aspects of the virtual SOE (Table 4).

Of the 113 “repeaters” who took both the in-person and virtual SOEs, 28% reported that the web-based nature of the virtual SOE prompted them to increase their preparatory efforts (Supplemental Table 3). It is unclear whether this increased effort was related to the virtual format of the exam or to the fact that they were re-attempting it, although 65% of this cohort reported a level of preparation similar to that for their previous failed attempt. Despite their favorable view of the professionalism exhibited by their examiners, more candidates considered the virtual format to have hindered rather than helped their interaction with examiners. Candidates’ perceptions of whether the web-based nature of the virtual SOE affected their ability to demonstrate their proficiency were almost evenly split: 23% perceived a positive effect, 20% a negative effect, and 57% were neutral. This poses an interesting question: is the proximity and immediacy of an in-person examination a necessary stressor that helps identify candidates who will demonstrate the attributes of an ABA diplomate in pressurized clinical situations, or an artificial impediment to the demonstration of a candidate’s ability because of nervousness or performance anxiety [14, 15]?

Perceptions of the virtual OSCE

79% of virtual OSCE candidates indicated that the actors playing standardized patients or standardized clinicians portrayed the clinical scenarios authentically and 89% had “smooth” interactions with them (Table 5). These data are reassuring in the context of an exam requiring complex interactions with actors in multiple stations and quick task and role switching between stations. Other responses were less reassuring — while 75% of OSCE candidates considered themselves able to demonstrate their knowledge and skills without obstacles, up to 30% of OSCE candidates were at best neutral regarding the virtual OSCE’s ability to effectively measure their communication skills and professionalism, and more than half of respondents were at best neutral that the virtual OSCE effectively measured their technical skills.

Table 5 Candidate and examiner experiences of the virtual Objective Structured Clinical Examination (OSCE)

Examiners perceived the virtual OSCE to be more difficult for the candidates than the in-person format (41% believed the virtual format to be more difficult vs. 12% easier). 26% of examiners considered the virtual OSCE to be worse than the in-person format for allowing candidates to demonstrate their proficiency, and the other 74% were neutral. Although a few examiners disagreed with statements that the virtual OSCE effectively measured candidate attributes (disagreement rates for technical skills, professionalism, and communication skills were 8%, 4%, and 3%, respectively), examiners overall had more faith than candidates in the virtual OSCE’s ability to effectively measure candidate attributes (Table 5). In comparison with the in-person OSCE, examiners were concerned about the virtual OSCE’s security (40% less secure vs. < 1% more secure), the authenticity of actors portraying the clinical scenarios in the virtual format (17% less authentically vs. 3% more authentically), candidates’ ability to demonstrate their proficiency (26% worse vs. 0% better), and their own ability to evaluate candidate performance (18% more difficult vs. 6% easier; Table 5).

Of the 28 virtual OSCE respondents who had previously taken the in-person OSCE, 15 (54%) reported more preparatory effort for the virtual OSCE than for their previous in-person OSCE. Although 89% of OSCE retakers thought the actors portrayed the scenarios authentically in the virtual format, about 40% of them felt that the virtual OSCE made their interactions with the actors more difficult than in person and negatively affected their ability to demonstrate proficiency (Supplemental Table 4). This recollection is consistent with our previous psychometric analysis showing that OSCE scenarios were more difficult when administered virtually than in person [9]. 46% of OSCE retakers agreed that the virtual OSCE effectively measured their communication skills (vs. 39% who disagreed) and professionalism (vs. 29% who disagreed). Of even more concern, only 25% agreed that the virtual OSCE effectively measured their technical skills (vs. 50% who disagreed), and 59% considered the virtual technical stations more difficult than the in-person format (vs. 0% easier).

Security

Security concerns associated with high-stakes virtual exams are well-recognized [16, 17]. The direct, real-time audiovisual interactions of the VAE mitigate some of the concerns associated with computer-based written examinations. Nonetheless, multiple possibilities for security breaches remain, including taking screenshots or video of exam scenarios, monitor loops, or echocardiogram images for later distribution, or surreptitiously receiving real-time aid from an unseen helper. At the assessment center, in-person candidates are prohibited from bringing electronic devices or other possible aids into the orientation or examination rooms. Such strict controls are not possible in the virtual format. The presumption was that candidates would adhere to the legally binding agreement they signed at examination registration and act honestly, honorably, and professionally. Some exam audiovisual recordings were reviewed by ABA Directors when concerns regarding irregular behavior or possible cheating were raised by examiners or staff. No cheating was detected, and no candidates had their VAE invalidated due to exam misconduct, although absence of evidence of cheating is not necessarily evidence of its absence. Of note, typically 2 to 3 candidates per year have their ABA in-person written exams invalidated because of breaches of the exam rules. Virtual exams pose inherent challenges in ensuring a fully secure environment — 56% of SOE examiners and 40% of OSCE examiners believed that the virtual SOE and virtual OSCE, respectively, were less secure than their in-person equivalents. The remaining examiners thought that security was similar between the two formats (Tables 4 and 5). Security concerns did influence one major operational decision: while SOE guided questions and OSCE scenarios from in-person administrations remain active in the examination bank, the exam materials used virtually were discarded because of the potential for compromise.
This considerably increases the cost and effort associated with generating new scenarios and questions for future examinations.

Examiner mentorship

Examiners reported that training, mentorship, networking, and camaraderie suffered in the virtual format (Table 4), a finding also reflected in many free-text comments not reported here. The significance of this should not be underestimated. An excellent examiner pool requires examiners’ sustained commitment to examine and continuous guidance and advice from examiner mentors. Networking opportunities are highly valued by examiners, and long-standing professional relationships and friendships are made and maintained over time. Even within a single year of remote examinations, it was clear that examiner interaction suffered from not being together on-site. 71% of the virtual SOE examiners reported that their interactions with fellow examiners were worse than when in person, and 41% believed that their development as an examiner was negatively affected by the virtual format. These findings could have adverse implications for future exams. Other American Board of Medical Specialties (ABMS) member boards have reported similar sentiments, and the American Board of Emergency Medicine (personal communication, 2023) and the American Board of Obstetrics and Gynecology [18] changed – at least temporarily – to a hybrid exam model in which examiners are physically together in an examination center and candidates across the country are examined remotely.

Return to in-person examinations

Determination of whether a physician possesses the attributes required to achieve certification by a medical specialty board is a high-stakes endeavor with implications for the physician, the specialty, and the public [19, 20]. The ABA successfully administered the VAE to its 3059 candidates in a single year, 2724 of whom became certified in anesthesiology, in a manner both practically feasible and psychometrically reliable [8, 9]. As vaccination against COVID-19 became widely available and pandemic-related disruptions lessened, the ABA, despite the successes of the VAE, returned to in-person testing for the APPLIED Examination in February 2022. Universal masking, the requirement of proof of vaccination for examiners and candidates (with some exceptions for candidates), COVID-19 contact tracing, and other infection prevention and mitigation measures were enforced. The planned six weeks of exams in 2022 were successfully completed for 2165 candidates, and similarly, 2117 candidates were examined in-person in 2023. Other certifying boards faced similar decisions about returning to in-person or staying virtual and made the same or different choices [18, 21,22,23,24,25,26].

The ABA’s return to the in-person format for the APPLIED Examination was based on several considerations, with standardization and security foremost. The variability of internet speed, quality, and reliability across the country means that candidates have a less standardized experience than when the exam is taken in person. This inconsistency increases the potential for construct-irrelevant variance due to technical disruptions and technology-related candidate or examiner anxiety [27]. Despite steps taken to mitigate the risks, virtual exams are more open to breaches of security than those conducted in the tightly controlled in-person environment of the assessment center. There are additional concerns about the ability to remotely test communication and professionalism skills that must be exercised in person during daily practice, and the virtually administered OSCE technical stations appear to be more difficult [28]. Current technology does not allow some important content to be assessed in a virtual format, such as physically acquiring and interpreting ultrasound images. Other potential domains for future assessment, such as team-based and multi-disciplinary collaborative assessment, would also be difficult to conduct in a non-standardized environment.

Development and implementation of the VAE required substantial short-term investment, including the financial implications of reassigning ABA personnel from other projects to work on the VAE, the costs of software development and licensing fees, and the need to hire temporary staff to act as exam facilitators. In keeping with reports from other ABMS member boards [16, 18], the virtual examination was very labor-intensive for ABA staff. The ABA’s examiner-associated travel costs (flights, hotels, meals, etc.) were lower during the VAE. Weighed against this, however, was the clearly articulated diminished examiner experience, which raises concerns about volunteer examiner engagement, mentorship, and sustainability. Candidate expenses are lower for the virtual format: travel and hotel costs are eliminated, and time away from work is likely reduced. The ABA is not unsympathetic to these considerations for early-career anesthesiologists, but must weigh all factors holistically to ensure the fairness, integrity, and sustainability of the certification examination process.

Experiences of other ABMS member boards

Until the introduction of an in-person OSCE by the American Board of Urology in 2023, the ABA was the only ABMS member board to utilize an OSCE. However, 14 ABMS member boards used some form of oral examination as a component of their initial certification process, and all converted to virtual delivery during the pandemic [20, 29]. Several ABMS member boards, including the American Boards of Emergency Medicine, Obstetrics and Gynecology, Ophthalmology, and Surgery, have published their experiences with virtual oral exams, with varying levels of detail [18, 21, 23,24,25]. For some boards, the virtual exams differed in nature from their respective in-person exams (e.g., major changes in the duration of exam sessions and the number of examiners each candidate sees, or abandonment of the use of images or videos). While the ABA managed the technology infrastructure in-house and directly hired temporary staff for exam proctoring and administration, some boards utilized vendors to manage those responsibilities. Post-pandemic plans also vary: some boards have transitioned back to in-person exams [30, 31], others have continued the virtual format [23, 32], and still others were exploring alternative formats such as bringing examiners physically together at an exam center to examine remote candidates [33].

Lessons learned

The year-long administration of the ABA VAE demonstrated the capability of a medical specialty certifying board to remotely deliver high-stakes SOEs and OSCEs to thousands of physicians in a practically feasible and psychometrically rigorous manner. The technology required for internet-based examinations worked, albeit imperfectly and with concerns about lack of standardization. The urgent nature of its development meant that the VAE required a rapid conversion of well-established in-person examination procedures to a remote format; under less exigent circumstances, more innovative methods of remote assessment could be designed, pilot-tested, introduced, and refined over a longer timeline. Multiple methods of communicating with candidates and examiners are necessary. Minor practical details can have significant implications, especially where technology interfaces with humans under stressful, time-sensitive circumstances. Although cheating was neither expected nor detected, examination security remains a major concern. The VAE experience documented the cognitive and practical difficulties associated with remote exam delivery, highlighted the dedication of the ABA staff and a large cohort of volunteer anesthesiologist examiners, and reinforced the value of the in-person examiner experience. Finally, and perhaps most importantly, the VAE experience demonstrated the resilience of the early-career anesthesiologists who adapted to multiple changes and prepared for this major professional milestone despite their examinations, careers, and lives being disrupted by the COVID-19 pandemic.

Data availability

The datasets generated and/or analyzed during the current study are not publicly available due to the confidentiality and sensitivity of the survey data and the terms stated to the survey respondents, but they are available from the corresponding author upon reasonable request.

References

  1. Zhou Y, Sun H, Lien CA, et al. Effect of the BASIC examination on knowledge acquisition during anesthesiology residency. Anesthesiology. 2018;128:813–20.

  2. Zhou Y, Sun H, Macario A, Martin DE, Rathmell JP, Warner DO. The American board of anesthesiology’s staged examination system and performance on the written certification examination after residency. Anesth Analg. 2019;129:e159–62.

  3. Sun H, Warner DO, Patterson AJ, et al. The American board of anesthesiology’s standardized oral examination for initial board certification. Anesth Analg. 2019;129:1394–400.

  4. Warner DO, Isaak RS, Peterson-Layne C, et al. Development of an objective structured clinical examination as a component of assessment for initial board certification in anesthesiology. Anesth Analg. 2020;130:258–64.

  5. Warner DO, Lien CA, Wang T, et al. First-year results of the American board of anesthesiology’s objective structured clinical examination for initial certification. Anesth Analg. 2020;131:1412–8.

  6. Wang T, Sun H, Zhou Y, et al. Construct validation of the American board of anesthesiology’s APPLIED examination for initial certification. Anesth Analg. 2021;133:226–32.

  7. Keegan MT, McLoughlin TM Jr, Patterson AJ, et al. A coronavirus disease 2019 pandemic pivot: development of the American board of anesthesiology’s virtual APPLIED examination. Anesth Analg. 2021;133:1331–41.

  8. Keegan MT, Harman AE, Deiner SG, Sun H. A comparison of psychometric properties of the American board of anesthesiology’s in-person and virtual standardized oral examinations. Acad Med. 2024; Epub ahead of print.

  9. Sun H, Deiner SG, Harman AE, Isaak RS, Keegan MT. A comparison of the American Board of Anesthesiology’s in-person and virtual objective structured clinical examinations. J Clin Anesth. 2023;91:111258.

  10. Culley DJ, Sun H, Harman AE, Warner DO. Perceived value of Board certification and the Maintenance of Certification in Anesthesiology Program (MOCA®). J Clin Anesth. 2013;25:12–9.

  11. Zhou Y, Sun H, Macario A, et al. Association between performance in a maintenance of certification program and disciplinary actions against the medical licenses of anesthesiologists. Anesthesiology. 2018;129:812–20.

  12. Linacre J. Many-Facet Rasch Measurement. 2nd ed. Univ of Chicago Social Research; 1994.

  13. Calvo-Porral C, Pesqueira-Sanchez R. Generational differences in technology behaviour: comparing millennials and Generation X. Kybernetes. 2020;49:2755–72.

  14. Shebrain S, Nava K, Munene G, Shattuck C, Collins J, Sawyer R. Virtual surgery oral board examinations in the Era of COVID-19 pandemic. How I Do it! J Surg Educ. 2021;78:740–5.

  15. Manisundaram AD, Sathyanarayanan S, Govande JV, Marques ES, Nguyen PD. Cross-institutional virtual mock oral examination: a new paradigm? Plast Reconstr Surg Glob Open. 2023;11:e4822.

  16. Jones AT, Barry CL, Ibáñez B, LaPlante M, Buyske J. The development of a virtual pilot for the American board of surgery certifying examination. Am J Surg. 2021;221:764–7.

  17. Hurtz GM, Weiner JA. Comparability and integrity of online remote vs. onsite proctored credentialing exams. J Appl Test Technol. 2022;23:36–45.

  18. Shivraj P, Chadha R, Dynis D, et al. The American Board of Obstetrics and Gynecology’s remote certifying examination: successes and challenges. AJOG Glob Rep. 2022;2:100136.

  19. Lipner RS, Hess BJ, Phillips RL Jr. Specialty board certification in the United States: issues and evidence. J Contin Educ Health Prof. 2013;33(Suppl 1):S20–35.

  20. American Board of Medical Specialties. What is the value of ABMS board certification? https://www.abms.org/board-certification/value-of-board-certification/. Accessed 21 Dec 2023.

  21. Bartley GB. COVID-19 and the American board of ophthalmology: When the best-laid plans go awry. Ophthalmology. 2020;127:e53–4.

  22. American Board of Physical Medicine and Rehabilitation. ABPMR moves Part II examination to virtual administrations this fall. 2020. https://www.abpmr.org/NewsCenter/Detail/part-ii-exam-virtual-2020. Accessed 21 Dec 2023.

  23. Chen H, Tseng JF, Chaer R, et al. Outcomes of the first virtual general surgery certifying exam of the American board of surgery. Ann Surg. 2021;274:467–72.

  24. Huber TS, Brown KR, Lee JT, et al. Implementation of the vascular surgery board virtual certifying examination. J Vasc Surg. 2022;76:1398–404.e4.

  25. Chudnofsky CR, Reisdorff EJ, Joldersma KB, Ruff KC, Goyal DG, Gorgas DL. Early validity and reliability evidence for the American board of emergency medicine virtual oral examination. AEM Educ Train. 2023;7:e10850.

  26. Royal College of Anaesthetists. Primary FRCA OSCE. https://rcoa.ac.uk/examinations/primary-frca-examinations/primary-frca-osce. Accessed 21 Dec 2023.

  27. American Educational Research Association, American Psychological Association, National Council on Measurement in Education. Standards for educational and psychological testing. 2014.

  28. Ramsingh D, Bronshteyn YS, Haskins S, Zimmerman J. Perioperative point-of-care ultrasound: from concept to application. Anesthesiology. 2020;132:908–16.

  29. American Board of Medical Specialties. ABMS member boards oral exams go virtual on July 1, 2020. https://www.abms.org/newsroom/abms-member-boards-oral-exams-go-virtual/. Accessed 21 Dec 2023.

  30. The American Board of Obstetrics and Gynecology. 2023 Specialty Certifying Examination Bulletin. https://www.abog.org/docs/default-source/bulletins/2023/2023-certifying-examination-in-obstetrics-and-gynecology-12.19.2022a7c4eaec-0c0d-4380-8f1d-433a8364c834.pdf?sfvrsn=ab15f5b9_3. Accessed 21 Dec 2023.

  31. The American Board of Plastic Surgery. Quick tips: oral exam. https://www.abplasticsurgery.org/candidates/oral-examination/quick-reference-tips-oral-exam-candidates/. Accessed 21 Dec 2023.

  32. Bartley GB, Schnabel SD, Comber BA, Gedde SJ. The American board of ophthalmology virtual oral examination: crisis catalyzing innovation. Ophthalmology. 2021;128:1669–71.

  33. The American Board of Neurological Surgery. American board of neurological surgery update. https://nsadmin.org/wp-content/uploads/2023/06/ABNS-Update-2023.pdf. Accessed 21 Dec 2023.

Acknowledgements

The authors thank the directors and staff of the American Board of Anesthesiology (ABA), especially the ABA Virtual Exam Task Force, for their essential roles in developing and implementing the virtual APPLIED Examination. They appreciate the candidates who took the virtual APPLIED Examination during the pandemic, and are extremely grateful for the dedicated service of the ABA volunteer examiners, standardized patients, and members of the Standardized Oral Examination and Objective Structured Clinical Examination Committees.

Funding

Support was provided solely from institutional and/or departmental sources.

Author information

Contributions

M.T.K. contributed directly to the conception, design, and implementation of the work, drafted and revised the manuscript, and approved the final version. A.E.H., T.M.M. Jr., A.M., S.G.D., R.R.G., D.O.W., S.S., and H.S. each contributed directly to the conception, design, and implementation of the work, critically revised the manuscript for important intellectual content, and approved the final version.

Corresponding author

Correspondence to Huaping Sun.

Ethics declarations

Ethical approval

In accordance with the Code of Federal Regulation 45 CFR 46 and associated guidance, the WCG Institutional Review Board (Puyallup, WA) determined that this project does not require IRB review, and provided a waiver of the need for written informed consent from the participants. All the methods analyzing physician responses to the surveys were carried out in accordance with the Declaration of Helsinki and relevant guidelines and regulations.

Consent for publication

Not applicable.

Competing interests

Ann E. Harman and Huaping Sun are staff members of the American Board of Anesthesiology (ABA). Stacie G. Deiner is an ABA Director, receives an honorarium for her participation in ABA activities, and has given expert witness testimony. Robert R. Gaiser, Mark T. Keegan, Alex Macario, and Thomas M. McLoughlin Jr. are ABA Directors and receive honoraria for their participation in ABA activities. Santhanam Suresh and David O. Warner are former ABA Directors.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Keegan, M.T., Harman, A.E., McLoughlin, T.M. et al. Administration of the American Board of Anesthesiology’s virtual APPLIED Examination: successes, challenges, and lessons learned. BMC Med Educ 24, 749 (2024). https://doi.org/10.1186/s12909-024-05694-7
