Student experience of remote online summative MCQ exam delivery during the COVID-19 pandemic was generally positive, with few students experiencing technical or practical problems. We did not find any evidence of negative effects on candidate performance.
Test anxiety
Test anxiety was an important consideration given that these exams took place during a global pandemic. Lockdown undoubtedly had unpredictable effects on individual students’ learning, revision and mental wellbeing as they adapted to a new normal [8], and this may have contributed to the heightened anxiety experienced by some students. It is interesting that the higher-stakes nature of sitting a summative assessment seems to have led many of our students to worry about access issues, particularly the initial log-in to the exam, despite their familiarity with using the system remotely. The literature suggests that remote delivery is associated with less test anxiety [1, 3], and we found that this remained the case for a number of our students, for whom individual personality traits, learning styles or relief from negative associations with exam hall settings appear to have outweighed any anxiety generated by the lockdown situation.
Test anxiety has been usefully conceptualised using distraction-based accounts, in particular attentional control theory [9]. Addressing a task requires working memory to focus attention on several pieces of relevant information while inhibiting irrelevant information. According to attentional control theory, anxiety disrupts the balance between two competing attentional systems: the stimulus-driven system, influenced by salient stimuli, becomes stronger at the expense of the goal-directed system [9]. Because working memory has limited capacity, the addition of stress reduces a candidate’s ability to use relevant information during a test, resulting in underperformance. However, despite perceptions of increased or decreased test anxiety in our study, there was no clear indication that any candidate performed better or worse than predicted from previous performance.
Test security
Although concerns have been expressed about increased opportunities for cheating during remote assessments, there is a lack of objective evidence that cheating is more widespread than in invigilated campus-based exams [10]. We did not use remote proctoring but, in line with the argument put forward by Fuller et al. [5], invested trust in our students to behave professionally during remote exams, reminding them of appropriate conduct immediately before the start of their test. Presenting items to individual candidates in random order, together with the time-limited nature of the tests, was felt to be an adequate countermeasure against attempted collaboration. Because of cohort sizes, we did not take any steps to formally verify candidate identity (e.g. by webcam), simply sending the specific exam entry passcode to students’ university email accounts immediately before the exam started. As discussed above, no candidate’s actual performance was unexpectedly greater than predicted from previous performance in formative progress tests.
Test performance
The remote online assessments were open-book by default. Arguably, the ability to quickly access and appraise relevant information to support clinical decision-making is now a vital part of modern medical practice, underpinning professional values and patient safety. Testing candidates’ ability to do this, by allowing access to resources during time-limited exams, therefore enhances the validity of these assessments. This approach is already established in must-pass examinations such as the UK Prescribing Safety Assessment [11], in which candidates have access to an online reference drug formulary.
We found that remote delivery aided candidate performance in the Year 4 exam but not in Year 5. This was most likely because Year 4 questions were less complex, covering the more basic diagnosis, investigation and management of core and common medical conditions, so it was easier for candidates to look up information to help them reach the correct answers. Our results contrast with the findings of Sam et al., who recently published a brief report of their experience using an almost identical approach to exam delivery during the pandemic but found no change in performance in any year group [12].
Item leakage
Finally, the resource implications of running summative exams remotely have to be acknowledged. All items used have effectively been released into the public domain, precluding their inclusion in future summative assessments. This approach therefore carries significant costs in terms of replenishing exam item banks.
Limitations
This was a naturalistic inquiry rather than a pre-designed cohort study, comparing results from similar but non-identical assessments across different time periods; the results should therefore be interpreted with caution.
The generalisability of our findings is limited by this being a single-institution study. In particular, we have a relatively low proportion of widening participation students, who may be more likely to be disadvantaged by the remote approach.
While our results are encouraging, it is difficult to separate the effects of remote delivery itself from those arising specifically from the lockdown context.