This study examined whether a blended assessment combining CBEs and OBEs can reduce test anxiety and enhance academic performance among dental students. It also aimed to determine whether a blended assessment can discriminately measure dental students' learning outcomes by comparing their academic achievement under this form of assessment with that under traditional assessment.
The findings revealed that the anxiety level for the OBEs did not decrease compared to that for the CBEs. In previous studies, OBEs have generally been associated with lower test anxiety [5], but in this study, anxiety about the OBEs was slightly higher than that about the CBEs. This suggests that the students felt no less anxious in the OBEs than in the CBEs and experienced the same level of tension during the OBEs. Many dental professors may worry that OBEs lessen students’ tension and thereby undermine exam preparation compared to CBEs. However, the students’ perceived responses indicated that OBEs can serve as a meaningful evaluation that maintains appropriate tension, and that they spent as much time preparing for the OBEs as for the CBEs.
On the other hand, it has been reported that OBEs do not reduce test anxiety relative to CBEs among students who find OBEs unfamiliar and unpredictable and who feel pressured to construct answers from their own knowledge [16, 17]. Some students’ responses indicated that preparing for the OBEs was a burden, mainly because they did not understand how to prepare. Thus, informing students in advance that an exam will be open book and asking them to analyze clinical scenarios or solve problems can change how they study for the exam. In addition, medical educators should provide OB test samples so that students can prepare for OBEs before encountering them in high-stakes settings, which can reduce anxiety and help them target their preparation to the novel test format [15]. Faculty assessment workshops could also be adopted to share pilot results and practical tips for incorporating OBEs into courses.
Test anxiety has been associated with poor exam performance owing to worry about the outcome and negative emotions experienced during the test [17]. Previous studies found a significant negative correlation between test anxiety and academic achievement, including students’ GPA [18]. Interestingly, a systematic review found compelling evidence of a substantial prevalence of anxiety disorders, particularly among women and young adults, which negatively influenced their cumulative academic performance [19]. In our sample, test anxiety levels across the test formats did not differ significantly by gender; however, the mean test anxiety score among female students (M = 2.22) was slightly higher than that among male students (M = 2.05). It may be inferred that more pressure is placed on female students than on male students to succeed in school, and academic counseling for female students should therefore focus more on dealing with test anxiety [19].
In previous studies, mean scores were similar for OBEs and CBEs [20, 21]. In this study, however, the 2021 final OBE scores were, on average, lower than the 2019 final CBE scores. This suggests that this type of exam was unfamiliar to many learners; indeed, the students reported difficulty in preparing for the OBEs because it was their first time sitting this kind of test. In line with previous findings, high-achieving students showed superior cumulative GPAs under both the traditional assessment in 2019 and the blended assessment in 2021 [22]. Interestingly, the increased adjusted R-squared of the regression model indicated that the scores of lower-ranked students, who obtained lower scores in the blended test format, were better predicted under that format, suggesting that blended assessments would be more effective than traditional assessments for monitoring the low-achieving group. Thus, implementing a blended assessment could play a significant role in targeting the low-performing group and help develop dental competencies, such as critical thinking, that are difficult to acquire through rote memorization alone.
For OBEs to be discriminatory and meaningful, it is essential to design exams that require highly contextual, higher-order thinking. Notably, OBEs are designed not to evaluate learners’ knowledge per se but to evaluate their ability to solve real problems [22, 23]. Consistent with Bloom’s taxonomy, the students’ self-perceived responses to the OBEs showed that they were motivated to focus primarily on understanding and synthesizing the knowledge acquired in class [24]. Previous studies reported no significant difference between open- and closed-book assessments in evaluating lower-order thinking skills, but OBEs had a greater effect on developing higher-order thinking abilities such as applying, analyzing, evaluating, and creating [8]. While a knowledge-based class that tests learners on rote memorization makes it easy to reach lower-order learning goals, open-book assessment encourages learners to reach higher-order learning goals by synthesizing knowledge and promoting long-term memory.
Although the need to establish an online evaluation platform for dental education has been recognized, discussion and research on the topic have been inadequate [25, 26]. Dental education worldwide is confronting new challenges due to the COVID-19 pandemic, and most lectures have been switched from offline to online learning platforms to maintain social distancing [27, 28]. The pandemic has raised questions about assessment methods in health professions education [29]. In the survey, the students indicated that taking an OBE in a comfortable setting made them feel relieved and less stressed, as well as guaranteeing their individual safety from COVID-19. Because online evaluation takes place with the instructor and the learner in separate spaces, online open-book assessment can be an alternative to traditional assessment when face-to-face interaction is not possible.
The perception survey also showed that the OBEs were more effective than traditional CBEs in developing dental competencies, in terms of both reducing the learning burden and enhancing learning effects. Generally, the type of exam adopted can influence how students study for it. Students preparing for OBEs have been reported to pay more attention and integrate their knowledge during class to a greater extent than those preparing for traditional CBEs [6, 30]. In an open-ended survey on the assessment, the students perceived that the OBEs reduced the burden of memorizing unnecessary details, allowing them to spend more time on problem-solving than on rote memorization. Thus, it may be inferred that the OB test format gives students greater opportunity to think critically and develop the higher-order thinking skills required of dental graduates.
Improvements to the blended test format were also suggested. In the online OBEs, fairness was the foremost concern. While the midterm CBEs were conducted in person and supervised by instructors, the final OBEs were implemented online without physical attendance. For the online OB assessment, Turnitin, a text-matching system for electronic documents, was used to check whether the students’ work matched previously submitted work. In addition, the OB answers were evaluated against clear standards using a rubric and through a dual scoring process involving two experienced instructors. Recent research supports the view that experienced markers can differentiate between genuine work and work completed by third parties [31]. Nevertheless, as indicated by previous studies and the students’ responses, issues of fairness still need to be carefully addressed in blended assessments [20, 31].
This study is not free from limitations. Caution is required when translating its findings to other medical and dental course settings. Because the sample was a census, it was almost impossible to control all factors that threaten the validity of this nonrandomized, quasi-experimental study. The study population consisted of two different cohorts that took the same course under the same instruction. We assumed that the cohorts did not differ significantly in gender, age, or admission requirements. However, the pandemic began in 2020, which inevitably characterized the 2019 cohort as pre-pandemic and the 2021 cohort as post-pandemic. Naturally occurring changes over time may thus be confounded with intervention effects. Accordingly, before applying the findings to other health professions education contexts, convincing evidence for causal links should be sought so that instructors can weigh the trade-offs between closed- and open-book exams and verify whether factors other than test format affected student performance. To evaluate the effectiveness of this blended approach, it must be used repeatedly across different disciplines and classes.
In addition, the 2019 traditional assessment used question types associated with rote learning, such as true-or-false, multiple-choice, and short-answer questions, whereas the 2021 OB questions were designed to assess the students’ problem-solving skills. Finally, the validity of the OB assessment methods used in this study needs to be tested by verifying whether they worked as authentic, real-life-like assessments that improve students’ higher-order thinking [29]. Follow-up assessment studies should carefully monitor the roles and effects of invigilation in online exams and examine, for each exam type, the extent to which OBEs allow students to access resources, including self-made materials.