
Effects of a blended design of closed-book and open-book examinations on dental students’ anxiety and performance

Abstract

Background

While closed-book examinations (CBEs) have traditionally been implemented in dental education, open-book examinations (OBEs) are being introduced to assess higher-order knowledge and promote long-term retention. This study examines whether a blended assessment of CBEs and OBEs is effective in reducing dental students’ test anxiety and enhancing their academic performance.

Methods

Using a quasi-experimental research method, a blended assessment combining in-class CBEs and online OBEs was designed for a dental course. Because student assessment was ineffective in 2020, when the pandemic was at its peak, the 2020 cohort was omitted from our study; instead, two cohorts of predoctoral dental students (N = 178) enrolled in Restorative Dentistry in the spring semesters of 2019 and 2021 were included. These students were informed about the experimental design and provided written consent for data collection, thereby voluntarily participating in the survey. Their self-perceived responses to open-ended survey questions on the assessment methods were qualitatively analyzed.

Results

There was no significant difference in test anxiety between the CBEs and OBEs on any item at the 0.001 level. Traditional and blended assessments showed a similar trend of lower midterm scores followed by higher final scores, and both discriminated among students’ performance levels. In particular, the low-achieving group was better predicted by the blended assessment. An analysis of the students’ self-perceived responses produced highly topical themes, including exam burden, learning effects, and fairness issues.

Conclusions

This study confirmed the feasibility of a blended assessment that can be implemented in online and in-person educational environments and can serve as groundwork for developing new models of assessment in dental education.


Background

Closed-book examinations (CBEs) have traditionally been preferred over open-book examinations (OBEs) and are considered the standard form of assessment in medical and dental schools [1]. Owing to the challenges the COVID-19 pandemic posed to in-person instruction and to assessment in the online environment, medical educators have begun to implement OBEs as an alternative form of assessment in their online courses [2]. As views on what is required to be a competent healthcare professional change, educators must understand the pros and cons of OBEs and CBEs.

The concept of OBEs is not entirely new; it is related to the higher-order levels of the taxonomy in which Bloom et al. distinguished levels of the cognitive domain in 1956. For learners, OBEs do not stop at the simple memorization of knowledge; they also evaluate whether acquired knowledge can be used at a higher cognitive level [3]. Beyond measuring the memorization of knowledge, OBEs can evaluate higher-order cognitive abilities such as application, analysis, and evaluation [4, 5]. This type of exam stimulates students’ long-term memory and improves the effectiveness of their learning by developing the metacognition needed to retain material over longer periods [6].

However, proponents of CBEs assert that students tend to invest more time and mental effort in preparing for closed-book exams, stressing that medical students’ expert performance is closely linked to well-organized subject knowledge [2]. This argument gives rise to the question of whether open-book items can accurately measure students’ cognitive level [7]. Another problem is that when students are assessed through OBEs, the lower tension is likely to reduce how long and how much they study [7]. On the other hand, opponents of CBEs argue that this format primarily rewards storing information for quick retrieval and that rote memorization leads to superficial learning [8]. Such learning amounts to cramming, with information dumped immediately after tests [2, 7].

It remains an open question which approach is more beneficial to learning, as CBEs and OBEs have had mixed effects, exhibiting both benefits and costs [9,10,11]. Previous studies have found that dental students who take OBEs tend to become more responsible and self-directed learners [12]. A recent survey of student perceptions of OBEs in a UK dental school showed that dental students with a strong preference for OBEs reported greater learning effectiveness than those who preferred CBEs [7]. More importantly, since test anxiety may be associated with the experience of negative emotions during exams, it is important to note that OBEs can lower learners’ anxiety during assessments, thus enabling higher performance [3, 6, 7, 13].

Specifically, unlike previous studies, which have simply proposed one-way OBEs, this study compared a blended CB-OB exam format with the traditional CB format and examined the applicability of this approach in online settings. To the best of our knowledge, there is a lack of experimentally designed studies showing the effects of a blended approach of CB and OB tests on dental students’ outcomes. As competency-based dental education assesses competencies beyond the retention of dental knowledge, more attention must be paid to blended assessments in health professions education. The approach assessed here consists of, first, a CBE that evaluates concepts students should know even without resources available, followed by an open-book assessment that capitalizes on realistic problems students will be expected to solve with resources available. Specifically, the CBE format used true-or-false, multiple-choice, and short-answer questions, whereas essay questions were designed for the OBE format.

This study investigated whether a blended approach involving in-class CBEs and online OBEs is more effective than traditional assessment in dental education. In 2020, when the pandemic was at its peak, every dental class was conducted online and instructors were unprepared to assess students’ learning online; thus, the 2020 cohort for the same course was omitted from our study. Instead, we compared different cohorts taking the same course led by the same instructors: the pre-pandemic 2019 class, which used CBEs, and the 2021 class, held under the pandemic, which used blended CBEs and OBEs.

To confirm the feasibility of a blended approach, this study examined potential differences in students’ test anxiety between the midterm in-person CBEs and the final online OBEs in a dental course in 2021 and then compared the distribution of academic performance between the 2021 blended assessment and the 2019 traditional assessment for the same course. In particular, the 2021 final OBEs were conducted online due to the COVID-19 pandemic. Students’ self-perceived responses to open-ended questions concerning the blended assessment were qualitatively analyzed.

Methods and materials

Research design

We examined the effects of a blended assessment on dental students’ anxiety and performance using a quasi-experimental research methodology. The students were fully informed of our experimental design through the course syllabus before the class began; they provided written informed consent for data collection and voluntarily participated in the survey. The study was approved by the Institutional Review Board of the School of Dentistry, Wonkwang University, Republic of Korea (Institutional Review Board No. WKIRB-202009-BM-062), and it adhered to university policy on research with human participants.

Procedure and setting

The students who took the Restorative Dentistry course offered by the university in 2019 took the midterm and final exams as CBEs, while the students in the same course in 2021 were administered a blended CBE-OBE assessment. This course generally covers the theoretical basis and formulas of direct and indirect methods for aesthetically restoring damaged teeth. The midterm CBEs in 2021 were administered face to face, with students physically attending class just as before the COVID-19 pandemic; the time limit was one hour, and the questions included true-or-false, factual-knowledge multiple-choice, and short-answer questions.

For the 2021 final exams, the students took OBEs online with textbooks available. The OBEs consisted of three questions aimed at assessing the students’ problem-solving skills; for example, the students were asked to check a patient’s condition and work out the patient’s treatment plan. Several measures were put in place to secure the validity of the OBEs and prevent exam irregularities. During the one-hour online open-book test, the students were required to keep their cameras on and their faces visible on Zoom Meetings, a cloud-based video conferencing platform. To enable impartial evaluation, plagiarism was detected using Turnitin, a web-based plagiarism prevention service. The 2021 scores of the midterm CBEs and final OBEs were then examined and compared with the 2019 scores of the all-CBE assessment for the same course.

Data analysis

In this study, to assess the feasibility of the assessment methods, mean test anxiety levels and exam scores were compared within the same group; therefore, the researchers conducted paired-sample t-tests for the 2019 and 2021 cohorts, respectively. The study also attempted to confirm the relationship between the test formats and the learners’ cumulative grade point average (GPA) by dividing the students into two groups (i.e., high achievers and low achievers) based on a median split of GPA. Linear regressions were calculated to predict the exam scores in 2019 and 2021, and data analysis was conducted using SPSS version 23 (SPSS Inc., Chicago, IL, USA).
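The quantitative pipeline above (paired-sample t-test, median split on GPA, linear regression) can be sketched outside SPSS. The following is a minimal illustration on synthetic data, assuming SciPy and NumPy rather than the SPSS workflow actually used; the numbers are invented for demonstration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic cohort of 89 students (illustrative only, not the study data):
# GPA drives both exam scores, with the final pitched higher than the midterm.
n = 89
gpa = rng.normal(3.2, 0.4, n)
midterm = 60 + 5 * gpa + rng.normal(0, 5, n)
final = 68 + 6 * gpa + rng.normal(0, 6, n)

# Paired-sample t-test: midterm vs. final within the same cohort.
t_stat, p_val = stats.ttest_rel(midterm, final)

# Median split of cumulative GPA into low and high achievers.
high = gpa >= np.median(gpa)

# Simple linear regression of total exam score on GPA for the high group.
total = midterm + final
res = stats.linregress(gpa[high], total[high])
print(f"t(88) = {t_stat:.2f}, p = {p_val:.4g}, high-group R^2 = {res.rvalue**2:.3f}")
```

The same `linregress` call on the low-achieving subset (`gpa[~high]`) yields the per-group comparison of explained variance discussed in the Results.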

To examine students’ self-perceived responses to the blended CB-OB exams in 2021, the data collected through the open-ended survey questions were qualitatively analyzed in three stages: transcription, coding, and theme identification. The two researchers, by then familiar with the students’ overall perceptions, performed open coding by listing and categorizing responses and developing themes to encompass their meaning. We reviewed the research results using methodological triangulation to increase the validity and reliability of the qualitative analysis, taking into account multiple perspectives, frequencies, and contexts of the results [14].

Participants

A total of 178 predoctoral dental students enrolled in Restorative Dentistry during the spring semesters of 2019 and 2021 provided written informed consent for the collection of their records. The population was the total group of students enrolled in the class, and those who agreed to the study and had no history of mental illness were sampled. For a paired t-test to detect a medium effect size with a power of 0.8 or higher, the sample size must be 27 or larger; this study had a sample of 89 participants per cohort, indicating a sufficient level of power. The 2021 sample contained 56 (62.9%) male and 33 (37.1%) female students, and the 2019 sample contained 50 (56.2%) male and 39 (43.8%) female students.
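The minimum sample size quoted above can be reproduced from the noncentral t distribution. The sketch below assumes a one-tailed paired t-test with a medium effect size (Cohen’s d = 0.5) and α = 0.05, since the exact power-analysis settings are not reported in the text; under those assumptions, the smallest number of pairs reaching a power of 0.8 is 27.

```python
from math import sqrt
from scipy import stats

def paired_t_power(n, d=0.5, alpha=0.05):
    """Power of a one-tailed paired t-test with n pairs and effect size d."""
    df = n - 1
    ncp = d * sqrt(n)                    # noncentrality parameter
    t_crit = stats.t.ppf(1 - alpha, df)  # one-tailed critical value
    return 1 - stats.nct.cdf(t_crit, df, ncp)

# Smallest number of pairs achieving power >= 0.8 under these assumptions.
n_required = 2
while paired_t_power(n_required) < 0.8:
    n_required += 1
print(n_required)  # 27
```

With a two-tailed test the same calculation gives a larger minimum (around 34 pairs), so the reported threshold of 27 matches the one-tailed assumption.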

Measures

Test anxiety

To measure test anxiety, Benson and El-Zahhar’s Revised Test Anxiety Scale (RTA) was used [15]. The RTA consists of 4 factors and 20 questions: 5 on tension, 6 on worry, 5 on bodily symptoms, and 4 on test-irrelevant thinking. The learners answered the test anxiety questions on a four-point scale ranging from rarely feeling (1) to always feeling (4). Reliability of the test anxiety items was assessed with Cronbach’s alpha, which was 0.932 at the midterm exam and 0.954 at the final exam. To verify the validity of the questionnaire, Pearson correlation coefficients between each item score and the total score were computed; the coefficients ranged from 0.536 to 0.865 in the midterm exam and from 0.613 to 0.833 in the final exam. All items were significantly correlated with the total score, indicating that all items were valid. Table 1 shows the descriptive statistics from the 2021 sample.
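The reliability and item-validity checks reported above (Cronbach’s alpha and item-total Pearson correlations) can be computed directly from an item-response matrix. This is a minimal sketch on synthetic four-point responses, not the actual RTA data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of scale totals
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def item_total_correlations(items):
    """Pearson r between each item and the scale total (item validity)."""
    items = np.asarray(items, dtype=float)
    total = items.sum(axis=1)
    return [float(np.corrcoef(items[:, j], total)[0, 1])
            for j in range(items.shape[1])]

# Synthetic data: 89 students x 20 items on a 1-4 scale, driven by one trait.
rng = np.random.default_rng(1)
trait = rng.normal(0.0, 1.0, (89, 1))
responses = np.clip(np.round(2.5 + trait + rng.normal(0, 0.7, (89, 20))), 1, 4)

print(round(cronbach_alpha(responses), 3))
```

A sanity check on the formula: perfectly parallel items yield an alpha of exactly 1.0, and every item-total correlation of 1.0.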

Table 1 Dental students’ test anxiety comparison between in-person CBEs and online OBEs

Perception survey on a blended assessment

To capture students’ self-perceptions of the blended assessment, open-ended survey questions were developed and validated by four experts: two clinical professors and two professors specializing in dental education. The survey featured open-ended questions concerning exam preparation, the perceived effect of the format on student learning, and students’ exam experiences. The students provided their perceptions in narrative form.

Results

Test anxiety comparison

No statistically significant mean difference in test anxiety was found between the in-person CBEs and the online OBEs of the blended assessment. The mean anxiety level for the midterm CBEs was 2.108 and that for the final OBEs was 2.134, an increase of 0.026 points. No item differed significantly at the 0.001 level.

Academic achievement comparison

Comparison between midterm CBEs and final CBEs in 2019 (before COVID-19)

When comparing the midterm with the final exam scores for Restorative Dentistry in 2019, the t-test showed a significant difference (t(88) = −16.02, p < 0.001). The mean score of the midterm CBEs was 75.79 (SD = 7.96), and that of the final CBEs was 92.72 (SD = 9.48).

Comparison between midterm CBEs and final OBEs in 2021 (under COVID-19)

When comparing the midterm scores with the final exam scores for Restorative Dentistry in 2021, the t-test showed a significant difference (t(88) = −8.01, p < 0.001). The mean score of the in-person midterm CBEs was 79.87 (SD = 4.92), and that of the online final OBEs was 87.88 (SD = 8.13).

Total score comparison between CBEs in 2019 and blended CBEs-OBEs in 2021

To compare academic achievement in the same course in 2019 and 2021, the researchers drew the boxplot in Fig. 1. Both years showed a similar trend of lower midterm scores followed by higher final scores. Regarding the midterm scores, the standard deviation in 2021 was smaller than in 2019, and the online OB final scores in 2021 were less skewed than the in-person CB final scores in 2019.

Fig. 1

Box plot of 2019 CBE and 2021 blended exam scores for the Restorative Dentistry course (2019 n = 89, 2021 n = 89)
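A boxplot like Fig. 1 can be reproduced from the four score vectors. The sketch below uses synthetic normal scores matching the reported means and SDs (illustrative only, not the study data) and assumes matplotlib is available:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripted use
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(0)
# Synthetic scores shaped by the reported means/SDs (n = 89 per exam).
scores = {
    "2019 midterm CBE": rng.normal(75.79, 7.96, 89),
    "2019 final CBE": rng.normal(92.72, 9.48, 89),
    "2021 midterm CBE": rng.normal(79.87, 4.92, 89),
    "2021 final OBE": rng.normal(87.88, 8.13, 89),
}

fig, ax = plt.subplots(figsize=(8, 4))
ax.boxplot(list(scores.values()))
ax.set_xticklabels(scores.keys(), rotation=15)
ax.set_ylabel("Exam score")
fig.savefig("exam_boxplot.png", dpi=150)
```

Because the samples are drawn from normal distributions, this reproduces the locations and spreads but not the skewness the authors observed in the real data.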

Prediction of high and low achievers’ performance on blended test scores

Table 2 shows that the 2019 traditional CB scores and the 2021 blended CB-OB scores were both significantly predicted by students’ cumulative GPA, showing that both test formats could discriminate among students’ performance levels. Students with higher GPAs were expected to obtain higher scores on both the blended and the traditional test formats. In particular, a sharp increase in the adjusted R-squared indicated a markedly improved prediction for the low-achieving group (Reg 6) under the blended assessment. GPA was the single significant predictor of each test score; gender, age, and admission type were not significant in most of the regression models.

Table 2 Coefficient comparison for multiple regressions of cumulative GPA on student body and admission on 2019 CB and 2021 blended test scores

Perception survey on a blended assessment

An analysis of the responses to the open-ended questions exhibited several highly topical themes that frequently recurred (Table 3). In the theme of exam burden, the most frequently mentioned advantage of OBEs was the reduced burden the test placed on students. Most of the students reported that they were able to avoid unnecessary memorization for the exam and to remember the important information, thus reducing the strain of preparing for it.

‘I always felt that I had to memorize for the test rather than what I had to know while studying, and it was good to be able to remember the significant points.’

‘The test was open-book, so the burden of memorizing unnecessary details was reduced. However, there seemed to be no significant difference in study time in that logical thinking was required.’

Table 3 Topical themes of dental students’ self-perceived responses about blended assessment (N = 89)

In the theme categorized as the learning effects of exams, the students mentioned that CBEs motivated them to spend more time acquiring domain knowledge and trying to understand the study materials comprehensively. In particular, the students’ self-perceived responses indicated that OBEs can be applied to actual situations and are instrumental in developing critical thinking skills.

‘It seems that more than 90% of the written tests at dental schools are traditionally evaluated based on knowledge memorization, but OBEs were very meaningful to evaluate all the knowledge I learned during the semester in relation to clinical practices.’

‘Based on what we learned in class, I think the critical thinking ability to determine what treatment to do based on the knowledge we have when we encounter a patient in real-life clinical practice is also a very important factor for us to have. In this test, it was not just memorization, but it was good to learn and describe the process of working out the treatment process ourselves.’

In the category of exam environments, fairness was one of the main problems the students pointed out, since the final open-book test was implemented online.

‘There may still exist the possibility of sharing answers, so it is unfair in terms of equity. It would be fairer to take an open-book exam together.’

‘The OB test is so good, but I don't think it can be overlooked that online OBEs cannot be free from the issue of fairness in evaluation.’

Discussion

This study examined whether a blended assessment of CBEs and OBEs is effective in reducing test anxiety and enhancing academic performance among dental students. It also aimed to determine whether a blended assessment can discriminately measure the learning outcomes of dental students by comparing their academic achievement under this form of assessment with that under traditional assessment.

The findings revealed that the anxiety level for the OBEs did not decrease compared to that for the CBEs. In previous studies, OBEs have generally exhibited lower test anxiety [5], but in this study, anxiety about the OBEs was slightly higher than about the CBEs. This suggests that the students felt no less anxious in the OBEs than in the CBEs and experienced the same level of tension during the OBEs. In fact, many dental professors may worry that OBEs lessen students’ tension and thus negatively affect exam preparation compared with CBEs. However, the students’ perceived responses indicated that OBEs can serve as a meaningful evaluation that maintains appropriate tension and that they spent as much time preparing for the OBEs as for the CBEs.

On the other hand, it has been reported that OBEs do not reduce test anxiety relative to CBEs among students who find OBEs unfamiliar and unpredictable and feel pressured to write answers using their own knowledge [16, 17]. Some students’ responses indicated that preparing for the OBEs was a burden, mainly due to a lack of understanding of how to prepare. Thus, informing students in advance that an exam will be open book and asking them to analyze clinical scenarios or perform problem-solving can change how they study for the exam. In addition, medical educators should provide OB test samples to help students prepare for OBEs before they encounter them in high-stakes settings, so as to reduce anxiety and target their preparation to the novel test format [15]. Assessment workshops for faculty training could be adopted to share pilot results and practical tips for adopting OBEs in their courses.

Test anxiety has reportedly been associated with poor exam performance due to worry about the outcome and negative emotions experienced during the test [17]. Previous studies found a significant negative correlation between test anxiety and academic achievement, including students’ GPA [18]. Interestingly, a systematic review found compelling evidence of a substantial prevalence of anxiety disorders, particularly in women and young adults, which negatively influenced their cumulative academic performance [19]. In our sample, test anxiety did not differ significantly by gender for either test format; however, the mean test anxiety score among female students (M = 2.22) was slightly higher than that among male students (M = 2.05). It may be inferred that more pressure is placed on females than on males to succeed in school, and academic counseling for female students should focus more on dealing with test anxiety [19].

In previous studies, the mean scores were similar for OBEs and CBEs [20, 21]. However, in this study, the 2021 final OBE scores were, on average, lower than the 2019 final CBE scores. This suggests that this type of exam was unfamiliar to many learners; indeed, the students reported difficulty in preparing for the OBEs, as it was their first time sitting this kind of test. In line with previous findings, students with high cumulative GPAs exhibited superior performance in both the traditional and blended assessments conducted in 2019 and 2021 [22]. Interestingly, the increased adjusted R-squared of the regression model showed that the scores of lower-ranked students were better predicted under the blended test format, which suggests that blended assessments would be more effective than traditional assessments in monitoring the low-achieving group. Thus, implementing a blended assessment could play a significant role in targeting the low-performing group and contribute to developing dental competencies, such as critical thinking ability, that are difficult to acquire through rote memorization alone.

For OBEs to be discriminating and meaningful, it is essential to design exams that require highly contextual, higher-order thinking. Notably, OBEs are basically designed not to evaluate learners’ knowledge per se but to evaluate their ability to solve real problems [22, 23]. Consistent with Bloom’s taxonomy, the students’ self-perceived responses to the OBEs showed that they were motivated to focus primarily on understanding and synthesizing the knowledge acquired in class [24]. Previous studies reported no significant difference between open- and closed-book assessment in evaluating lower-order thinking skills, but OBEs had a greater effect on developing higher-order thinking abilities such as applying, analyzing, evaluating, and creating [8]. While a knowledge-based class that tests learners on rote memorization makes it easy to reach lower-order learning goals, an open-book assessment encourages learners to reach higher-order learning goals by synthesizing knowledge and promoting long-term memory.

Although the need to establish an online evaluation platform for dental education has been recognized, discussion and research on the topic have been inadequate [25, 26]. Dental education is confronting new challenges worldwide due to the COVID-19 pandemic, and most lectures have been switched from offline to online learning platforms to maintain social distancing [27, 28]. The context of the pandemic raises the question of which assessment methods suit healthcare professions education [29]. In the survey, the students indicated that taking an OBE in a comfortable setting made them feel relieved and less stressed, as well as guaranteeing their individual safety from COVID-19. Because online evaluation takes place with the instructor and the learner in different spaces, online open-book assessment can be an alternative to traditional assessment when face-to-face interaction is not possible.

The perception survey also showed that the OBEs were more effective than traditional CBEs in developing dental competencies, in terms of reducing the learning burden and enhancing learning effects. Generally, the type of exam adopted can influence how students study for it. It has been reported that students preparing for OBEs tend to pay more attention and integrate their knowledge during class compared to those preparing for traditional CBEs [6, 30]. In the open-ended survey on the assessment, the students perceived that OBEs reduced the burden of memorizing unnecessary details; they could therefore spend more time concentrating on problem-solving than on rote memorization. Thus, it may be inferred that the OB test format gives students greater opportunity to be critical and to develop the higher-order thinking skills required of dental graduates.

Improvements to the blended test format were also suggested. In the online OBEs, fairness was cited as the foremost concern. While the midterm CBEs were conducted in person and supervised by instructors, the final OBEs were implemented online without physical attendance. For the online OB assessment, Turnitin, a text-matching system for electronic documents, was used to check whether the students’ work matched previously submitted work. In addition, the OB answers were evaluated against clear standards using a rubric and through a dual scoring process involving two experienced instructors. Recent research findings support the claim that experienced markers can differentiate between genuine work and work completed by third parties [31]. However, as per previous studies and the students’ responses, the issue of fairness still needs to be carefully addressed in blended assessments [20, 31].

This study is not free from limitations. Caution is required when translating this study’s findings to other medical and dental course settings. Because the sampling was a census, it was almost impossible to control all factors that threaten the validity of this nonrandomized, quasi-experimental study. The study population consisted of two different cohorts that took the same course under the same instruction. We assumed that the cohort populations did not significantly differ in terms of gender, age, and admission requirements. However, the pandemic occurred in 2020, which inevitably characterized the 2019 cohort as pre-pandemic and the 2021 cohort as mid-pandemic. Naturally occurring changes over time may be confounded with intervention effects. Accordingly, before applying the findings to different contexts in health professions education, convincing evidence for causal links needs to be examined so that instructors can weigh the trade-offs between closed- and open-book exams and verify whether factors other than the test format affect student performance. To evaluate the effectiveness of this blended approach, it must be used repeatedly across different disciplines and classes.

In addition, the 2019 traditional assessment used question types associated with rote learning, such as true-or-false, multiple-choice, and short-answer questions, whereas the 2021 OB questions were designed to assess the students’ problem-solving skills. Finally, the validity of the OB assessment methods used in this study needs to be tested by proving whether they really worked as authentic, real-life-like assessments that improve students’ higher-order thinking [29]. In follow-up assessment studies, the roles and effects of invigilation in online exams should be carefully monitored, and the extent to which OBEs allow students to access resources, including self-made materials, needs to be examined for each exam type.

Conclusion

This study investigated the effects of a blended assessment of CBEs and OBEs on dental students’ anxiety and performance. The blended assessment was as effective as traditional assessment in distinguishing high-performing from low-performing students, and no significant differences were found in students’ anxiety levels between the two. Both types showed a similar trend of lower midterm scores followed by higher final scores, but the lower-achieving group was better monitored by the blended assessment. In particular, an analysis of the students’ self-perceived responses indicated what remains to be done for more effective assessment in terms of learning burden, learning effects, and fairness. This study also confirmed the feasibility of a blended assessment that can be implemented in online and in-person educational environments. Moreover, it can serve as groundwork for developing new models of assessment in dental education.

Availability of data and materials

The datasets used and/or analyzed during the present study are not publicly available due to limitations of ethical approval involving student data and anonymity but are available from the corresponding author on reasonable request.

Abbreviations

CB:

Closed-book

OB:

Open-book

CBEs:

Closed-book examinations

OBEs:

Open-book examinations

COVID:

Coronavirus disease

References

  1. Beth J, Amber D, Jill M. A systematic review comparing open-book and closed-book examinations: evaluating effects on development of critical thinking skills. Nurse Educ Pract. 2017;27:89–94.

    Article  Google Scholar 

  2. Durning SJ, Dong T, Ratcliffe T, Schuwirth L, Artino AR, Boulet JR, Eva K. Comparing open-book and closed-book examinations: a systematic review. Acad Med. 2016;91:583–99.

    Article  Google Scholar 

  3. Dave M, Ariyaratnam S, Dixon C, Patel N. Open-book examinations. Br Dent J. 2020;229(3):149–149.

    Article  Google Scholar 

  4. Flinders DJ, Anderson LW, Sosniak LA. Bloom’s taxonomy: a forty-year retrospective. Hist Educ Q. 1996;36(1):76–8.

    Article  Google Scholar 

  5. Ng CKC. Evaluation of academic integrity of online open book assessments implemented in an undergraduate medical radiation science course during COVID-19 pandemic. J Med Imaging Radiat Sci. 2020;51(4):610–6.

    Article  Google Scholar 

  6. Ackerman R, Leiser D. The effect of concrete supplements on metacognitive regulation during learning and open-book test taking. Br J Educ Psychol. 2014;84(2):329–48.

    Article  Google Scholar 

  7. Dave M, Dixon C, Patel N. An educational evaluation of learner experiences in dentistry open-book examinations. Br Dent J. 2021;231(4):243–8.

    Article  Google Scholar 

  8. Sato B, He W, Wrschauer M, Kadandale P. The grass isn’t always greener: perceptions of and performance on open-note exams. Life Sci Educ. 2015;14:1–10.

    Google Scholar 

  9. Lee JY, Lee JE, Lim KY, Han SY. Challenges and tasks of online classes in the era of COVID-19. Educ Technol Res. 2020;36:671–92.

    Article  Google Scholar 

  10. Chakraborty P, Mittal P, Gupta MS, Yadav S, Arora A. Opinion of students on online education during the COVID-19 pandemic. Hum Behav Emerg Technol. 2021;3(3):357–65.

    Article  Google Scholar 

  11. Jervis CG, Brown LR. The prospects of sitting ‘end of year’ open book exams in the light of COVID-19: A medical student’s perspective. Med Teach. 2020;42(7):830–1.

    Article  Google Scholar 

  12. Ivry Z, Steven JD. Assessing open-book examination in medical education: The time is now. Med Teach. 2021;43(8):972–3.

    Article  Google Scholar 

  13. Karagiannopoulou E, Milienos FS. Exploring the relationship between experienced students’ preference for open- and closed-book examinations, approaches to learning and achievement. Educ Res Eval. 2013;19(4):271–96.

  14. Bekhet A, Zauszniewski J. Methodological triangulation: an approach to understanding data. Nurse Res. 2012;20(2):40–3.

  15. Benson J, El-Zahhar N. Further refinement and validation of the revised test anxiety scale. Struct Equ Model Multidisciplinary J. 1994;1(3):203–21.

  16. Eilertsen TO, Valdermo O. Open book assessment: a contribution to improved learning? Stud Educ Eval. 2000;26:91–103.

  17. Gharib A, Phillips W, Mathew N. Cheat sheet or open book? A comparison of the effects of exam types on performance, retention, and anxiety. Psychol Res. 2012;2(8):469–78.

  18. Chapell MS, Blanding ZB, Silverstein ME, Takahashi M, Newman B, Gubi A, McCann N. Test anxiety and academic performance in undergraduate and graduate students. J Educ Psychol. 2005;97(2):268–74.

  19. Remes O, Brayne C, van der Linde R, Lafortune L. A systematic review of reviews on the prevalence of anxiety disorders in adult populations. Brain Behav. 2016;6(7):1–33.

  20. Ng CKC. Evaluation of academic integrity of online open book assessments implemented in an undergraduate medical radiation science course during COVID-19 pandemic. J Med Imaging Radiat Sci. 2020;51(4):610–6.

  21. Block RM. A discussion of the effect of open-book and closed-book exams on student achievement in an introductory statistics course. Primus. 2012;22(3):228–38.

  22. Green SG, Ferrante CJ, Heppard KA. Using open-book exams to enhance student learning, performance, and motivation. J Effective Teach. 2016;16(1):19–35.

  23. Moore R, Jensen PA. Do open-book exams impede long-term learning in introductory biology courses? J Coll Sci Teach. 2007;36(7):46–9.

  24. Myyry L, Joutsenvirta T. Open-book, open-web online examinations: Developing examination practices to support university students’ learning and self-efficacy. Act Learn High Educ. 2015;16(2):119–32.

  25. Vlachopoulos D. COVID-19: Threat or opportunity for online education? High Learn Res Commun. 2020;10:16–9.

  26. Chang T, Hong G, Paganelli C, Phantumvanit P, Chang W, Shieh Y, Hsu M. Innovation of dental education during COVID-19 pandemic. J Dent Sci. 2021;16(1):15–20.

  27. Ahmed H, Allaf M, Elghazaly H. COVID-19 and medical education. Lancet Infect Dis. 2020;20(7):777–8.

  28. Chang TY, Hsu ML, Kwon JS, Kusdhany MLS, Hong G. Effect of online learning for dental education in Asia during the pandemic of COVID-19. J Dent Sci. 2021;16(4):1095–101.

  29. Greenberg K, Lester JN, Evans K. Student learning with performance-based, in-class and learner-centered, online exams. Int J Teach Learn High Educ. 2009;20(3):383–93.

  30. Williams J, Wong A. The efficacy of final examinations: a comparative study of closed-book, invigilated exams and open-book, open-web exams. Br J Educ Technol. 2009;40(2):227–36.

  31. Dawson P, Sutherland-Smith W. Can markers detect contract cheating? Results from a pilot study. Assess Eval High Educ. 2018;43(2):286–93.

Acknowledgements

The first two authors contributed equally and share first authorship.

Funding

Not applicable.

Author information

Authors and Affiliations

Authors

Contributions

SH, BG, JR, and JI conceptualized and designed the study. SH, BG, CL, and JI conducted the review of literature and prepared the first draft. All authors contributed to the review and revision of the first draft and approved the final version.

Corresponding authors

Correspondence to Deog-Gyu Seo or Jungjoon Ihm.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the Institutional Review Board of the School of Dentistry, Wonkwang University, Republic of Korea (Institutional Review Board No. WKIRB-202009-BM-062). All methods were performed in accordance with the relevant guidelines and regulations. All respondents were aware of their participation in this research and provided written informed consent in addition to concurrence with the anonymous use of their data for publication. All data were anonymously collected and analyzed.

Consent for publication

Not applicable.

Competing interests

The authors declare no potential conflicts of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Hong, S., Go, B., Rho, J. et al. Effects of a blended design of closed-book and open-book examinations on dental students’ anxiety and performance. BMC Med Educ 23, 25 (2023). https://doi.org/10.1186/s12909-023-04014-9

Keywords

  • Blended assessment
  • Open-book exams
  • Closed-book exams
  • Test anxiety
  • Academic performance