
The framework of Systematic Assessment for Resilience (SAR): development and validation

Abstract

Background

Burnout and depression among health professions education (HPE) students continue to rise, leading to unwanted effects that ultimately jeopardise optimal medical care and patient health. Promoting the resilience of medical students is one solution to this issue. Several interventions have been implemented to foster resilience, but they focus on aspects other than the primary cause: the assessment system. The purpose of this study is to develop a framework to promote resilience in assessment planning and practice.

Methods

We followed the guidelines suggested by Whetten for constructing a theoretical model for framework development. There were four phases in the model development. In the first phase, different literature review methods were used, and additional students’ perspectives were collected through focus group discussions. Then, using the data, we constructed the theoretical model in the second phase. In the third phase, we validated the newly developed model and its related guidelines. Finally, we performed response process validation of the model with a group of medical teachers.

Results

The developed systematic assessment resilience framework (SAR) promotes four constructs: self-control, management, engagement, and growth, through five phases of assessment: assessment experience, assessment direction, assessment preparation, examiner focus, and student reflection. Each phase contains a number of practical guidelines to promote resilience. We rigorously triangulated each approach with its theoretical foundations and evaluated it on the basis of its content and process. The model showed high levels of content and face validity.

Conclusions

The SAR model offers novel guidance for fostering resilience through assessment planning and practice. It includes a number of attainable and practical guidelines for enhancing resilience. In addition, it opens a new horizon for HPE students’ future use of this framework in the new normal (post-COVID-19) conditions.


Background

The study of medicine is demanding and puts a significant strain on the mental and physical health of medical students, who perceive medical education as an anxiety-inducing and stressful field of study [1]. This perception has been mirrored by the high level of stress in medical students. Several systematic reviews and meta-analyses [2,3,4], as well as local [5, 6] and multicentre studies [7, 8], have found a significant prevalence of stress among medical students, ranging from 21% to 56%. Consequently, burnout and depression among medical students have increased [4]. Other negative effects of stress on medical students have also been reported, including poor clinical performance, poor decision-making, poor peer interaction, interpersonal conflict, academic dishonesty, and sleep problems [4, 9]. Stress has also been associated with suicide, alcoholism, and drug abuse [10,11,12,13]. These negative effects eventually jeopardise optimal medical care and impact patient health negatively [14, 15]. While medical students have identified a number of stressors, research indicates that examinations are the most frequently reported sources of stress [14, 16,17,18,19,20,21]. As a result, there is a growing body of research on how to improve the mental health of medical students and promote what is currently known as resilience.

Resilience is a psychological construct that refers to the characteristics needed to adapt to the dynamic changes of life and maintain mental well-being [22]. The topic of resilience is of interest in many disciplines, including developmental psychology, sociology, education and, in particular, health professions education (HPE) [23,24,25,26,27]. In psychology, resilience generally refers to the status of an individual who is adapting to significant adversity while maintaining good mental and physical well-being [28]. Alva [29] defined academically resilient students as those ‘who sustain high levels of achievement motivation and performance despite the presence of stressful events and conditions that place them at risk of doing poorly in school and ultimately dropping out of school’ [29]. Further, academically resilient students are able to maintain mental agility and continue growing and developing despite academic and life setbacks [30,31,32,33]. In a recent meta-analysis of resilience constructs across 21 resilience scales, Wadi et al. [34] identified four primary resilience characteristics: 1) control: maintaining composure and control in the face of stressful adversity; 2) involvement: being committed to overcoming adversity; 3) resourcefulness: using available resources for appropriate solutions to overcome adversity; and 4) growth: continuing to grow and rebounding stronger from adversity. These four constructs provide a solid foundation for the implementation of interventions fostering resilience.

Studies on HPE have shown that resilience is positively correlated with compassion, satisfaction, and patient care and negatively associated with burnout, secondary stress, anxiety, intolerance to ambiguity, and poor communication [35]. Numerous health institutions have implemented interventions based on these studies [15, 36, 37]. Common intervention guidelines include training workshops focussed on psychosocial skills, such as mindfulness, Stress Management and Resilience Training (SMART), and narrative and simulation training [38,39,40,41]. Although these interventions have been shown to have some positive effects [42], they lack a solid theoretical foundation and do not focus on assessment, the primary source of student stress. Resilience theory must be integrated with the assessment context to provide a solid basis to guide intervention strategies and maintain the quality of assessment. Interestingly, a recent paper explored the intersection between resilience and curricula. The authors presented a principle-based approach to curriculum design to foster resilience as an integral part of the curriculum in higher education [43]. Although this approach sheds light on the philosophical approach to building a curriculum to create resilient graduates, it does not address the exact link between assessment and students’ resilience. To the best of our knowledge, no study has incorporated resilience into the assessment process. Accordingly, the purpose of this study was to develop a framework to promote resilience in assessment planning and practice. In this way, assessment would serve as a source of resilience and promote the development of resilience-improving characteristics among students.

Methods

The researchers followed the guidelines for developing a theoretical model proposed by Whetten [44]. These guidelines include four essential questions for model development: (1) What are the constructs/factors that should be considered to explain the model? (2) How are these constructs/factors related to each other? (3) Why is the proposed relationship represented by this portrayal? (4) What are the implications of this model for research and practice? Each question refers to a developmental phase of the theoretical model. Accordingly, the researchers developed the model in these four phases. This involved establishing the foundation (literature review and focus group discussion) and triangulating the findings [45] through content validation and a response process. Figure 1 presents a flow chart summarising the study phases, questions, research areas, methods, and outcomes of each phase.

Fig. 1

Flow chart summarising study phases, questions, research areas, methods, and outcomes of each phase

Phase I

Identifying resilience constructs

In this phase, we aimed to answer the following question: ‘What are the constructs/factors that should be considered in developing the model?’ Accordingly, we first identified the research areas related to the model development: academic resilience, assessment, and test anxiety. Then, we conducted narrative reviews, a scoping review, and focus group discussions to find evidence in these areas.

Narrative reviews

We conducted three concurrent narrative reviews to collect sufficient scientific research for the model synthesis [46]. The first review identified the theoretical foundations and factors influencing academic resilience. The second review delineated the theoretical foundations of assessment in HPE. The third review identified the theories behind test anxiety. The articles included in these reviews were compiled into a table of evidence synthesis [47] (Appendix I) to extract key information regarding the theoretical foundations of academic resilience, test anxiety, and assessment systems and match them with the four resilience constructs [34].

Scoping review

Four of the authors (MW, MSBY, AFA, and NZ) conducted a scoping review to identify factors affecting test anxiety [48], following Arksey and O’Malley’s [49] stages of scoping reviews. Six electronic databases were used: PubMed, CINAHL, PsycINFO, ERIC (through EBSCOhost), SCOPUS, and ProQuest. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement [50] was followed to report the scoping review steps. Based on the factors identified in each study, codes from all studies were compiled to generate subthemes and overarching themes [48]. Appendix II contains a detailed description of the scoping review.

Focus group discussion (FGD)

Four of the authors (MW, MSBY, AFA, and NZ) conducted an FGD to elicit students’ knowledge, perspectives, and attitudes regarding test anxiety (TA) and their coping guidelines [51]. Appendix III contains a detailed description of the FGD steps and procedures.

Phase II

Relating the identified constructs and their factors with each other

Phase 2 answered Whetten’s second question: ‘How are these constructs/factors related to each other?’ Based on the notion of triangulation [45], the authors performed five subsequent steps:

  • I. Based on the output of the scoping review and FGD, the factors decreasing TA were qualitatively analysed to generate guidelines for decreasing TA. Initially, similar factors were grouped together. Then, a suitable guideline was constructed capturing these factors. The initial guideline statements were revised to remove redundant statements and merge similar guidelines into a single statement. This iterative process was done until consensus was reached among the authors.

  • II. Guided by the assessment cycle [52] and the sociotechnical model of assessment [53], the guidelines were thematically categorised into five phases of assessment. Each phase was named and defined.

  • III. To evaluate the content validity of these guidelines, the authors utilised a structured tool called the Content Validity Index (CVI) [54, 55], which measures the proportional agreement when two or more experts independently evaluate the relevance of a model’s contents to the domain of interest. Ten experts, who were medical education and student assessment specialists, were invited to join the panel [54]. A four-point Likert scale was created in a Google form and sent to the experts via email. They were asked to evaluate the relevance of each guideline to its corresponding category (phase of assessment) [56]. An answer of 1 indicated the guideline was irrelevant, whereas an answer of 4 indicated that the item was extremely relevant.

  • IV. Three indices were calculated: item/guideline-level CVIs (I-CVIs), the scale-level CVI using the universal agreement method (S-CVI/UA), and the scale-level CVI using the average calculation method (S-CVI/Ave) [54, 55, 57]. For the I-CVIs, the relevance of each guideline to its related phase of assessment was evaluated by the experts. Ratings were dichotomised: ratings of 1 or 2, indicating non-relevance, were counted as 0, while ratings of 3 and 4, indicating relevance, were counted as 1 [57]. For the S-CVI/UA, the proportion of guidelines receiving a rating of 3 or 4 (relevant) from all experts was calculated. For the S-CVI/Ave, the I-CVI scores were averaged across all guidelines [57]. Appendix V contains the full content validity protocol.

  • V. Based on these indices and the experts’ recommendations, five guidelines were removed, so the final set included 19 guidelines.
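To make the three indices concrete, the minimal sketch below computes them from a small, entirely hypothetical ratings matrix (six experts, five guidelines); the values are illustrative and are not the study’s data:

```python
# Hypothetical ratings: one row per expert, one column per guideline,
# on the four-point relevance scale described above.
ratings = [
    [4, 3, 4, 2, 4],
    [3, 4, 4, 1, 3],
    [4, 4, 3, 2, 4],
    [4, 3, 4, 3, 4],
    [3, 4, 4, 2, 3],
    [4, 4, 4, 2, 4],
]

n_experts = len(ratings)
n_items = len(ratings[0])

# Dichotomise: 3 or 4 -> relevant (1); 1 or 2 -> not relevant (0)
relevant = [[1 if r >= 3 else 0 for r in row] for row in ratings]

# I-CVI: proportion of experts rating each guideline as relevant
i_cvi = [sum(relevant[e][i] for e in range(n_experts)) / n_experts
         for i in range(n_items)]

# S-CVI/UA: proportion of guidelines rated relevant by ALL experts
s_cvi_ua = sum(1 for v in i_cvi if v == 1.0) / n_items

# S-CVI/Ave: mean of the I-CVIs across all guidelines
s_cvi_ave = sum(i_cvi) / n_items

print([round(v, 2) for v in i_cvi])  # [1.0, 1.0, 1.0, 0.17, 1.0]
print(round(s_cvi_ua, 2))            # 0.8
print(round(s_cvi_ave, 2))           # 0.83
```

In this toy example the fourth guideline would fall below the 0.83 retention threshold used later in the study and be flagged for removal. The FVI indices in phase IV are computed with the same arithmetic, with ‘clear’ substituted for ‘relevant’.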

Phase III

Sorting and ranking guidelines by experts

This phase answered Whetten’s third question: ‘Why is the proposed relationship represented by this portrayal?’ The same experts who were invited for content validation were asked to sort and rank each guideline into the appropriate resilience construct(s) [58]. Using the checkboxes grid on the Google form, all of the guidelines were listed in one column, and the resilience constructs were placed at the top of the four adjacent columns. A column headed ‘not applicable’ was added for any guidelines that were irrelevant to the resilience constructs.

The responses were analysed using Microsoft Excel’s sorting and ranking functions. For each resilience construct, an Excel-based graphical representation was created of the guidelines that at least 50% of the experts categorised under that construct.
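The 50%-consensus assignment can be sketched as follows; the guideline labels and expert choices are hypothetical stand-ins, not the study’s actual sorting data:

```python
# Hypothetical sketch of the 50%-consensus rule: a guideline is assigned
# to every construct that at least half of the experts ticked for it.
constructs = ["self-control", "management", "engagement", "growth"]

# Each expert could tick any number of construct checkboxes per guideline.
votes = {  # guideline -> one set of ticked constructs per expert
    "Share the assessment rubric": [{"self-control"},
                                    {"self-control", "growth"},
                                    {"growth"},
                                    {"self-control"}],
    "Introduce progress testing":  [{"growth"},
                                    {"engagement"},
                                    {"growth"},
                                    {"growth", "engagement"}],
}

threshold = 0.5
assignment = {}
for guideline, expert_choices in votes.items():
    n = len(expert_choices)
    assignment[guideline] = [
        c for c in constructs
        if sum(c in choice for choice in expert_choices) / n >= threshold
    ]

print(assignment["Share the assessment rubric"])  # ['self-control', 'growth']
print(assignment["Introduce progress testing"])   # ['engagement', 'growth']
```

A guideline can therefore legitimately appear under more than one construct, which is consistent with Figs. 3, 4, 5, and 6 sharing guidelines across constructs.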

Finally, using the Draw.io website, a conceptual map was created to link the guidelines-based assessment phases and resilience constructs to the theoretical foundation of phase I.

Phase IV

Implication for practice

In phase IV, the authors aimed to answer Whetten’s final question: ‘What are the implications of this model for research and practice?’ We evaluated the guidelines from the users’ perspective based on the response process method [59]. Twenty participants were invited via email [60]. Apart from the invitation, the email contained a description of the research objectives and the validation process. A link to a video describing the application of SAR in assessment practice was also included. A response process form was attached to the email. The medical teachers were asked to review all guidelines and rate each of them based on its clarity and comprehensibility using a four-point scale (1 = not clear and comprehensible, 2 = somewhat clear and comprehensible, 3 = clear and comprehensible, 4 = very clear and comprehensible). Additionally, the participants were asked to provide written feedback on open-ended questions about the feasibility and applicability of the guidelines. Appendix VI contains the full response process protocol.

We calculated three FVI indices: item/guideline FVIs (I-FVIs), the scale/model FVI using the universal agreement method (S-FVI/UA), and the scale/model FVI using the average method (S-FVI/Ave). First, all ratings were converted to a dichotomous scale: not clear (ratings of 1 and 2) and clear (ratings of 3 and 4). Then, we calculated the percentage of medical teachers who gave each guideline a ‘clear’ rating (I-FVIs). The proportion of guidelines receiving a rating of 3 or 4 (clear) from all medical teachers was then determined for the S-FVI/UA. The average of the I-FVIs across all guidelines was calculated for the S-FVI/Ave. Based on these indices and the recommendations of the medical educators, two guidelines were eliminated, resulting in a final set of 17 guidelines.
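The retention step can be sketched as applying the 0.83 threshold to the I-FVIs and recomputing the scale-level average after removal; the guideline labels and I-FVI values below are illustrative only, not the study’s results:

```python
# Hypothetical I-FVIs for six guidelines (illustrative values only).
i_fvi = {
    "G1": 1.00, "G2": 0.92, "G3": 0.83,
    "G4": 0.75, "G5": 0.58, "G6": 1.00,
}

threshold = 0.83  # retention cut-off used in the study

# Keep guidelines at or above the threshold; drop the rest.
retained = {g: v for g, v in i_fvi.items() if v >= threshold}
removed = sorted(set(i_fvi) - set(retained))

# Scale-level average (S-FVI/Ave) before and after removal.
s_fvi_ave_before = sum(i_fvi.values()) / len(i_fvi)
s_fvi_ave_after = sum(retained.values()) / len(retained)

print(removed)                     # ['G4', 'G5']
print(round(s_fvi_ave_before, 2))  # 0.85
print(round(s_fvi_ave_after, 2))   # 0.94
```

Removing the low-scoring items necessarily raises the scale-level average, which is the mechanism behind the improvement in the overall framework indices reported below.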

The final configuration of the framework

After all these phases, Microsoft Visio® was used to reshape the framework for the systematic assessment of resilience to make it more comprehensible and straightforward to implement. The four resilience constructs were placed in relation to the assessment phases, and the final list of phase-related guidelines was also placed.

Results

Phase I

The narrative reviews yielded relevant theoretical foundations in three areas: academic resilience, assessment, and test anxiety. These foundations were tabulated, and the implications of each, through which resilience could be promoted, were identified (Table 1). The first three columns present the research areas, the theoretical foundations/frameworks, and the implications of each study, respectively. Guided by the integrated resilience model [34], every implication was matched to the most suitable of the four resilience constructs: control, involvement, resourcefulness, and growth [34]. The matching was based on the agreement of the authors.

Table 1 Matching the implication of the identified theoretical frameworks with resilience constructs

The scoping review revealed that factors related to test anxiety can be categorised into two groups: those that increase TA and those that decrease TA (Appendix II). Likewise, the thematic analysis of the medical student focus group discussion yielded three major themes: students, academic resources, and examiners. Each theme was subdivided into subthemes that corresponded to an increase or decrease in TA [51] (Appendix III).

Phase II

Compiling and categorising the generated guidelines

Based on the scoping review and FGD findings, the authors compiled a list of test anxiety-reducing guidelines (Appendix IV). The list was evaluated, and duplicate guidelines were eliminated. Guided by the assessment cycle [52] and the model of ‘assessment as a sociotechnical system’ [53], the authors categorised the generated guidelines into groups and subsequently named and defined each group. The following five phases of assessment were identified:

  • 1. Assessment direction, which focuses on improving the candidate’s understanding of the assessment’s scope and procedure.

  • 2. Assessment preparation, which emphasises enhancing the candidate’s cognitive, mental, and psychomotor readiness to optimise assessment performance.

  • 3. Assessment experience, which enhances the formative assessment component.

  • 4. Examiner focus, which relates to improving examiner behaviour to improve the candidate’s performance and decrease the candidate’s anxiety.

  • 5. Student reflection, which encourages reflection by students.

Table 2 shows each phase and its related guidelines.

Table 2 Categorization of SAR guidelines based on stages and phases of assessment

Content validation of the guidelines

Six of the 10 invited experts responded to the invitation to participate in the content validation process. These experts had extensive experience in medical education, student assessment, and psychological well-being (Appendix V). Following the principles for calculating CVIs [55, 57], all items (guidelines) with an I-CVI of 0.83 or higher were deemed relevant and retained for the response process study. A global acceptance level of 0.80 or higher for the S-CVI/UA and S-CVI/Ave indicates that all components are relevant to the framework. Before the subsequent step, modifications were made based on the experts’ feedback, and items scoring less than 0.83 were eliminated. Table 3 shows the list of guidelines based on the content validation.

Table 3 The CVIs of the guidelines after removing four items

Figure 2 illustrates the configuration of the relationships among the findings of phase I of the study. The right side of Fig. 2 presents the recommended guidelines for fostering resilience through five phases of assessment. They are connected to the four resilience constructs with arrows that point in both directions to illustrate the reciprocal relationship between guidelines and resilience constructs. The four resilience constructs were linked to the theoretical foundations of academic resilience and the assessment system, and they were shown to have an inverse relationship with test anxiety.

Fig. 2

Mapping phase I output with the SAR framework and its guidelines

Phase III

Figures 3, 4, 5, and 6 display the results of the experts’ ranking and sorting. Each figure represents a resilience construct and its corresponding guidelines, on which at least 50% of the experts agreed.

Fig. 3

Sorting and ranking of guidelines relating to the “self-control” construct

Fig. 4

Sorting and ranking of guidelines relating to the “management of resources” construct

Fig. 5

Sorting and ranking of guidelines relating to the “engagement” construct

Fig. 6

Sorting and ranking of guidelines relating to the “growth” construct

Phase IV

The response process aimed to evaluate the use of SAR and its guidelines among medical teachers. Twelve of the 20 invited medical teachers responded to the invitation. They came from various disciplines and universities (Appendix VI). On the basis of the FVI [60], 17 items (guidelines) with an I-FVI of 0.83 or higher were retained, indicating their clarity, while two guidelines with scores below the threshold were eliminated. Table 4 presents the final list of guidelines. As a result, the overall framework S-FVI/UA and S-FVI/Ave improved from 0.92 to 0.94, indicating the clarity and understandability of all framework components (Table 4).

Table 4 The FVI indices of SAR after removing some guidelines

The medical teachers also gave encouraging written insights about the guidelines and their clarity (Appendix VII), which are summarised as a word cloud in Fig. 7.

Fig. 7

Word cloud of the most common comments made by medical educators

The SAR framework

The SAR framework (Fig. 8) incorporates the overarching relationship between the four resilience constructs [34], the five phases of the assessment process, and their relevant strategies for promoting resilience. The four constructs of resilience include 1) self-control, meaning that students should be able to govern themselves and face adversity, 2) management, which describes students’ ability to use available resources effectively to overcome obstacles, 3) engagement, which refers to students’ ability to be involved and committed to pursuing the challenge, and 4) growth, which reflects students’ ongoing development to face future challenges. The four constructs are part of an ongoing continuous cycle.

Fig. 8

The Systematic Assessment for Resilience (SAR) framework

The ‘assessment experience’ phase is represented as the nucleus of the framework, as it is the core of the assessment process. In this phase, students’ resilience may be promoted through various strategies, such as increasing the frequency of formative assessment, encouraging targeted mock exams [87,88,89,90,91], promoting collaborative assessment, open book exams, and peer assessment, and introducing progress testing.

In the ‘assessment direction’ phase, students’ knowledge of the assessment scope and process may be improved through various strategies, including sharing the assessment mapping and the assessment rubric, briefing on the overall assessment coverage, establishing a briefing session before the exam, and familiarising students with the assessment methods. Such strategies will foster resilience, especially the self-control and growth constructs.

In the ‘assessment preparation’ phase, students’ cognitive, mental, and psychomotor preparedness is improved to maximise their assessment performance. Various strategies may be used, such as advising students on study skills, time management, and exam skills, directing students to good materials for revision, and providing strategies for students to reduce test anxiety. These strategies will promote resilience, especially the self-control and management constructs.

The ‘examiner focus’ phase deals with examiners’ behaviour to maximise students’ performance and reduce their anxiety. This phase can be improved by establishing a non-threatening environment during examinations, for example, by smiling, engaging in welcoming and professional behaviour, building rapport, and showing a sense of humour. All of these will foster resilience, especially the engagement and management constructs.

The ‘student reflection’ phase promotes students’ reflection by providing space in the written assessment for students to offer feedback on the examiner. Such strategies will improve students’ resilience, particularly the engagement and growth constructs.

Discussion

As the prevalence of pathological stress among medical and HPE students continues to rise [3,4,5,6,7,8], a number of interventional programmes have been designed to improve their mental health [15]. The primary criticism of these programmes is their emphasis on causes other than the primary cause, which is the assessment. This study provides a methodical approach to promoting resilience through assessment practice. A variety of strategies promote resilience across the five phases of assessment. Resilience comprises four constructs, which are shown in Fig. 8. In the following sections, we discuss each assessment phase and how it can promote resilience.

Assessment experience

The assessment experience phase is purposively located at the centre of this framework, as it promotes the four resilience constructs. The assessment experience emphasises the frequency of formative assessment or any other assessment experience (e.g. a mock exam), offering students an opportunity to engage in a simulated challenge similar to the real assessment. Such an experience creates a space for self-regulation and thus enhances self-efficacy and learning growth [92]. There is a growing body of research highlighting the crucial role of formative assessment in maximising learning [93, 94]. Formative assessment serves as a tool for practicing assessment, reflecting on performance, and identifying weak points and opportunities to improve actual future performance [68]. In this study, the strategies within the assessment experience were designed to maximise self-regulatory learning and evaluative judgment [92, 93]. In a nutshell, the more exposure to the assessment experience, the greater the options for strengthening evaluative judgment and hence self-regulation, which is expressed as ‘self-control’ in our framework. Furthermore, increasing ‘assessment experience’ will lead students to focus on the prudent use of available resources (‘management’), to vividly experience the actual assessment (‘engagement’), and to build self-esteem, in other words, ‘growth’ (Fig. 8).

Assessment direction

The current assessment practice in the medical and HPE fields is competency-based assessment [95,96,97]. Consequently, it is crucial to communicate with students clearly about these competencies and how they will be evaluated [98]. The ‘assessment direction’ involves directing students towards assessment by providing them with clear instructions on how the assessment will be administered and what is expected of them. Knowing what is expected of them during assessment will enable them to shape their learning and direct their efforts to achieve these objectives [99]. A considerable body of studies has shown that defining the assessment expectations (objectives) will allow students to target their efforts to achieve them [98, 100]. However, there is an issue in terms of which assessment criteria and standards should be communicated to students. There is no research consensus on suitable methods for communicating these to students. In our framework, we provide a variety of strategies examiners could use to direct students toward assessment.

Articulating the assessment direction has benefits for both the examinee and examiner. The examinee can then tailor his or her efforts to the assessment direction. The examiner can select suitable assessment modalities to measure the desired outcomes [96]. Additionally, it reduces the burden on examiners related to answering examinees’ questions about exams. Moreover, examinees’ test anxiety will be reduced, and they will have more agency in meeting the expectations [101]. In summary, directing students toward assessment will enhance their self-control, supporting self-regulated learning [93], and empower them to grow and face future challenges [100].

Assessment preparation

When guiding students in the assessment direction, it is important to ensure that they have the proper tools to succeed. Hence, assessment preparation plays an important role in meeting the challenge of assessment. Studies have shown that helping students control their negative thoughts and advising them on learning skills and time management will maximise their learning [102, 103]. In our framework, we believe that teachers play the central role in shaping learners’ behaviour. In addition to preparing high-quality learning materials, teachers can also provide students with rich resources to improve their mental well-being and reduce the negative effects of assessment. This practice will enhance students’ self-control and shift their mindset so they can use the resources around them to face assessment challenges.

Examiner focus

This framework highlights the examiner’s conduct during the exam. The direct encounter between examiner and examinee has a psychological dimension, creating life-altering memories that can either destroy or reinforce the examinee’s self-esteem and, consequently, resilience. While the presence of an examiner in the examination room automatically causes test anxiety, the situation will become worse if there is a lack of proper communication or if the examiner chooses to fail students based on personal preferences or biases [51]. This negative situation has been dubbed ‘the hawk effect’ [104]. Our framework proposes techniques to mitigate this unintended consequence and foster an atmosphere conducive to reciprocal communication and learning, which automatically enhances students’ ‘engagement’ and prepares them for similar situations in the future by managing learning resources wisely and effectively.

Student reflection

Self-assessment (or self-reflection) has been proven to be an effective approach to support learning engagement. In self-reflection, students evaluate their performance related to both internally set goals and externally set criteria [105]. In this framework, teachers (assessors) provide systematised avenues for self-reflection to achieve the desired resilience construct: growth. While several studies have presented different approaches for promoting self-reflection, we encourage assessors to use the most common reflection method: feedback [100]. Nicol and Macfarlane‐Dick [68] described the most important aspect of high-quality feedback: ‘Good quality external feedback is information that helps students troubleshoot their own performance and self-correct; that is, it helps students take action to reduce the discrepancy between their intentions and the resulting effects’ [68]. Numerous studies on HPE have promoted the use of feedback [106], and it is a key component in the programmatic assessment framework [107]. Both feedback and self-reflection support each other to maximise learning and hence ‘growth’ [108]. Through this feedback, students will receive constructive comments regarding their performance based on the teacher’s expectation or established criteria, which they can then use internally to redesign a suitable learning path to achieve their goals [100]. Studies have shown that students’ self-reflection leads to deep learning, self-efficacy, self-regulation, and personal growth [93, 106]. While some researchers argue that students should be trained in self-regulation, others contend that self-regulation is a spontaneous process that can be maximised by providing a suitable platform to practice it [109]. Consequently, the framework provides additional resources for assessors to promote self-reflection.

Limitations and future perspectives

During the literature review phase of developing the SAR framework, efforts were made to broaden the search of the scoping review to include literature in HPE rather than just medicine, and to conduct narrative reviews that considered higher education in general. However, the FGD only included one medical school. Another limitation of this research was that it only included articles written in English, which may introduce bias (also known as language bias [110]) and result in the omission of important cultural contexts and necessary details in data synthesis. However, the triangulation of the findings with those from the scoping review, other narrative reviews, and the FGD mitigated the aforementioned limitations [45].

Conclusions

Resilience is a desirable construct for medical and health professions students: it fosters several characteristics graduates need to meet future challenges and adversities. The current study presents a systematic method for fostering student resilience through assessment practice. Grounded in rigorous methodological research and a theoretical foundation, the study provides a set of practical strategies for promoting resilience. Through five phases of assessment, namely assessment direction, assessment preparation, assessment experience, examiner focus, and student reflection, the SAR framework aims to promote four resilience constructs: self-control, management, engagement, and growth.

Availability of data and materials

The de-identified datasets are available from the corresponding author on request. However, due to privacy concerns, the FGD transcripts are not publicly available.

Abbreviations

CVI:

Content Validity Index

FGD:

Focus Group Discussion

FVI:

Face Validity Index

HPE:

Health Professions Education

SAR:

Systematic Assessment for Resilience

TA:

Test Anxiety

References

  1. Veal CT. We burn out, We break, We die: medical schools must change their culture to preserve medical student mental health. Acad Med. 2021;96(5):629–31. https://doi.org/10.1097/acm.0000000000003991.


  2. Tian-Ci Quek T, Tam W-S, Tran BX, Zhang M, Zhang Z, Su-Hui Ho C, et al. The global prevalence of anxiety among medical students: a meta-analysis. Int J Environ Res Public Health. 2019;16(15):2735. https://doi.org/10.3390/ijerph16152735.

  3. Erschens R, Keifenheim KE, Herrmann-Werner A, Loda T, Schwille-Kiuntke J, Bugaj TJ, et al. Professional burnout among medical students: systematic literature review and meta-analysis. Med Teach. 2019;41(2):172–83. https://doi.org/10.1080/0142159X.2018.1457213.


  4. Frajerman A, Morvan Y, Krebs M-O, Gorwood P, Chaumette B. Burnout in medical students before residency: a systematic review and meta-analysis. Eur Psychiatr. 2019;55:36–42. https://doi.org/10.1016/j.eurpsy.2018.08.006.


  5. Abdalla ME, Shorbagi S. Challenges faced by medical students during their first clerkship training: a cross-sectional study from a medical school in the Middle East. J Taibah Univ Med Sci. 2018;13(4):390–4. https://doi.org/10.1016/j.jtumed.2018.03.008.


  6. Jordan RK, Shah SS, Desai H, Tripi J, Mitchell A, Worth RG. Variation of stress levels, burnout, and resilience throughout the academic year in first-year medical students. PLoS ONE. 2020;15(10):e0240667. https://doi.org/10.1371/journal.pone.0240667.


  7. Ragab EA, Dafallah MA, Salih MH, Osman WN, Osman M, Miskeen E, et al. Stress and its correlates among medical students in six medical colleges: an attempt to understand the current situation. Middle East Curr Psychiatr. 2021;28(1):75. https://doi.org/10.1186/s43045-021-00158-w.


  8. Moir F, Yielder J, Sanson J, Chen Y. Depression in medical students: current insights. Adv Med Educ Pract. 2018;9:323–33. https://doi.org/10.2147/AMEP.S137384.


  9. Ribeiro ÍJS, Pereira R, Freire IV, de Oliveira BG, Casotti CA, Boery EN. Stress and quality of life among university students: a systematic literature review. Health Prof Educ. 2018;4(2):70–7. https://doi.org/10.1016/j.hpe.2017.03.002.


  10. Flaherty JA, Richman JA. Substance use and addiction among medical students, residents, and physicians. Psychiatr Clin North Am. 1993;10:189. https://doi.org/10.1016/S0193-953X(18)30201-6.


  11. Hays LR, Cheever T, Patel P. Medical student suicide, 1989–1994. Am J Psychiatry. 1996;153(4):553. https://doi.org/10.1176/ajp.153.4.553.


  12. Newbury-Birch D, White M, Kamali F. Factors influencing alcohol and illicit drug use amongst medical students. Drug Alc Depend. 2000;59(2):125–30. https://doi.org/10.1016/S0376-8716(99)00108-8.


  13. Pickard M, Bates L, Dorian M, Greig H, Saint D. Alcohol and drug use in second-year medical students at the University of Leeds. Med Educ. 2000;34(2):148–50. https://doi.org/10.1046/j.1365-2923.2000.00491.x.

  14. Dyrbye LN, Thomas MR, Shanafelt TD. Systematic review of depression, anxiety, and other indicators of psychological distress among U.S. and Canadian medical students. Acad Med. 2006;81(4):354–73. https://doi.org/10.1097/00001888-200604000-00009.


  15. Yusoff MSB. Interventions on medical students’ psychological health: a meta-analysis. J Taibah Univ Med Sci. 2014;9(1):1–13. https://doi.org/10.1016/j.jtumed.2013.09.010.


  16. Yusoff MSB, Rahim AFA, Yaacob MJ. Prevalence and sources of stress among Universiti Sains Malaysia medical students. Malaysian J Med Sci: MJMS. 2010;17(1):30. https://doi.org/10.5959/eimj.v5i4.190.

  17. Yusoff MSB, Yee LY, Wei LH, Siong TC, Meng LH, Bin LX, et al. A study on stress, stressors and coping strategies among Malaysian medical students. International Journal of Students’ Research. 2011;1(2). (Retrieved from: http://www.ijsronline.net/article.asp?issn=2321-6662;year=2011;volume=1;issue=2;spage=45;epage=50;aulast=Yusoff;type=0).

  18. Yusoff MSB. Impact of summative assessment on first year medical students’ mental health. International Medical Journal. 2011;18(3):172–5. (Retrieved from: https://www.researchgate.net/publication/215632306_Impact_of_Summative_Assessment_on_First_Year_Medical_Students'_Mental_Health).

  19. Aziz N, Serafi AH. Increasing levels of test anxiety and psychological distress with advancing years of medical education. British Journal of Medical and Health Research. 2017;4(3). (Retrieved from: https://www.researchgate.net/publication/315830384_Increasing_Levels_of_Test_Anxiety_and_Psychological_Distress_with_Advancing_Years_of_Medical_Education#fullTextFileContent).

  20. Boparai JK, Gupta AK, Singh A, Matreja PS, Khanna PML, Garg P. Impact of test anxiety on psychomotor functions and satisfaction with life of medical undergraduates during second professional curriculum. Education in Medicine Journal. 2013;5(4):e6-e11. https://doi.org/10.5959/eimj.v5i4.172

  21. Wald HS. Optimizing resilience and wellbeing for healthcare professions trainees and healthcare professionals during public health crises–Practical tips for an ‘integrative resilience’approach. Med Teach. 2020;42(7):744–55. https://doi.org/10.1080/0142159X.2020.1768230.


  22. Bonanno GA. Loss, trauma, and human resilience: have we underestimated the human capacity to thrive after extremely aversive events? Am Psychol. 2004;59(1):20. https://doi.org/10.1037/0003-066X.59.1.20.


  23. Bonanno GAP, Mancini ADP. The human capacity to thrive in the face of potential trauma. Pediatrics. 2008;121(2):369. https://doi.org/10.1542/peds.2007-1648.


  24. Nucifora F, Langlieb AM, Siegal E, Everly GS, Kaminsky M. Building resistance, resilience, and recovery in the wake of school and workplace violence. Disast Med Public Health Prep. 2007;1(S1):S33–7. https://doi.org/10.1097/DMP.0b013e31814b98ae.


  25. Zwack J, Schweitzer J. If every fifth physician is affected by burnout, what about the other four? Resilience strategies of experienced physicians. Acad Med. 2013;88(3):382–9. https://doi.org/10.1097/ACM.0b013e318281696b.


  26. Schwarz S. Resilience in psychology: a critical analysis of the concept. Theory Psychol. 2018;28(4):528–41. https://doi.org/10.1177/0959354318783584.


  27. Sergeant J, Laws-Chapman C. Creating a positive workplace culture. Nurs Manage. 2012;18(9):14–9. https://doi.org/10.7748/nm2012.02.18.9.14.c8889.

  28. Alva SA. Academic invulnerability among Mexican-American students: the importance of protective resources and appraisals. Hispanic J Behav Sci. 1991;13(1):18–34. https://doi.org/10.1177/07399863910131002.


  29. Ye W, Strietholt R, Blömeke S. Academic resilience: underlying norms and validity of definitions. Educ Assess, Eval Accountabil. 2021;33(1):169–202. https://doi.org/10.1007/s11092-020-09351-7.

  30. García-Crespo FJ, Fernández-Alonso R, Muñiz J. Academic resilience in European countries: the role of teachers, families, and student profiles. PLoS ONE. 2021;16(7):e0253409. https://doi.org/10.1371/journal.pone.0253409.


  31. Huey CWT, Palaganas JC. What are the factors affecting resilience in health professionals? A synthesis of systematic reviews. Med Teach. 2020;42(5):550–60. https://doi.org/10.1080/0142159X.2020.1714020.


  32. Martin A. Motivation and academic resilience: developing a model for student enhancement. Aust J Educ. 2002;46(1):34–49. https://doi.org/10.1177/000494410204600104.

  33. Wadi M, Nordin NI, Syazni N, Roslan TC, Yusoff MSB. Reframing resilience concept: insights from a meta-synthesis of 21 resilience scales. Educ Med J. 2020;12(2):3–22. https://doi.org/10.21315/eimj2020.12.2.2.


  34. Cooke GP, Doust JA, Steele MC. A survey of resilience, burnout, and tolerance of uncertainty in Australian general practice registrars. BMC Med Educ. 2013;13(1):2. https://doi.org/10.1186/1472-6920-13-2.


  35. Fox S, Lydon S, Byrne D, Madden C, Connolly F, O’Connor P. A systematic review of interventions to foster physician resilience. Postgrad Med J. 2018;94(1109):162–70. https://doi.org/10.1136/postgradmedj-2017-135212.


  36. Sood A, Sharma V, Schroeder DR, Gorman B. Stress Management and Resiliency Training (SMART) program among department of radiology faculty: a pilot randomized clinical trial. EXPLORE: J Sci Healing. 2014;10(6):358–63. https://doi.org/10.1016/j.explore.2014.08.002.


  37. Mache S, Baresi L, Bernburg M, Vitzthum K, Groneberg D. Being prepared to work in gynecology medicine: evaluation of an intervention to promote junior gynecologists professionalism, mental health and job satisfaction. Arch Gynecol Obstetr. 2017;295(1):153–62. https://doi.org/10.1007/s00404-016-4223-6.

  38. Pliego JF, Wehbe-Janek H, Rajab MH, Browning JL, Fothergill RE. OB/GYN boot camp using high-fidelity human simulators: enhancing residents’ perceived competency, confidence in taking a leadership role, and stress hardiness. Simul Healthc. 2008;3(2):82–9. https://doi.org/10.1097/SIH.0b013e3181658188.


  39. Runyan C, Savageau JA, Potts S, Weinreb L. Impact of a family medicine resident wellness curriculum: a feasibility study. Med Educ Online. 2016;21(1):30648. https://doi.org/10.3402/meo.v21.30648.


  40. Sood A, Prasad K, Schroeder D, Varkey P. Stress management and resilience training among department of medicine faculty: a pilot randomized clinical trial. J Gen Internal Med. 2011;26(8):858–61. https://doi.org/10.1007/s11606-011-1640-x.


  41. Joyce S, Shand F, Tighe J, Laurent SJ, Bryant RA, Harvey SB. Road to resilience: a systematic review and meta-analysis of resilience training programmes and interventions. BMJ Open. 2018;8(6):e017858. https://doi.org/10.1136/bmjopen-2017-017858.


  42. van Kessel G, Brewer M, Lane M, Cooper B, Naumann F. A principle-based approach to the design of a graduate resilience curriculum framework. Higher Educ Res Dev. 2022;41(4):1325–39. https://doi.org/10.1080/07294360.2021.1882400.


  43. Whetten DA. What constitutes a theoretical contribution? Acad Manage Rev. 1989;14(4):490–5. https://doi.org/10.5465/amr.1989.4308371.


  44. Hussein A. The use of triangulation in social sciences research: can qualitative and quantitative methods be combined? J Compar Soc Work. 2009;4(1):106–17. https://doi.org/10.31265/jcsw.v4i1.48.


  45. Finfgeld-Connett D. Generalizability and transferability of meta-synthesis research findings. J Adv Nurs. 2010;66(2):246–54. https://doi.org/10.1111/j.1365-2648.2009.05250.x.


  46. Green BN, Johnson CD, Adams A. Writing narrative literature reviews for peer-reviewed journals: secrets of the trade. J Chiropractic Med. 2006;5(3):101–17. https://doi.org/10.1016/S0899-3467(07)60142-6.


  47. Wadi M, Yusoff MSB, Abdul Rahim AF, Lah NAZN. Factors influencing test anxiety in health professions education students: a scoping review. SN Soc Sci. 2022;2(174):1–25. https://doi.org/10.1007/s43545-022-00459-9.


  48. Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. 2005;8(1):19–32. https://doi.org/10.1080/1364557032000119616.


  49. Tricco AC, Lillie E, Zarin W, O’Brien KK, Colquhoun H, Levac D, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Internal Med. 2018;169(7):467–73. https://doi.org/10.7326/M18-0850.


  50. Wadi M, Yusoff MSB, Abdul Rahim AF, Lah NAZN. Factors affecting test anxiety: a qualitative analysis of medical students’ views. BMC Psychol. 2022;10(1):8. https://doi.org/10.1186/s40359-021-00715-2.


  51. Fowell S, Southgate L, Bligh J. Evaluating assessment: the missing link? Med Educ. 1999;33(4):276–81. https://doi.org/10.1046/j.1365-2923.1999.00405.x.


  52. Lineberry M. Assessment affecting learning. Assessment in health professions education. 2nd ed. New York: Routledge Taylor and Francis Group; 2019. p. 257–71.

  53. Lynn MR. Determination and quantification of content validity. Nurs Res. 1986;35(6):382–5. https://doi.org/10.1097/00006199-198611000-00017.

  54. Yusoff MSB. ABC of content validation and content validity index calculation. Educ Med J. 2019;11(2):49–54. https://doi.org/10.21315/eimj2019.11.2.6.

  55. Grant JS, Davis LL. Selection and use of content experts for instrument development. Res Nurs Health. 1997;20(3):269–74. https://doi.org/10.1002/(SICI)1098-240X(199706)20:3%3c269::AID-NUR9%3e3.0.CO;2-G.


  56. Polit DF, Beck CT. The content validity index: are you sure you know what’s being reported? Critique and recommendations. Res Nurs Health. 2006;29(5):489–97. https://doi.org/10.1002/nur.20147.

  57. Wolfe DA, Stasny EA. Encyclopedia of survey research methods. Thousand Oaks, CA: Sage Publications, Inc.; 2008. Available from: https://methods.sagepub.com/reference/encyclopedia-of-survey-research-methods.

  58. Cook DA, Beckman TJ. Current concepts in validity and reliability for psychometric instruments: theory and application. Am J Med. 2006;119(2):166.e7-e16. https://doi.org/10.1016/j.amjmed.2005.10.036.


  59. Yusoff MSB. ABC of response process validation and face validity index calculation. Educ Med J. 2019;11(3):55–61. https://doi.org/10.21315/eimj2019.11.3.6.


  60. Martin AJ, Marsh HW. Academic resilience and its psychological and educational correlates: a construct validity approach. Psychol Schools. 2006;43(3):267–81. https://doi.org/10.1002/pits.20149.


  61. Dunn LB, Iglewicz A, Moutier C. A conceptual model of medical student well-being: promoting resilience and preventing burnout. Acad Psychiatry. 2008;32(1):44–53. https://doi.org/10.1176/appi.ap.32.1.44.


  62. Martin AJ, Marsh HW. Academic buoyancy: towards an understanding of students’ everyday academic resilience. J School Psychol. 2008;46(1):53–83. https://doi.org/10.1016/j.jsp.2007.01.002.


  63. Kunicki ZJ, Harlow LL. Towards a higher-order model of resilience. Soc Indicat Res. 2020;151(1):329–44. https://doi.org/10.1007/s11205-020-02368-x.


  64. Van Der Vleuten CPM. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ. 1996;1(1):41–67. https://doi.org/10.1007/BF00596229.


  65. Gibbs G, Simpson C, Macdonald R, editors. Improving student learning through changing assessment–a conceptual and practical framework. European Association for Research into Learning and Instruction Conference, Padova, Italy; 2003: Citeseer. (Retrieved from: https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=ca609b98befc83caf868ca6c28226cc8acc44d51).

  66. Baartman LK, Bastiaens TJ, Kirschner PA, Van der Vleuten CP. The wheel of competency assessment: presenting quality criteria for competency assessment programs. Stud Educ Eval. 2006;32(2):153–70. https://doi.org/10.1016/j.stueduc.2006.04.006.


  67. Nicol DJ, Macfarlane-Dick D. Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Stud Higher Educ. 2006;31(2):199–218. https://doi.org/10.1080/03075070600572090.


  68. Dijkstra J, Van der Vleuten C, Schuwirth L. A new framework for designing programmes of assessment. Adv Health Sci Educ. 2010;15(3):379–93. https://doi.org/10.1007/s10459-009-9205-z.


  69. Norcini J, Anderson B, Bollela V, Burch V, Costa MJ, Duvivier R, et al. Criteria for good assessment: consensus statement and recommendations from the Ottawa 2010 conference. Med Teach. 2011;33(3):206–14. https://doi.org/10.3109/0142159X.2011.551559.


  70. Cilliers FJ, Schuwirth LWT, van der Vleuten CPM. Modelling the pre-assessment learning effects of assessment: evidence in the validity chain. Med Educ. 2012;46(11):1087–98. https://doi.org/10.1111/j.1365-2923.2012.04334.x.


  71. ASPIRE. ASPIRE recognition of excellence in assessment in a medical school. 2013. Available at: http://www.aspire-to-excellence.org/Areas+of+Excellence/.

  72. Sarason IG. Introduction to the study of test anxiety. In: Sarason IG, editor. Test anxiety: Theory, research, and applications. Hillsdale, NJ: Lawrence Erlbaum Assoc Incorporated; 1980. p. 3–14.


  73. Naveh-Benjamin M, McKeachie WJ, Lin YG, Holinger DP. Test anxiety: Deficits in information processing. J Educ Psychol. 1981;73(6):816–24. https://doi.org/10.1037/0022-0663.73.6.816.


  74. Hodapp V, Henneberger A. Test anxiety, study habits, and academic performance. In: Spielberger CD, van der Ploeg HM, Schwarzer R, editors. Advances in test anxiety research, vol. 2. Lisse, the Netherlands: Swets and Zeitlinger; 1983. p. 119–27. (Retrieved from: https://www.researchgate.net/profile/Wim-Kleijn/publication/15307613_Cognition_Study_Habits_Test_Anxiety_and_Academic_Performance/links/56f06bd008ae70bdd6c94b77/Cognition-Study-Habits-Test-Anxiety-and-Academic-Performance.pdf).

  75. Smith RJ, Arnkoff DB, Wright TL. Test anxiety and academic competence: A comparison of alternative models. J Counsel Psychol. 1990;37(3):313–21. https://doi.org/10.1037/0022-0167.37.3.313.


  76. Carver CS, Scheier MF. Origins and functions of positive and negative affect: A control-process view. Psychol Rev. 1990;97(1):19–35. https://doi.org/10.1037/0033-295X.97.1.19.


  77. Covington MV. Making the grade: A self-worth perspective on motivation and school reform. New York, NY, US: Cambridge University Press; 1992. https://doi.org/10.1017/CBO9781139173582.

  78. Spielberger CD, Vagg PR. Test anxiety: Theory, assessment, and treatment. Washington, DC: Taylor & Francis; 1995.


  79. Sansgiry SK. Effect of students’ perceptions of course load on test anxiety. Am J Pharm Educ. 2006;70(2):6–26. https://doi.org/10.5688/aj700226.

  80. Sansgiry S, Bhosle M, Dutta AP. Predictors of test anxiety in doctor of pharmacy students: an empirical study. Pharm Educ. 2005;5(2):121–9. https://doi.org/10.1080/15602210500176941.

  81. Zhang N, Henderson CN. Predicting stress and test anxiety among 1st-year chiropractic students. J Chiropractic Educ. 2019;33(2):133–9. https://doi.org/10.7899/JCE-18-11.


  82. Edelman M, Ficorelli C. A measure of success: nursing students and test anxiety. J Nurs Staff Dev. 2005;21(2):55–9. https://doi.org/10.1097/00124645-200503000-00004.


  83. Loya NS, Jiwane NN. Exam anxiety in professional medical students. Int J Res Rev. 2019. https://mail.jpma.org.pk/PdfDownload/1364

  84. Son HK, So W-Y, Kim M. Effects of aromatherapy combined with music therapy on anxiety, stress, and fundamental nursing skills in nursing students: a randomized controlled trial. Int J Environ Res Public Health. 2019;16(21):4185. https://doi.org/10.3390/ijerph16214185.


  85. Brodersen LD. Interventions for test anxiety in undergraduate nursing students: an integrative review. Nurs Educ Perspect. 2017;38(3):131–7. https://doi.org/10.1097/01.NEP.0000000000000142.


  86. Shapiro AL. Test anxiety among nursing students: a systematic review. Teach Learn Nurs. 2014;9(4):193–202. https://doi.org/10.1016/j.teln.2014.06.001.


  87. Johnson CE. Effect of aromatherapy on cognitive test anxiety among nursing students. Altern Complement Ther. 2014;20(2):84–7. https://doi.org/10.1089/act.2014.20207.


  88. Furlong E, Fox P, Lavin M, Collins R. Oncology nursing students’ views of a modified OSCE. Eur J Oncol Nurs : J Eur Oncol Nurs Soc. 2005;9(4):351–9. https://doi.org/10.1016/j.ejon.2005.03.001.


  89. Young I, Montgomery K, Kearns P, Hayward S, Mellanby E. The benefits of a peer-assisted mock OSCE. Clin Teach. 2014;11(3):214–8. https://doi.org/10.1111/tct.12112.

  90. Zhang N, Walton DM. Why so stressed? A descriptive thematic analysis of physical therapy students’ descriptions of causes of anxiety during objective structured clinical exams. Physiother Can. 2018;70(4):356–62. https://doi.org/10.3138/ptc.2016-102.e.


  91. Panadero E, Broadbent J, Boud D, Lodge JM. Using formative assessment to influence self- and co-regulated learning: the role of evaluative judgement. Eur J Psychol Educ. 2019;34(3):535–57. https://doi.org/10.1007/s10212-018-0407-8.

  92. Panadero E, Andrade H, Brookhart S. Fusing self-regulated learning and formative assessment: a roadmap of where we are, how we got here, and where we are going. Aust Educ Res. 2018;45(1):13–31. https://doi.org/10.1007/s13384-018-0258-y.

  93. Morris R, Perry T, Wardle L. Formative assessment and feedback for learning in higher education: a systematic review. Rev Educ. 2021;9(3):e3292. https://doi.org/10.1002/rev3.3292.


  94. Powell DE, Carraccio C. Toward competency-based medical education. N Engl J Med. 2018;378(1):3–5. https://doi.org/10.1056/NEJMp1712900.

  95. Van Melle E, Frank JR, Holmboe ES, Dagnone D, Stockley D, Sherbino J, et al. A core components framework for evaluating implementation of competency-based medical education programs. Acad Med. 2019;94(7):1002–9. https://doi.org/10.1097/acm.0000000000002743.

  96. Schuwirth LWT, van der Vleuten CPM. A history of assessment in medical education. Adv Health Sci Educ. 2020;25(5):1045–56. https://doi.org/10.1007/s10459-020-10003-0.


  97. To J, Panadero E, Carless D. A systematic review of the educational uses and effects of exemplars. Assess Eval Higher Educ. 2021;47:1–16. https://doi.org/10.1080/02602938.2021.2011134.

  98. Rust C, Price M, O’Donovan B. Improving students’ learning by developing their understanding of assessment criteria and processes. Assess Eval Higher Educ. 2003;28(2):147–64. https://doi.org/10.1080/02602930301671.

  99. Nicol D. The power of internal feedback: exploiting natural comparison processes. Assess Eval Higher Educ. 2021;46(5):756–78. https://doi.org/10.1080/02602938.2020.1823314.

  100. De La Fuente J, López-García M, Mariano-Vera M, Martínez-Vicente JM, Zapata L. Personal self-regulation, learning approaches, resilience and test anxiety in psychology students. Estud Sobre Educ. 2017;32:9–26. https://doi.org/10.15581/004.32.9-26.


  101. Cipra C, Müller-Hilke B. Testing anxiety in undergraduate medical students and its correlation with different learning approaches. PLoS ONE. 2019;14(3):e0210130. https://doi.org/10.1371/journal.pone.0210130.


  102. Ahmady S, Khajeali N, Kalantarion M, Sharifi F, Yaseri M. Relation between stress, time management, and academic achievement in preclinical medical education: a systematic review and meta-analysis. J Educ Health Promot. 2021;10:32. https://doi.org/10.4103/jehp.jehp_600_20.


  103. McManus IC, Thompson M, Mollon J. Assessment of examiner leniency and stringency (‘hawk-dove effect’) in the MRCP(UK) clinical examination (PACES) using multi-facet Rasch modelling. BMC Med Educ. 2006;6(1):42. https://doi.org/10.1186/1472-6920-6-42.

  104. McKevitt CT. Engaging students with self-assessment and tutor feedback to improve performance and support assessment capacity. J Univ Teach Learn Pract. 2016;13(1):2. https://doi.org/10.53761/1.13.1.2.


  105. Bing-You R, Hayes V, Varaklis K, Trowbridge R, Kemp H, McKelvy D. Feedback for learners in medical education: what is known? A scoping review. Acad Med. 2017;92(9):1346. https://doi.org/10.1097/ACM.0000000000001578.


  106. van der Vleuten CPM, Schuwirth LWT, Driessen EW, Dijkstra J, Tigelaar D, Baartman LKJ, et al. A model for programmatic assessment fit for purpose. Med Teach. 2012;34(3):205–14. https://doi.org/10.3109/0142159X.2012.652239.


  107. Branch WT Jr, Paranjape A. Feedback and reflection teaching methods for clinical settings. Acad Med. 2002;77(12 Part 1):1185. https://doi.org/10.1097/00001888-200212000-00005.


  108. Veine S, Anderson MK, Andersen NH, Espenes TC, Søyland TB, Wallin P, et al. Reflection as a core student learning activity in higher education - Insights from nearly two decades of academic development. Int J Acad Dev. 2020;25(2):147–61. https://doi.org/10.1080/1360144X.2019.1659797.

  109. Stern C, Kleijnen J. Language bias in systematic reviews: you only get out what you put in. JBI Evid Synth. 2020;18(9):1818–9. https://doi.org/10.11124/jbies-20-00361.


  110. Neimann Rasmussen L, Montgomery P. The prevalence of and factors associated with inclusion of non-English language studies in Campbell systematic reviews: a survey and meta-epidemiological study. Syst Rev. 2018;7(1):129. https://doi.org/10.1186/s13643-018-0786-6.



Acknowledgements

Not applicable.

Funding

This research is part of a larger project supported by the Fundamental Research Grant Scheme (FRGS/1/2018/SSI09/USM/02/2), Ministry of Higher Education, Malaysia.

Author information

Authors and Affiliations

Authors

Contributions

Each author made a substantial contribution to this paper. MW, MSBY, and MHT performed the three narrative reviews, selected the relevant papers, and compiled them in tables. MW, MSBY, AFA, and NZ performed the scoping review and FGD. MW, MSBY, and AFA generated the SAR guidelines and configured their relationship with the findings of the scoping review and FGD. MW, MSBY, MHT, and SS laid out the SAR framework in its final form. MW initiated the writing of the paper, and MW and SS contributed to describing the SAR framework. MHT, AFA, NZ, SS, and MSBY reviewed the framework carefully. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Muhamad Saiful Bahri Yusoff.

Ethics declarations

Ethical approval and consent to participate

Ethical approval was obtained from the human research ethics committee at the School of Medical Sciences, Universiti Sains Malaysia (JEPeM USM Code: USM/JEPeM/18060286), in accordance with the Belmont Report and Helsinki Declaration. All participants signed a written informed consent form containing clear information about the study’s purpose, methods, and secure data handling procedures. They were also informed that they could withdraw from the study verbally or in writing at any time.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: Appendix I.

(Three narrative reviews). Table S1. The identified theoretical frameworks of academic resilience. Table S2. The identified theoretical frameworks of assessment. Table S3. The identified theoretical frameworks of test anxiety. Appendix II. (Scoping review). Figure S1. PRISMA chart of the scoping review identifying factors related to test anxiety. Table S4. Descriptive variables of the included studies (n = 74). Table S5. Themes of factors that increase test anxiety. Table S6. Themes of factors that decrease test anxiety. Appendix III. (Focus Group Discussion). Figure S2. Emerged themes and sub-themes in relation to increasing and decreasing test anxiety. Table S7. Emerged themes and sub-themes from FGD with supporting quotations. Appendix IV. (Generating resilience guidelines based on the scoping review and FGD). Figure S3. Relation of the proposed guidelines with outputs of scoping review and FGD. Appendix V. (Content validation). Table S8. Content Validity Index - guidelines rated as 3 or 4 (relevant) are ticked on the table. Table S9. Acceptable values for content validity indices (104). Table S10. Demographic data of the expert panels. Table S11. The original version of the SAR guidelines that was sent for content validation. Appendix VI. (Response process). Table S12. Content Validity Index - guidelines rated as 3 or 4 (relevant) are ticked on the table. Table S13. Demographic data of the panels in the face validation study. Table S14. The FVI indices of the 19 SAR guidelines. Appendix VII. (Written responses of medical teachers using the SAR guidelines). Table S15. The medical teachers’ feedback after the response process.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Wadi, M.M., Yusoff, M.S.B., Taha, M.H. et al. The framework of Systematic Assessment for Resilience (SAR): development and validation. BMC Med Educ 23, 213 (2023). https://doi.org/10.1186/s12909-023-04177-5


Keywords