Construct validity of a questionnaire for measuring student engagement in problem-based learning tutorials
BMC Medical Education volume 23, Article number: 844 (2023)
Student engagement is the student's investment of time and energy in academic and non-academic experiences, including learning, teaching, research, governance, and community activities. Although previous studies provided some evidence on measuring student engagement in problem-based learning (PBL) tutorials, there are no existing quantitative studies in which the cognitive, behavioral, and emotional engagement of students in PBL tutorials is measured. Therefore, this study aims to develop and examine the construct validity of a questionnaire for measuring the cognitive, behavioral, and emotional engagement of students in PBL tutorials.
A 15-item questionnaire was developed guided by a previously published conceptual framework of student engagement. Focus group discussion (n = 12) with medical education experts was then conducted and the questionnaire was piloted with medical students. The questionnaire was then distributed to year 2 and 3 medical students (n = 176) in problem-based tutorial groups at the end of an integrated course, where PBL is the main strategy of learning. The validity of the internal structure of the questionnaire was tested by confirmatory factor analysis using structural equation modeling assuming five different models. Predictive validity evidence of the questionnaire was studied by examining the correlations between students’ engagement and academic achievement.
Confirmatory factor analysis indicated a good fit between the measurement and structural model of an 11-item questionnaire composed of a three-factor structure: behavioral engagement (3 items), emotional engagement (4 items), and cognitive engagement (4 items). Models in which the three latent factors were considered semi-independent provided the best fit. The construct reliabilities of the behavioral, cognitive, and emotional factors were 0.82, 0.82, and 0.76, respectively. The relationships between academic achievement and engagement were, however, weak or non-significant.
We found strong evidence supporting the construct validity of a three-factor structure for the student engagement in PBL tutorials questionnaire. Further studies are required to test the validity of this instrument in other educational settings. Predictive validity is another area needing further scrutiny.
Studies have demonstrated that student engagement is one of the most robust predictors of academic achievement and of increased student perseverance and retention. In addition, student engagement correlates with desirable mental health outcomes, such as low rates of depression and higher life satisfaction. Student engagement is also intrinsically rewarding for teachers, while student disengagement is a major factor in teacher burnout. Furthermore, student engagement is recognized as a measure of institutional quality and excellence. Despite the escalating interest in the construct of student engagement, there are several gaps in the medical education literature about its measurement and applications in different educational settings.
Recently, we conducted a scoping review on student engagement in undergraduate medical education and developed an integrated conceptual framework for student engagement in health professions education. This comprehensive framework contains the antecedents, mediators, dimensions, spheres, and outcomes of student engagement. According to this framework, student engagement comprises five dimensions: cognitive, behavioral, emotional, agentic, and socio-cultural. Cognitive engagement involves the student's psychological investment in learning: going beyond mere requirements, seeking challenges, directing effort towards understanding and mastering content, and utilizing metacognitive and deep learning strategies. Behavioral engagement refers to positive conduct, persistence, directing effort towards completing learning tasks, active participation, asking questions, maintaining focus, and engaging in school-based activities. Emotional engagement pertains to the student's emotional reactions in the classroom, in the school, or towards teachers, such as experiencing enjoyment, interest, happiness, and a sense of bonding. Agentic engagement emphasizes the active role of students in shaping their educational paths, future social lives, and broader social environments. Indicators of agentic engagement within the classroom may manifest as students actively contributing to their learning and influencing the instructional process. Agentic engagement beyond the classroom might encompass students taking an active role in community initiatives, engaging in peer teaching and mentoring, and participating in institutional governance and quality assurance efforts. Finally, socio-cultural engagement refers to the quality of students' interactions with, willingness to learn from, and ability to predict the actions of, different social and cultural groups. Sociocultural engagement develops when students immerse themselves in a new social setting and develop their unique identities.
This process of identity formation helps bridge the gap between their personal social and cultural values and the norms of the new community. Consequently, students may develop their identities by fostering a sense of belonging within the new community [14, 15]. In this article, we will confine ourselves to the study of the dimensions that are deemed directly important in the classroom setting: cognitive, emotional, and behavioral engagement. A second restriction is that we have focused on the problem-based small-group tutorial setting rather than on instructional environments in a broader sense. The study was conducted in such a setting.
Student engagement in problem-based learning (PBL) has been examined in previous studies [16,17,18,19,20]. Assessment of medical students' engagement with direct observation demonstrated that the amount of learner-to-learner engagement was similar in PBL and team-based learning (TBL) [17, 18], and much greater than in lecture-based teaching [17, 19], where most engagement was of the learner-to-instructor and self-engagement types. Also, learner-to-instructor engagement appeared greater in TBL compared with PBL [17, 18]. Another study developed and validated a 4-item questionnaire for measuring situational cognitive engagement in the PBL classroom. A recent study used video-stimulated recall as prompts for personal interviews to explore the dynamics of students' engagement in PBL tutorial groups. The authors demonstrated that engagement of students in one dimension leads to further engagement in other dimensions and that engagement is decided by students before the PBL session based on the available antecedents. Although these studies have used different methods of measuring student engagement in PBL tutorials, the scope of quantitative measurement by the instruments has been limited to one dimension of engagement, either behavioral or cognitive. Accordingly, there are no existing studies in the literature that measure multiple dimensions of students' engagement in PBL tutorials. Therefore, this study is designed to address the following research questions: 1. What is the content-related validity evidence of the questionnaire for measuring student engagement in PBL tutorials? 2. What is the internal structure validity evidence of the questionnaire for measuring student engagement in PBL tutorials? And 3. What is the relationship between student engagement in PBL tutorials and their academic achievement?
The present study used a cross-sectional correlation design. The questionnaire was designed based on a psychological perspective of engagement, and the student engagement construct was considered multidimensional. Student engagement is conceptualized in this study as the student's investment of time and energy in PBL tutorial experiences across the cognitive, affective, and behavioral dimensions. An initial questionnaire was designed to operationalize the three dimensions of the student engagement construct based on our previously published conceptual framework of engagement. A focus group discussion was then conducted with medical education experts (n = 12), who examined the degree of concordance between each item of the questionnaire and the intended construct, as well as the clarity of the items. The outcome of the focus group discussion was that experts agreed to include all 15 items with slight modifications, with agreement ranging from 60 to 100% per item. The questionnaire was then pilot tested with a small group of year 2 medical students (n = 10) for suitability of the items, and no further modifications were needed.
Setting and participants
The target population in this study was medical students in Phase II of the medical program at a college of medicine in the Gulf region. The MBBS program at this college is five years in duration. Year 1 (Phase I) is a foundation year with emphasis on basic medical sciences and general education courses. Years 2 and 3 (Phase II) consist of integrated medical sciences courses arranged by body system. Problem-based learning (PBL) is the main learning strategy in Phase II of the program, and PBL tutorials are its backbone activity. Years 4 and 5 (Phase III) consist of hospital-based rotations in different core clinical specialties.
The context of the study was the PBL small group tutorials conducted during an integrated system-based course. Small-group PBL tutorials consist of 8 to 10 students who meet twice a week for two hours in each session. The tutorials are led by a PBL tutor who functions mainly as a facilitator of learning rather than providing information. In the first session, students discuss a clinical case which is designed to stimulate rich discussion in the group and students generate a list of learning needs by the end of the session. Students then go into a stage of self-study scaffolded by structured college teaching activities between the first and second session. Students then meet again to present their learning during the week and integrate the information related to the case. Each PBL tutor is assigned to the group throughout the whole semester.
Instruments and sampling
The final form of the study questionnaire consists of 15 items representing emotional engagement (4 items), cognitive engagement (6 items), and behavioral engagement (5 items). The multiple-choice achievement test consisted of 100 items of the A-type (single best response) and covered all contents of the course. Most of the questions are context-rich scenarios which test the application of knowledge rather than simple recall. We used convenience sampling with a targeted population size of 204 year 2 and 3 medical students. The paper-based questionnaire was filled in by 176 students (response rate = 86%) at the end of an organ-system course. Students were asked to rate their overall engagement in PBL tutorials during the course, which lasted 6 to 7 weeks.
The purpose of the study was to collect different lines of evidence supporting the validity of the questionnaire. The data were entered and analyzed using the Statistical Package for Social Sciences (SPSS) version 25.0 and Analysis of Moment Structures (Amos) version 25.0 (Chicago, IBM SPSS). A P-value < 0.05 was considered statistically significant.
Confirmatory factor analysis
Confirmatory factor analysis using maximum likelihood estimation was applied to examine the degree of fit between the measurement model (the observed indicators) and the underlying structural model (the latent factors). Different indices were used to assess the goodness of fit of the model. The Comparative Fit Index (CFI) assesses the overall performance of the model studied over a baseline (independence) model. Conventionally, CFI should be equal to or greater than 0.90 to accept the model. This denotes that 90% of the covariation in the data can be reproduced by the given model. The Chi-Square (χ2) test indicates the degree of fit between implied and observed covariance matrices. An insignificant χ2 or a χ2/df < 2 indicates good fit for the model. The Root Mean Square Error of Approximation (RMSEA) indicates the mean difference between observed and predicted covariance, and a value of 0.08 or less indicates an acceptable model fit. The Standardized Root Mean Square Residual (SRMR) is defined as the mean standardized difference between the observed correlation matrix and the model-implied correlation matrix. A value less than 0.08 is considered a good fit. This measure tends to be smaller as sample size increases and as the number of parameters in the model increases. Finally, often the Akaike Information Criterion (AIC) is computed. The AIC compares all different possible models in terms of appropriate use of all information in the data. Lower AIC values indicate a better fit. In conclusion, a decision on what the best model fit represents always takes these different indicators into account.
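As a concrete illustration of how these conventional cutoffs combine, the following Python sketch (our own illustration, not part of the study's analysis pipeline) checks a set of reported fit indices against each threshold:

```python
# Hypothetical helper: evaluates fit indices against the conventional
# cutoffs described above (chi2/df < 2, CFI >= 0.90, RMSEA <= 0.08,
# SRMR < 0.08). Not taken from the paper; for illustration only.
def acceptable_fit(chi2, df, cfi, rmsea, srmr):
    """Return a dict mapping each fit criterion to True (met) / False."""
    return {
        "chi2/df < 2": chi2 / df < 2,
        "CFI >= 0.90": cfi >= 0.90,
        "RMSEA <= 0.08": rmsea <= 0.08,
        "SRMR < 0.08": srmr < 0.08,
    }

# Example with the 15-item model's indices: every criterion fails.
fit_15 = acceptable_fit(chi2=321.35, df=90, cfi=0.85, rmsea=0.12, srmr=0.26)
print(fit_15)
```

Note that AIC is deliberately omitted here: it has no absolute cutoff and is only meaningful for comparing competing models on the same data.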
We first tested the full 15-item questionnaire data, as envisioned by the focus groups of students and experts, against the three-factor model. The results were: χ2 = 321.35, df = 90, χ2/df = 3.57, CFI = 0.85, RMSEA = 0.12, SRMR = 0.26, and AIC = 381.35. This model clearly did not fit the data. A possible reason was that four items had small loadings or loaded on more than one factor. The item “I feel the time passes quickly during the PBL tutorial” cross-loaded with high regression weights on both cognitive and emotional engagement. The items “I challenge myself in understanding the topics related to the PBL case” and “I pay full attention during the PBL tutorial” loaded on both cognitive and behavioral engagement. The item “I feel bored in PBL tutorials” cross-loaded on all three engagement dimensions. We therefore decided to continue the analysis with the remaining 11 items.
The first model assumed that all these items loaded on a single engagement factor, suggesting that one latent factor was sufficient to explain the data. This model is the simplest possible and therefore, in theory, the most parsimonious. It is, however, not consistent with the original theoretical analyses. The second model hypothesized three independent latent factors. The assumption here is that three latent factors, emotional, cognitive, and behavioral, would explain the data, but that these factors were uncorrelated. The third model likewise assumed that the three factors were uncorrelated; however, since the data were acquired using the same method, it was hypothesized that the data shared common-method variance in the form of a fourth latent factor related to all items. This approach assumes that participants have a biased tendency to respond to all items in a somewhat similar way. Finally, the fourth model allowed the three latent factors to be correlated, assuming that emotion, cognition, and behaviors are (at least) to some extent in harmony.
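To make the competing specifications concrete, the first and fourth models can be written in lavaan-style syntax, as accepted by SEM packages such as R's lavaan or Python's semopy. The item names below (be1 … ce4) are hypothetical placeholders, not the actual questionnaire items:

```python
# Hypothetical lavaan-style model specifications (our illustration, not the
# authors' actual Amos setup). Item names be1..ce4 are placeholders.

# Model 1: a single latent engagement factor explains all 11 items.
model_1 = """
ENG =~ be1 + be2 + be3 + ee1 + ee2 + ee3 + ee4 + ce1 + ce2 + ce3 + ce4
"""

# Model 4: three latent factors, allowed to correlate (~~).
model_4 = """
BE =~ be1 + be2 + be3
EE =~ ee1 + ee2 + ee3 + ee4
CE =~ ce1 + ce2 + ce3 + ce4
BE ~~ EE
BE ~~ CE
EE ~~ CE
"""

# With semopy installed, each model could then be fitted along these lines:
#   from semopy import Model, calc_stats
#   m = Model(model_4)
#   m.fit(data)             # `data` holds one column per item
#   print(calc_stats(m))    # CFI, RMSEA, AIC, ...
```

Model 2 would drop the `~~` covariance lines from Model 4, and Model 3 would additionally add a method factor loading on all eleven items.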
Construct reliability (CR)
Composite (or construct) reliability is a measure of internal consistency in the observed indicators that load on a latent variable (construct). In structural equation modeling, construct reliability is calculated as:

CR = (Σᵢ λᵢ)² / [(Σᵢ λᵢ)² + Σᵢ εᵢ]

where λ (lambda) is the standardized factor loading for item i and ε is the respective error variance for item i. The error variance (ε) is estimated based on the value of the standardized loading (λ) and appears in the Amos output.
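A minimal Python sketch of this computation follows; it takes each item's error variance as 1 − λ², which is the usual assumption for a standardized solution. The loadings shown are illustrative values, not the study's estimates:

```python
def construct_reliability(loadings):
    """Composite reliability: (Σλ)² / [(Σλ)² + Σε], with ε = 1 - λ²
    for standardized factor loadings."""
    lam_sum_sq = sum(loadings) ** 2
    error_sum = sum(1 - lam ** 2 for lam in loadings)
    return lam_sum_sq / (lam_sum_sq + error_sum)

# Illustrative loadings only (not taken from the study's results):
print(round(construct_reliability([0.80, 0.80, 0.80]), 2))  # → 0.84
```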
Correlations with academic achievement
Correlations were computed between the three student engagement factors and their examination scores.
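The correlation used here is the standard Pearson product-moment coefficient, which can be sketched in a few lines of Python. The data in the example are toy values, not the study's:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy data (not the study's): three students' behavioral-engagement scores
# against their exam scores.
print(round(pearson_r([2.0, 3.5, 4.0], [55, 70, 72]), 2))  # → 0.99
```

In practice the factor scores from the SEM solution (or item means per factor) would be correlated with examination scores, e.g. via SPSS or `scipy.stats.pearsonr`.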
Confirmatory factor analysis
The assessment of normality indicated skewness (-0.02 to 3.2) and kurtosis (-1.9 to -0.3) within acceptable ranges. In Table 1, the most important findings are displayed.
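The skewness and kurtosis screening above can be reproduced with a few lines of Python. The formulas below are the common moment-based (population) estimators, which may differ slightly from the bias-corrected versions SPSS reports:

```python
def skewness(xs):
    """Moment-based skewness: m3 / m2**1.5 (0 for a symmetric sample)."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m3 = sum((x - m) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

def excess_kurtosis(xs):
    """Moment-based excess kurtosis: m4 / m2**2 - 3."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / m2 ** 2 - 3

# A symmetric three-point sample: skewness 0, platykurtic.
scores = [1, 2, 3]
print(round(skewness(scores), 2), round(excess_kurtosis(scores), 2))  # → 0.0 -1.5
```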
Scrutinizing the various fit indicators, both Model 3 and Model 4 fit the data well. Since Model 4 is the simpler of the two, and therefore the more parsimonious, we conducted further analyses with it. Figure 1 shows the factor loadings of each item on the three latent factors.
All items loaded significantly on their respective factors, with standardized factor loadings ranging from .67 to .96, with the enjoyment, concentration, and participation items as the highest loading items.
Construct reliability was calculated for each factor. Construct reliability for cognitive engagement was .75, emotional engagement .80, and behavioral engagement .80.
Correlations with academic achievement
There was a weak positive correlation between behavioral engagement and academic scores (r = .20). However, the relationships between emotional or cognitive engagement on the one hand and academic achievement on the other were not significant.
The purpose of the present study was to test the reliability and construct validity of a multidimensional questionnaire measuring engagement of students in problem-based small-group tutorials. The results of the confirmatory factor analysis provided support for the reliability and validity of the 11-item questionnaire. The three-factor structure of the student engagement questionnaire (emotional, cognitive, and behavioral engagement) seems to be useful in guiding future research about student engagement in PBL tutorials. The predictive validity of the instrument, expressed as correlations between the three dimensions and academic achievement, was limited to behavioral engagement.
Cognitive engagement of students in PBL tutorials entails their commitment to the process of learning, extending beyond the mere fulfillment of academic requirements. This includes dedicating effort towards comprehending and mastering the subject matter, becoming so immersed in the PBL tutorial that they lose awareness of their surroundings, focusing completely on the PBL tutorial tasks, and using deep learning strategies. Behavioral engagement in PBL tutorials involves actively taking part in group interactions, fulfilling all the learning activities associated with the PBL case, and punctually attending all tutorial sessions. Emotional engagement relates to the student's affective responses during PBL tutorials, including feelings of enjoyment, interest, pleasure, and a desire for the tutorial to continue longer.
Two previous studies examined the dynamics of student engagement in small-group PBL tutorials [20, 21]. A study by Rotgans and Schmidt was based on cognitive constructivism theory. Based on this perspective, they made three assumptions. First, students engage in theory construction during their encounter with a PBL case and test the theory through self-directed learning. Second, students develop an interest, a 'hunger for knowledge,' when they encounter authentic cases. Third, students feel autonomy and agency through generating their own learning goals. Our findings support the study by Rotgans and Schmidt in that some of the items in the current instrument match the construct of situational cognitive engagement. We refer to the student's feeling of being absorbed in the task and the fact that they put a lot of effort into working with the PBL case. However, the present instrument broadens the cognitive engagement construct to include concentration on the task and the use of deep learning strategies. In addition, the emotional and behavioral engagement constructs had no role in their approach.
A second study used a qualitative approach, conducting video-stimulated recall interviews to examine the dynamics of student engagement in PBL tutorials. This study was explicitly guided by the multidimensional model of student engagement. The investigators found that student engagement is a dynamic and malleable construct that follows a spiral-like pattern: once students were engaged or disengaged in one dimension, other dimensions were likely to follow suit. The degree of student engagement in PBL tutorials can change at any moment in a session depending on group dynamics, the authenticity of the case scenario, or the tutoring skills of faculty. Therefore, it is suggested that methods of measurement should be selected to have the appropriate grain size, sensitive enough to pick up contextual changes in student engagement at the level of the PBL tutorial activity. To measure these contextual changes, measures such as direct observation and experience sampling using short questionnaires should be used. This represents a difficult tradeoff between measuring the multidimensionality of the construct and identifying short-term changes in student engagement during the PBL tutorial activity. The questionnaire tested here represents a compromise. Although the study measured student engagement in PBL tutorials at the course level, the 11-item questionnaire requires around 25 s to fill in, which makes it suitable for identifying dynamic and temporal changes in student engagement at the microlevel.
An analysis of the items that were removed from the original instrument provides insights into the dynamics of the constructs themselves. The item “I feel the time passes quickly during the PBL tutorial” loaded highly on both the cognitive and emotional engagement constructs. When students are emotionally engaged, they enjoy the learning experience and may feel a sense of flow, in which they are fully absorbed in the learning experience and lose track of time. At the same time, if a learner is cognitively engaged, they are actively processing information and making connections, which may make them lose track of time as well. The item “I challenge myself in understanding the topics related to the PBL case” loaded on both cognitive and behavioral engagement. Although the act of challenging oneself is primarily a cognitive engagement process, the challenge to understand a topic may lead to increased behavioral engagement, such as increased participation in class or completion of assignments. Similarly, the item “I pay full attention during the PBL tutorial” loaded on both cognitive and behavioral engagement. When a student pays attention in class, they are actively engaging their cognitive processes to process and understand the information being presented. At the same time, they are demonstrating behavioral engagement by actively participating in the learning process and showing a willingness to learn. Therefore, paying attention in class involves both cognitive and behavioral engagement, as it involves both mental and observable behaviors that reflect a student's engagement in the learning process.
Somewhat disappointingly, the predictive validity of the questionnaire fell short to some extent. A correlation of r = .20 between behavioral engagement and academic achievement was found, whereas the two other dimensions were not correlated with achievement. Another study found a similarly low correlation between engagement and achievement in the context of small-group education. We can only speculate about the reasons for this finding. A well-known phenomenon in forms of education that stress student agency and autonomy in pursuing learning issues is that the achievement test, representing what the teacher saw as the objectives of the course, only partially matches what students actually study. Dolmans and colleagues demonstrated that the topics students spent time and energy on in a problem-based course matched faculty-generated topics on average 64% of the time. In another study, they were able to show that student-generated learning issues not foreseen by the teacher were often nevertheless relevant to the problems at hand. This discrepancy may be one reason that correlations between engagement and achievement in these contexts remain low. Another possible explanation is the lack of full alignment between the expected outcomes of PBL tutorials and the achievement test. Multiple-choice questions are mainly designed to assess the cognitive dimension of learning. However, the learning outcomes of PBL tutorials include other competencies that cannot be measured using MCQs, such as communication skills, interpersonal skills, problem solving, and self-directed learning.
Availability of data and materials
The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.
Green J, Liem GA, Martin AJ, Colmar S, Marsh HW, McInerney D. Academic motivation, self-concept, engagement, and performance in high school: key processes from a longitudinal perspective. J Adolesc. 2012;35(5):1111–22.
Li Y, Lerner RM. Trajectories of school engagement during adolescence: implications for grades, depression, delinquency, and substance use. Dev Psychol. 2011;47(1):233–47.
Lewis AD, Huebner ES, Malone PS, Valois RF. Life satisfaction and student engagement in adolescents. J Youth Adolesc. 2011;40(3):249–62.
Frenzel AC, Goetz T, Lüdtke O, Pekrun R, Sutton RE. Emotional transmission in the classroom: Exploring the relationship between teacher and student enjoyment. J Educ Psychol. 2009;101(3):705–16.
Chang M-L. An appraisal perspective of teacher burnout: examining the emotional work of teachers. Educ Psychol Rev. 2009;21(3):193–218.
WFME. Basic medical education, WFME global standards for quality improvement. 2020. Available from: https://wfme.org/download/bme-standards-2020/http-wfme-org-wp-content-uploads-2017-05-wfme-bme-standards-2020-pdf/.
Harden RM, Roberts TE. ASPIRE: international recognition of excellence in medical education. Lancet. 2015;385(9964):230.
Kassab SE, El-Sayed W, Hamdy H. Student engagement in undergraduate medical education: A scoping review. Med Educ. 2022;56(7):703–15.
Kassab SE, Taylor D, Hamdy H. Student engagement in health professions education: AMEE Guide No. 152. Med Teach. 2023;45(9):949–65.
Fredricks JA, Blumenfeld PC, Paris AH. School engagement: potential of the concept, state of the evidence. Rev Educ Res. 2004;74(1):59–109.
Klemenčič M. From student engagement to student agency: conceptual considerations of european policies on student-centered learning in higher education. High Educ Pol. 2017;30(1):69–85.
Reeve J, Tseng C-M. Agency as a fourth aspect of students’ engagement during learning activities. Contemp Educ Psychol. 2011;36(4):257–67.
GQ F. Sociocultural and civic engagement. Saratoga Springs: Suny Empire State College; 2022. Available from: https://www.esc.edu/global-learning-qualifications-framework/learning-domains/engagement/.
Kahu ER. Framing student engagement in higher education. Stud High Educ. 2013;38(5):758–73.
Kahu ER, Nelson K. Student engagement in the educational interface: understanding the mechanisms of student success. High Educ Res Dev. 2017;37(1):58–71.
Chan SCC, Gondhalekar AR, Choa G, Rashid MA. Adoption of problem-based learning in medical schools in non-western countries: a systematic review. Teach Learn Med. 2022:1–12.
Kelly PA, Haidet P, Schneider V, Searle N, Seidel CL, Richards BF. A comparison of in-class learner engagement across lecture, problem-based learning, and team learning using the STROBE classroom observation tool. Teach Learn Med. 2005;17(2):112–8.
Alimoglu MK, Sarac DB, Alparslan D, Karakas AA, Altintas L. An observation tool for instructor and student behaviors to measure in-class learner engagement: a validation study. Med Educ Online. 2014;19:24037.
O’Malley KJ, Moran BJ, Haidet P, Seidel CL, Schneider V, Morgan RO, et al. Validation of an observation instrument for measuring student engagement in health professions settings. Eval Health Prof. 2003;26(1):86–103.
Rotgans JI, Schmidt HG. Cognitive engagement in the problem-based learning classroom. Adv Health Sci Educ Theory Pract. 2011;16(4):465–79.
Grijpma JW, Mak-van der Vossen M, Kusurkar RA, Meeter M, de la Croix A. Medical student engagement in small-group active learning: A stimulated recall study. Med Educ. 2022;56(4):432–43.
Hu LT, Bentler PM. Cutoff criteria for fit indexes in covariance structure analysis: conventional criteria versus new alternatives. Struct Equ Modeling. 1999;6(1):1–55.
Raykov T. Estimation of composite reliability for congeneric measures. Appl Psychol Meas. 1997;21(2):173–84.
Rotgans JI, Schmidt HG, Rajalingam P, Hao JWY, Canning CA, Ferenczi MA, et al. How cognitive engagement fluctuates during a team-based learning session and how it predicts academic achievement. Adv Health Sci Educ Theory Pract. 2018;23(2):339–51.
Dolmans DHJM, Schmidt HG, Gijselaers WH. The relationship between student-generated learning issues and self-study in problem-based learning. Instr Sci. 1994;22:251–67.
Dolmans DH, Gijselaers WH, Schmidt HG, van der Meer SB. Problem effectiveness in a course using problem-based learning. Acad Med. 1993;68(3):202–13.
Ethics approval and consent to participate
The methods and procedures employed in this study were conducted in accordance with the ethical principles outlined in the Declaration of Helsinki. The research protocol was approved by the Institutional Research Board (IRB) at the College of Medicine, Gulf Medical University (Ref. no. IRB-COM-FAC-21-FEB-2023). Each student signed an informed consent form to participate in the study. An information sheet explained the purpose of the research and its potential benefits, and made clear that participation was voluntary and that students had the right to refuse participation or to withdraw without giving reasons and without any negative consequences. Students were reassured that their individual responses would be kept confidential to the research team and would not be released to the University or to anyone else.
Competing interests
The authors declare no competing interests.
Kassab, S.E., El-Baz, A., Hassan, N. et al. Construct validity of a questionnaire for measuring student engagement in problem-based learning tutorials. BMC Med Educ 23, 844 (2023). https://doi.org/10.1186/s12909-023-04820-1