
Measuring medical students’ reflection on their learning: modification and validation of the motivated strategies for learning questionnaire (MSLQ)



Reflection on learning is an essential component of effective learning. Deconstructing the components of reflection on learning using a self-regulated learning (SRL) framework allows the assessment of students’ ability to reflect on their learning. The aim of this study was to validate an instrument to measure medical students’ reflection on their learning.


A systematic search was conducted to identify the most suitable instrument to measure students’ reflection on their learning based on the theoretical framework of SRL. The search identified the Motivated Strategies for Learning Questionnaire (MSLQ), which contained six subscales: internal goal orientation, self-efficacy, critical thinking, metacognitive/self-regulation, help seeking and peer learning. Using the original MSLQ as the foundation, we carried out three phases of a research program to develop a useful set of items: an expert panel’s review of items, a substantial pilot study, and a factor analysis of ratings of a modified set of items by preclinical and final year medical students.


The factor analysis of the Modified MSLQ extracted four subscales with reasonable internal consistency: self-orientation, critical thinking, self-regulation and feedback-seeking. Each subscale correlates highly with the Modified MSLQ score, with modest inter-correlations between the subscales suggesting that they are measuring different components of the total score.


Medical students and their educators need to be able to monitor their learning in their complex academic and clinical environments. The Modified MSLQ provides a means of investigating and tracking individual medical students’ reflections on their learning.



Learning is an activity in which individuals reflect on past and present experiences in order to develop new understanding [1]. Reflection is a multi-faceted activity in which content knowledge is combined with metacognitive and motivational processes to regulate the learning process [2,3,4]. Boud, Keogh and Walker defined reflection as “a generic term for those intellectual and affective activities in which individuals engage to explore their experiences in order to lead to new understandings and appreciation” [5] (p. 19). Quirk succinctly identified reflective learning as learning “from doing, before, during, or after the event” [6] (p. 29). This style of learning is encouraged in higher education as involving critical inquiry, self-reflection, dialogue and cooperation [7].

Reflective learning is specifically applicable to the contexts of medical education, according to Sandars, because it involves self-regulated learning (SRL) activities [2]. For clinical learning, reflection on learning experiences is essential, due to the many unstructured learning activities encountered and the variability and complexity of clinical cases. Medical students need to be able to review, monitor and regulate their own learning processes and to engage in life-long learning to reflect the real-life complexity of integrating knowledge into clinical competence. Since individuals often have difficulty identifying their own limitations when reflecting on their learning [8], being able to access and use feedback from other people is a crucial component [9, 10]. Medical educators, therefore, need to be able to encourage their students to engage in reflective learning, and consequently need appropriate measures of students’ natural and educated self-regulated learning. The aims of this research were to examine the appropriateness of a set of measures of reflective learning and to modify a suitable instrument for measuring medical students’ reflection on their self-regulated learning.

Reflective learning, however, is not a unidimensional concept, but has a number of components that need to be incorporated into useful measures. Self-regulated learners reflect on the metacognitive, motivational, and behavioural dimensions of their engagement in learning situations, including on feedback given or sought [2,3,4, 9]. For example, a qualitative study by Cleary and Sandars [11] demonstrated that the more successful students applied self-regulatory approaches when learning a venipuncture procedure, while less successful students tended to focus on the final desired outcome without paying attention to the strategies needed to achieve it. Cleary and Sandars examined students’ self-regulatory processes with a list of questions about their cognition, metacognition, and self-efficacy. Their findings, and supportive studies by Sandars [2], suggest that breaking down reflective learning into components will enable medical educators to identify strengths and deficiencies in individual students’ reflection on their learning. Higher education researchers have developed self-regulated learning frameworks and measures that are useful for university samples, for example, the Study Process Questionnaire (SPQ) [12] and the Metacognitive Awareness Inventory (MAI) [13]. Medical students, however, are highly motivated and academically competent: in Emilia, Bloomfield and Rotem’s study [14] using Biggs’ SPQ, most medical students were assessed as performing at optimal levels. More fine-grained and clinically aware instruments are needed.

Adopting a validated instrument that assesses self-regulated learning components in other domains is an appropriate starting place for examining the reflection process of medical students.


Choice of instrument

Systematic search and review of identified questionnaires

A systematic search was conducted to identify instruments suitable to measure the reflection of medical students on their learning. There is no specific database for medical education research and therefore PubMed and ERIC were used for the search. The search terms or keywords used in each database included self-regulated learning, reflection, questionnaire, instrument and medical or higher education. Figures 1 and 2 depict the flow of the inclusion and exclusion process, along with the number of relevant/irrelevant articles, for each stage of screening.

Fig. 1

Flowchart of the inclusion/exclusion process for articles retrieved from PubMed

Fig. 2

Flowchart of the inclusion/exclusion process for articles retrieved from ERIC

Inclusion criteria were articles written in English that focused on measuring students’ reflection on learning in medical or higher education using an instrument, scale or questionnaire. An article was excluded if it was written in a language other than English, focused on teachers’ reflection, or assessed reflection on learning with measures other than an instrument, scale or questionnaire.

A total of 21 questionnaires were reviewed to determine whether their domains included the critical domains of cognition, metacognition, motivation, self-efficacy and feedback seeking.

Review of chosen instrument

Based on the review of the identified questionnaires, the Motivated Strategies for Learning Questionnaire (MSLQ) was judged the most appropriate instrument to measure reflective learning, as it treats reflective learning as a self-regulated learning activity and includes items assessing the cognitive, metacognitive, motivational and emotional aspects of the learning process. The MSLQ [15, 16] was developed for students in tertiary education, regardless of discipline, to examine their motivation for learning and their learning strategies. In addition, the MSLQ acknowledges the influence of external sources, such as feedback, on reflection; its general higher-education origins make it adaptable for modification and use in a specific educational setting, medical education.

The MSLQ is divided into two scales, motivation (31 items) and learning strategies (50 items), scored on a 7-point Likert scale (from 1 = not at all true of me to 7 = very true of me). Applications of the MSLQ in general higher education courses demonstrated acceptable internal consistency, represented by Cronbach alpha values (e.g. [17, 18]) ranging from .41 to .78 for the learning strategies scale and from .50 to .93 for the motivation scale. To the best of our knowledge, there are some studies of the MSLQ in the medical education context (e.g. [19,20,21,22,23,24,25,26,27,28]). Most studies correlated some or all components of the MSLQ with certain criteria of academic performance.

A comparison between the MSLQ and the reflective learning construct resulted in 36 items from six subscales of the original MSLQ (internal goal orientation, self-efficacy, critical thinking, metacognitive/self-regulation, help seeking and peer learning), which were considered the most appropriate for measuring reflection. These subscales were selected because together they constitute the reflective learning construct. All items in each of the six original subscales were included.

Minor revisions on the wording and terminology were made to the items in the chosen subsets of the MSLQ (Table 1), in order to increase its suitability for use in the Australian medical education context, e.g., replacing the word “instructor” with “tutor”.

Table 1 Modifications of MSLQ selected items

Using the 36-item MSLQ as the foundation, we carried out three phases of a research program to develop a useful set of items that would assist students and medical educators to measure students’ reflective learning in its different dimensions: an expert panel’s review of items, a substantial pilot study, and a factor analysis of ratings of a modified set of items by preclinical and final year medical students.

Expert panel review

The 36-item MSLQ was submitted to an expert review process. A panel of eight experts comprised medical practitioners with expertise in medical education and educational psychologists with expertise in questionnaire construction. They were asked to critically appraise the questionnaire and provide comments on potential sources of error and bias, and the suitability of the questionnaire for investigating students’ reflection on their learning. The experts rated the relevance of each item on a 4-point rating scale (1 = not relevant; 2 = unable to assess relevance without item revision; 3 = relevant but needs minor alteration; 4 = very relevant and succinct) [29, 30]. They also were invited to provide comments, point out potential sources of error, and re-phrase or reword items.

The content validity index (CVI) for each item and also for the entire questionnaire was then calculated. The CVI for each item is the proportion of experts who rate that particular item as content valid (a rating of 3 or 4), whereas the CVI for the whole questionnaire is the proportion of total items judged to be content valid [29, 30].
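As an illustration of the CVI arithmetic just described, the item- and questionnaire-level indices can be computed as follows (the ratings below are hypothetical, not the panel's actual data):

```python
# Sketch of the CVI calculation: item CVI is the proportion of experts
# rating the item 3 or 4; scale CVI is the proportion of items whose
# item-level CVI exceeds the recommended threshold.
# Ratings are hypothetical (8 experts, 1-4 relevance scale).

def item_cvi(ratings):
    """Proportion of experts rating an item as content valid (3 or 4)."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

def scale_cvi(all_ratings, threshold=0.75):
    """Proportion of items whose item-level CVI exceeds the threshold."""
    cvis = [item_cvi(r) for r in all_ratings]
    return sum(1 for c in cvis if c > threshold) / len(cvis)

# Example: 3 items, each rated by 8 experts
ratings = [
    [4, 4, 3, 4, 3, 4, 4, 3],  # item CVI = 1.00 -> retained
    [4, 3, 3, 4, 4, 3, 3, 4],  # item CVI = 1.00 -> retained
    [2, 1, 3, 2, 2, 1, 4, 2],  # item CVI = 0.25 -> flagged for revision/deletion
]
print([item_cvi(r) for r in ratings])  # [1.0, 1.0, 0.25]
```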

There were 28 items (of 36 items in total) with CVI above the recommended value (> .75 [29]). Experts’ comments were taken into consideration to improve the relevance and quality of each item. The three authors conferred to make judgements about modifications and whether to discard any items [31]. Four items had low CVI, but only one item, judged ambiguous and confusing to rate, was deleted (item 18, “Considering the difficulty of this course, the teacher, and my skills, I think I will do well in this course”). The other three items were retained with revisions; for example, the phrase “an excellent job” was replaced with a local idiom, “well”, in item 13, “I’m confident I can do an excellent job on the assignment and tests in this course”.

Ethics approval was obtained from the University Human Research Ethics Committee to conduct pilot and factor analytic validation studies using the 35 items in a Modified MSLQ. Permission was given for students to provide anonymous consent by completing and handing in a questionnaire.

Pilot study

Participants were 70 medical students in the third preclinical year of a six-year degree program at a large Australian medical school, with a 95% response rate. They completed the modified MSLQ and commented on the wording, understandability, ambiguity, relevance and usefulness of each item, and suggested rewording.

Factor analysis of the items of the modified MSLQ

The modified MSLQ was completed by two groups of medical students from a large Australian university: 306 first year (preclinical) students (95%) from the Doctor of Medicine (MD) program; and 248 final year students (91%) from the Bachelor of Medicine, Bachelor of Surgery (MBBS) program. Mean ages were: MD, 22.68 years (SD = 2.4, range 20–38); MBBS, 25.21 years (SD = 2.63, range 22–40). There were comparable numbers of male and female students: MD, 45% male, 51% female; MBBS, 43% male, 48% female.

Analyses involved a factor analysis and calculation of internal consistency (alpha) coefficients. The factor analysis tested whether there were concordances between the subscales that emerged from this analysis and the original subscales developed by Pintrich et al. [16]. Internal consistency of subscales was calculated with Cronbach’s alpha and Guttman Lambda coefficients.
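Cronbach's alpha, the internal consistency statistic used throughout this study, follows a standard formula; a minimal sketch with made-up 7-point Likert responses (not the study's data) is:

```python
# Cronbach's alpha for a subscale: rows = respondents, columns = items.
# alpha = (k / (k - 1)) * (1 - sum(item variances) / variance(total score))
import numpy as np

def cronbach_alpha(scores):
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings from 5 students on a 4-item subscale
data = [
    [6, 5, 6, 7],
    [4, 4, 5, 4],
    [7, 6, 6, 7],
    [3, 4, 3, 4],
    [5, 5, 6, 5],
]
alpha = cronbach_alpha(data)  # about 0.94 for this strongly inter-correlated set
```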


Pilot study

Internal consistency coefficients for the 6 subscales (.410–.838) compared reasonably well with those of the original MSLQ [15]. Pilot participants’ comments indicated that four items from the self-regulation subscale (items 8, 25, 26, 27) were potentially ambiguous. Most of those items were critical for understanding how students reflect on their learning. Consequently, only one item, item 8, was omitted (“I often find that I have been studying in this course but don’t really know what it is all about”), because it did not give insight into how students learn. Omitting item 8 reduced the alpha coefficient of the metacognition/self-regulation subscale by .01 (from .74 to .73).

Most items in the questionnaire were considered relevant and useful by the medical students in the pilot study. Students’ suggestions for improving or deleting items produced 34 items that were suitable for a factor analytic validation study.

Factor analysis of the items of the modified MSLQ

Preliminary analyses revealed that four subscales reasonably reflected the subscales of the original MSLQ. Table 2 shows the internal consistency coefficients for the 6 subscales with their original MSLQ labels. The internal goal orientation and help seeking subscales had poor internal consistency coefficients for both groups, consistent with the pilot study and the original study by Pintrich et al. [15].

Table 2 34-item modified MSLQ subscales, items distribution and reliability coefficient for each subscale

All 34 items were submitted to factor analysis without making assumptions about subscales. The correlation matrix was suitable for factor analysis. We used principal component analysis (PCA) with oblique (direct oblimin) rotation (IBM SPSS version 19), combining data from the MD and MBBS groups on the basis of correlations of demographic characteristics and background learning experiences.

Ten components had eigenvalues greater than one (Kaiser’s criterion), and explained 58.42% of the variance. Inspection of the scree plot demonstrated the point of inflexion after 4 components, and six components accounting for less than 5% of the variance each were below the elbow of the scree plot. Consequently, four factors were extracted and explained 43.45% of the variance, with 42% of non-redundant residuals with absolute values greater than .05. The pattern matrix is shown in Table 3.
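The extraction decisions reported here (Kaiser's criterion and variance explained from the eigenvalues of the item correlation matrix) can be sketched on synthetic data; the latent/loading setup below is purely illustrative, not the study's response matrix:

```python
# Sketch of component extraction: compute eigenvalues of the item
# correlation matrix, count those above 1 (Kaiser's criterion), and
# report the variance they explain. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
# Synthetic ratings: 200 respondents x 10 items sharing 2 latent dimensions
latent = rng.normal(size=(200, 2))
loadings = rng.normal(size=(2, 10))
X = latent @ loadings + rng.normal(scale=0.8, size=(200, 10))

R = np.corrcoef(X, rowvar=False)       # 10 x 10 item correlation matrix
eigvals = np.linalg.eigvalsh(R)[::-1]  # eigenvalues, descending

n_kaiser = int(np.sum(eigvals > 1))    # components with eigenvalue > 1
var_explained = eigvals[:n_kaiser].sum() / eigvals.sum()
print(n_kaiser, round(float(var_explained), 3))
```

The eigenvalues of a correlation matrix sum to the number of items, which is why each eigenvalue divided by that sum reads directly as a proportion of variance explained. Oblique (direct oblimin) rotation of the retained components, as used in the study, would require a factor-analysis package rather than this bare numpy sketch.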

Table 3 Summary of principal component analysis with direct oblimin rotation for the 34-item modified MSLQ on combined MD and MBBS groups (n = 554)

The final four factors yielded the four subscales of a Modified MSLQ that are shown in Table 4, with their contributing items and internal consistency coefficient.

Table 4 Subscales and items of 32-item modified MSLQ following factor analysis of the MD and MBBS student group results (n = 554)

Two of the four subscales of the Modified MSLQ combined two subscales of the original MSLQ. The Modified MSLQ self-orientation subscale included the original MSLQ self-efficacy subscale and two items, relating to how students perceived themselves, from the original internal goal orientation subscale. The feedback seeking subscale consisted of items from the MSLQ help seeking and peer learning subscales that related to how students seek and incorporate feedback to monitor their learning. The critical thinking subscale added two items from the MSLQ self-regulation subscale that related to how students apply critical analysis in their learning. Inspection of Table 4 in relation to Table 2 shows stronger internal consistency for three new subscales (self-orientation, feedback seeking and critical thinking), with the new self-regulation subscale within an acceptable range.

Table 5 shows the matrix of inter-correlations of 554 students’ scores on the four subscales and an overall Modified MSLQ score. Each subscale correlates highly with the Modified MSLQ score, and the modest inter-correlations between the subscales suggest they are measuring different components of the total score.
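The pattern described here, high subscale-total correlations alongside modest inter-correlations, arises whenever subscales share a common component and each also contributes unique variance to the total; a sketch on hypothetical (not the study's) subscale scores:

```python
# Illustration of the reported correlation pattern: each subscale
# correlates highly with the total it contributes to, while the
# subscales correlate only modestly with one another.
import numpy as np

rng = np.random.default_rng(1)
n = 554  # matches the study's combined sample size
# Hypothetical standardized subscale scores sharing a mild common component
base = rng.normal(size=(n, 1))
subscales = 0.5 * base + rng.normal(size=(n, 4))
total = subscales.sum(axis=1)

for i in range(4):
    r_total = np.corrcoef(subscales[:, i], total)[0, 1]
    print(f"subscale {i + 1} vs total: r = {r_total:.2f}")  # substantial
r_inter = np.corrcoef(subscales[:, 0], subscales[:, 1])[0, 1]
print(f"subscale 1 vs subscale 2: r = {r_inter:.2f}")       # modest
```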

Table 5 Inter-correlations of Modified MSLQ scores and Four Subscales, for 554 Medical Students

A factor analysis conducted with a sample of 554 medical students yielded a four-component solution, of which two components were combinations of two original MSLQ subscales. No completely new factors emerged over and above the original MSLQ subscales [15, 16]. Internal consistency was acceptable for the four Modified MSLQ subscales.


Our aims were to develop a questionnaire that would be useful for interrogating the reflective learning of medical students. A systematic search of 401 journal articles pointed to Pintrich’s MSLQ [15] as the most appropriate questionnaire to modify for measuring the reflective learning of medical students. The MSLQ has been used extensively in higher education and in medical education studies. It incorporates major components of reflective learning, namely, cognition, metacognition, motivation, self-efficacy and feedback seeking; and it has shown reasonable levels of internal consistency across several studies.

Using the 36-item MSLQ as the basis, we carried out three phases of research to develop a set of items specifically useful in medical education. Following modifications suggested by an expert panel of medical educators, a pilot sample of 70 pre-clinical medical students rated the items and suggested further modifications. In the main study, the ratings of 34 items of a Modified MSLQ were subjected to a comprehensive factor analysis that yielded a four-component solution. These components were used to construct four subscales that reflected the dimensions of reflective learning [2,3,4, 9].

Two subscales were the same as original MSLQ subscales and two incorporated items from across two original subscales. The four modified subscales, with acceptable internal consistency coefficients, indicate individual students’ ratings of their self-orientation, critical thinking, self-regulation of their learning, and use of feedback. The subscales inter-correlate modestly with each other and highly with a total Modified MSLQ score, indicating their separate contributions to description of a student’s reflective learning.

The modified MSLQ can serve as a measure of medical students’ reflection on their learning, since it can provide teachers with indications of whether the students have appropriate motivation to initiate reflection and whether they have enough confidence, since the level of confidence influences their reflection on their learning [10, 32, 33]. It can also be used to examine whether they use the metacognitive skills to regulate and reflect on their learning and whether they seek and incorporate external feedback to inform their reflection [10, 32, 34, 35].

The self-orientation component deals with students’ perceptions of their self-efficacy and internal motivation. Both self-efficacy and internal motivation affect how students reflect on their learning [36,37,38,39]. Students with low self-efficacy perceive themselves to be incompetent in a particular task, and this perception of incompetence is likely to hinder their ability to perform the task and to reflect on it. Similarly, students with low internal motivation may regard reflection as unnecessary, since their focus is only on grades and examinations.

Critical thinking is required for a student to be able to reflect on their learning. Within a learning process, or after experiencing a learning event, a student needs to analyse that particular learning process in an effort to understand more about the learning, which leads to reflection on learning [6, 9]. The third component is self-regulation, which is highly interrelated with critical thinking. Self-regulation involves awareness of a learning process and how to regulate the learning through planning and monitoring in order to achieve the intended goals [40,41,42,43]. Students with higher critical thinking ability are likely to produce a more critical analysis of the learning process, and this will lead to better self-regulation.

The last component is feedback-seeking behaviour. Reflection cannot be an isolated, individual activity, since the results of self-assessment tend to be inaccurate [8, 32, 35, 44,45,46]. The reflection process involves incorporating external data, such as feedback, to inform the reflection [6, 9, 34, 47, 48]. A student with better feedback-seeking behaviour is likely to reflect more accurately on their learning, because the student continuously looks for feedback to refine and improve the reflection.

Generalizability of the results of the present study may be limited, since the sample was restricted to students of one university in Australia. However, the comprehensiveness of the analyses and the multiple phases of the current study provide a basis for further validation and use of this instrument. While there may be a legitimate argument against using an empirical approach to a reflective process, we have focused on how the instrument’s items express the scope and dimensions of the reflective concept. Further validation studies are now warranted, specifically to examine the relation of the instrument and its subscales to student performance and to other measures of how students manage their learning in their medical courses.


Medical students and their educators need to be able to monitor their learning in their complex academic and clinical environments. The Modified MSLQ provides a means of investigating and tracking individual medical students’ reflections on their learning.


1. Abbott J. Learning makes sense: re-creating education for a changing future. Letchworth: Education 2000; 1994.
2. Sandars J. The use of reflection in medical education: AMEE guide no. 44. Medical Teacher. 2009;31(8):685–95.
3. Ertmer PA, Newby TJ. The expert learner: strategic, self-regulated and reflective. Instr Sci. 1996;24(1):1–24.
4. Zimmerman BJ. Models of self-regulated learning and academic achievement. In: Zimmerman BJ, Schunk DH, editors. Self-regulated learning and academic achievement: theory, research, and practice. New York: Springer-Verlag; 1989. p. 1–26.
5. Thompson S, Thompson N. The critically reflective practitioner. 1st ed. New York: Palgrave Macmillan; 2008.
6. Boud D, Keogh R, Walker D. Reflection: turning experience into learning. London: Kogan Page; 1985.
7. Quirk M. Intuition and metacognition in medical education: keys to developing expertise. New York: Springer Publishing Company; 2006.
8. Barnett R. The idea of higher education. Buckingham: Open University Press; 1990.
9. Kruger J, Dunning D. Unskilled and unaware of it: how difficulties in recognizing one's own incompetence lead to inflated self-assessments. J Pers Soc Psychol. 1999;77(6):1121–34.
10. Dornan T. Self-assessment in CPD: lessons from the UK undergraduate and postgraduate education domains. J Contin Educ Health Prof. 2008;28(1):32–7.
11. Cleary TJ, Sandars J. Assessing self-regulatory processes during clinical skill performance: a pilot study. Medical Teacher. 2011;33:e368–74.
12. Biggs JB. Student approaches to learning and studying. Hawthorn, Victoria: Australian Council for Educational Research; 1987.
13. Schraw G, Dennison RS. Assessing metacognitive awareness. Contemp Educ Psychol. 1994;19:460–75.
14. Emilia O, Bloomfield L, Rotem A. Measuring students’ approaches to learning in different clinical rotations. BMC Medical Education. 2012;12:114.
15. Pintrich PR, Smith D, Garcia T, McKeachie W. A manual for the use of the Motivated Strategies for Learning Questionnaire (MSLQ). Ann Arbor, MI: The University of Michigan; 1991.
16. Pintrich PR, Smith D, Garcia T, McKeachie W. Reliability and predictive validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educ Psychol Meas. 1993;53(3):801–13.
17. Duncan TG, McKeachie WJ. The making of the Motivated Strategies for Learning Questionnaire. Educ Psychol. 2005;40:117–28.
18. Kosnin AM. Self-regulated learning and academic achievement in Malaysian undergraduates. Int Educ J. 2007;8:221–8.
19. Cook DA, Thompson WG, Thomas KG. The Motivated Strategies for Learning Questionnaire: score validity among medicine residents. Med Educ. 2011;45(12):1230–40.
20. Sandars J. Pause 2 learn: developing self-regulated learning. Med Educ. 2010;44:1117–8.
21. Salamonson Y, Everett B, Koch J, Wilson I, Davidson PM. Learning strategies of first year nursing and medical students: a comparative study. Int J Nurs Stud. 2009;46(12):1541–7.
22. Bodkyn C, Stevens F. Self-directed learning, intrinsic motivation and student performance. Caribbean Teaching Scholar. 2015;5(2):79–93.
23. Hamid S, Singaram VS. Motivated strategies for learning and their association with academic performance of a diverse group of 1st-year medical students. African Journal of Health Professions Education. 2016;8(1 Suppl 1):104–7.
24. Kassab SE, Al-Shafei AI, Salem AH, Otoom S. Relationships between the quality of blended learning experience, self-regulated learning, and academic achievement of medical students: a path analysis. Advances in Medical Education and Practice. 2016;6:27–34.
25. Kim K-J, Jang HW. Changes in medical students’ motivation and self-regulated learning: a preliminary study. Int J Med Educ. 2015;6:213–5.
26. Stegers-Jager KM, Schotanus J, Themmen APN. Motivation, learning strategies, participation and medical school performance. Med Educ. 2012;46(7):678–88.
27. Van Nguyen H, Laohasiriwong W, Saengsuwan J, Thinkhamrop B, Wright P. The relationships between the use of self-regulated learning strategies and depression among medical students: an accelerated prospective cohort study. Psychology, Health & Medicine. 2015;20(1):59–70.
28. Turan S, Konan A. Self-regulated learning strategies used in surgical clerkship and the relationship with clinical achievement. Journal of Surgical Education. 2012;69(2):218–25.
29. Lynn MR. Determination and quantification of content validity. Nurs Res. 1986;35(6):382–5.
30. Waltz CW, Bausell RB. Nursing research: design, statistics and computer analysis. Philadelphia: F.A. Davis Co.; 1981.
31. DeVellis RF. Scale development: theory and applications. 2nd ed. Thousand Oaks, CA: Sage; 2003.
32. Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med. 2005;80(10):S46–54.
33. Pintrich PR. A conceptual framework for assessing motivation and self-regulated learning in college students. Educ Psychol Rev. 2004;16(4):385–407.
34. Kember D. Triggers for reflection. In: Kember D, editor. Reflective teaching & learning in the health professions: action research in professional education. Oxford: Blackwell Science Ltd.; 2001. p. 152–66.
35. Sargeant J, Mann K, Van der Vleuten CPM, Metsemakers J. “Directed” self-assessment: practice and feedback within a social context. J Contin Educ Health Prof. 2008;28(1):47–54.
36. Pintrich PR. The role of goal orientation in self-regulated learning. In: Boekaerts M, Pintrich PR, Zeidner M, editors. Handbook of self-regulation. San Diego: Academic Press; 2000. p. 451–502.
37. Volet S, Mansfield C. Group work at university: significance of personal goals in the regulation strategies of students with positive and negative appraisals. High Educ Res Dev. 2006;25:341–56.
38. Zimmerman BJ. Self-regulation involves more than metacognition: a social cognitive perspective. Educ Psychol. 1995;30(4):217–21.
39. Zimmerman BJ. Attaining self-regulation: a social cognitive perspective. In: Boekaerts M, Pintrich PR, Zeidner M, editors. Handbook of self-regulation. San Diego: Academic Press; 2000. p. 13–39.
40. Brown A. Metacognition, executive control, self-regulation, and other more mysterious mechanisms. In: Kluwe R, Weinert FE, editors. Metacognition, motivation, and understanding. Hillsdale, NJ: L. Erlbaum Associates; 1987. p. 65–116.
41. Flavell JH. Metacognitive aspects of problem solving. In: Resnick LB, editor. The nature of intelligence. Hillsdale, NJ: Lawrence Erlbaum Associates; 1976. p. 231–5.
42. Ibabe I, Jauregizar J. Online self-assessment with feedback and metacognitive knowledge. High Educ. 2010;59(2):243–58.
43. Zeidner M, Boekaerts M, Pintrich PR. Self-regulation: directions and challenges for future practice. In: Boekaerts M, Pintrich PR, Zeidner M, editors. Handbook of self-regulation. San Diego: Academic Press; 2000. p. 749–68.
44. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296(9):1094–102.
45. Gordon MJ. A review of the validity and accuracy of self-assessments in health professions training. Acad Med. 1991;66(12):762–9.
46. Silver I, Campbell C, Marlow B, Sargeant J. Self-assessment and continuing professional development: the Canadian perspective. J Contin Educ Health Prof. 2008;28(1):25–31.
47. Durning SJ, Cleary TJ, Sandars J, Hemmer PA, Kokotailo P, Artino AR. Viewing "strugglers" through a different lens: how a self-regulated learning perspective can help medical educators with assessment and remediation. Acad Med. 2011;86:488–95.
48. Volet S, Vauras M, Salonen P. Self- and social regulation in learning contexts: an integrative perspective. Educ Psychol. 2009;44(4):215–26.


Availability of data and materials

The data supporting the findings in this study can be obtained from the corresponding author.

Author information




All authors were involved in the study design. DS collected and analysed the data and drafted the manuscript. GM and AD contributed to data analysis and review of the manuscript. All authors have read and approved the manuscript in its current format.

Corresponding author

Correspondence to Diantha Soemantri.

Ethics declarations

Authors’ information

Diantha Soemantri, MD, MMedEd, PhD (DS) is a senior lecturer in medical education at the Faculty of Medicine Universitas Indonesia. Her main research interest is on students learning and assessment, specifically related to feedback and reflection.

Geoff Mccoll, BMedSc, MBBS, MEd, PhD (GM) is currently the Executive Dean, Faculty of Medicine University of Queensland. Previously he was the Head of the Melbourne Medical School and Professor of Medical Education and Training at the University of Melbourne. He is currently the Chair of the Australian Medical Council’s Medical School Assessment Committee.

Agnes Dodds, BA (hons), M.A. (AD) is an Associate Professor in the Department of Medical Education in the Melbourne Medical School. Her research interests are in evaluation and young adult development, particularly of high achieving students in professional courses.

Ethical approval and consent to participate

The study has been approved by the University of Melbourne Human Research Ethics Committee. All participants provided their consent to participate in this study by completing and handing in the questionnaires. Information on the voluntary nature of their participation and confidentiality of the data was provided beforehand.

Consent for publication

The consent provided by the study participants includes the permission to use the data for presentation and publication of the study.

Competing interests

The authors report no competing financial or non-financial interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.

Reprints and Permissions

About this article


Cite this article

Soemantri, D., Mccoll, G. & Dodds, A. Measuring medical students’ reflection on their learning: modification and validation of the motivated strategies for learning questionnaire (MSLQ). BMC Med Educ 18, 274 (2018).



  • Medical students
  • Reflection on learning
  • MSLQ
  • Instrument