
Monitoring progression of clinical reasoning skills during health sciences education using the case method – a qualitative observational study



Outcome- or competency-based education is well established in medical and health sciences education. Curricula are based on courses where students develop their competences, and assessment is also usually course-based. Clinical reasoning is an important competence, and the aim of this study was to monitor and describe students’ progression in professional clinical reasoning skills during health sciences education using observations of group discussions following the case method.


In this qualitative study students from three different health education programmes were observed while discussing clinical cases in a modified Harvard case method session. A rubric with four dimensions – problem-solving process, disciplinary knowledge, character of discussion and communication – was used as an observational tool to identify clinical reasoning. A deductive content analysis was performed.


The results revealed the students’ transition over time from reasoning based strictly on theoretical knowledge to reasoning ability characterized by clinical considerations and experiences. Students who were approaching the end of their education immediately identified the most important problem and then focused on this in their discussion. Practice knowledge increased over time, which was seen as progression in the use of professional language, concepts, terms and the use of prior clinical experience. The character of the discussion evolved from theoretical considerations early in the education to clinical reasoning in later years. Communication within the groups was supportive and conducted with a professional tone.


Our observations revealed progression in several aspects of students’ clinical reasoning skills on a group level in their discussions of clinical cases. We suggest that the case method can be a useful tool in assessing quality in health sciences education.



Outcome-based, or competency-based, education has been an emerging trend in medical and health sciences education for decades, presented as a better alternative to older content- and time-based curricula [1,2,3]. In Europe, the Bologna agreement of 1999 [4] started a movement towards increased comparability in higher education. Frameworks for qualifications as learning outcomes formed an important part of the process [5], as did standardisation of credits. Assessment of competencies constitutes an important part of such curricula, both for summative assessment of individuals and for feedback to teachers about the quality of the curriculum. For both these purposes, it is important to assess students’ development towards their future profession during the course of study [6]. Both students and teachers require frequent information on how the students are proceeding towards specified outcomes/competencies. Progress tests are commonly used [7,8,9,10] and are usually multiple-choice tests given simultaneously to all cohorts in the education programme. In engineering education, students’ professional skills have been monitored through observation of teams while solving problems [11]. To our knowledge, progression in clinical reasoning skills in health sciences has not previously been evaluated using the case method. Inspired by the study by Wahlgren and Ahlberg [11], we undertook to use clinical cases to monitor progression and to evaluate, using standardized criteria, how students in health professions develop professional clinical reasoning skills. If progression can be identified, this method may be used to assess quality in health sciences education.

An important competence in medical and health sciences education is professional problem solving, or clinical reasoning. Experts use varying approaches in clinical reasoning: analytical, non-analytical, or both [12]. Script concordance testing has been used to assess clinical reasoning [13,14,15] and to trace students’ progression in the development of clinical reasoning [16]. Students need training in clinical reasoning [12], and verbalisation of reasoning processes forms an important part of such training [17]. Authentic clinical cases are often used for learning clinical reasoning, and methods exist for structuring such discussions, for example problem-based learning [18, 19]. The case method, originally developed at Harvard Business School [20], has been adapted to medical and health sciences education and used for student-active learning [21,22,23,24]. The students are involved in discussing a case using a structure that closely resembles the clinical reasoning process [17].

The aim of this study was to monitor and describe students’ progression in professional clinical reasoning skills during health sciences education using observations of group discussions following the case method.


Study design

This study had an observational design using a qualitative method of analysis. A modified Harvard case method [23] was used as a tool for teaching and learning and rubrics [25] as a tool for identifying clinical reasoning skills.

Context and setting of the study

In Sweden, outcomes for higher education are specified in the Higher Education Ordinance [26]. Swedish higher education is modular; the students pass through a series of courses, each requiring separate assessment, with no final mandatory graduation examination. The faculty of medicine at Lund University has several health education programmes, including occupational therapy (OT), speech-language therapy (SLT) and midwifery (MW). These programmes took part in this study after an invitation to teachers who applied the case method in their teaching. The OT programme is a three-year undergraduate programme, while the SLT programme has undergraduate intake leading to a degree at advanced level after four years. The MW programme is 1.5 years at advanced level, with a nurse qualification (3 years) as an entry requirement. All three programmes include clinical placements.


The study was carried out in 2014–2015. A modified Harvard case method [23] was used to monitor progression across years in the professional programmes that took part in the study. The students were required to discuss and proceed through six steps, following this structure, cf. Levett-Jones [17]:

  1. Identification of the problem that a professional faces

  2. Identification of relevant facts

  3. Discussion of what could happen if the problem is left unattended

  4. Suggestion of actions by the professional to solve the problem

  5. Analysis of potential results of suggested actions

  6. Evaluation of actions

If relevant, students should apply a holistic perspective, including perspectives from other professions as well as ethical, legislative and financial aspects, when discussing the case. This process resembles the strategies suggested by Klein as important aspects of expertise for decision-making skills [27].

The discussion started when the students were presented with a case written in a narrative style from the perspective of a professional in the students’ future profession. This professional faces a challenge concerning a patient or a client. The cases were authentic, taken from real-life experience, and open-ended in the sense that differing potential solutions should be possible [24]. The cases differed in detail and length depending on the clinical situation. For example, the case used by the midwifery students described a scenario during the course of a day, whereas the case in occupational therapy covered a situation that developed over a year or longer. The teacher in the case method had the role of a facilitator who guided the students through the structure (see above). Ten to 24 students took part in each case method session (Table 1). They worked in small groups (4–5 students) in the same room, with intermittent discussions in the large group. Facilitation by the teacher took place only in the large group. A whiteboard was used to document work in the large group, whereas the small groups documented their work in notes. The whiteboard was sectioned, and headlines (the six steps presented above) were used to guide the students’ discussion [22]. The whole session took 1–2 h depending on the complexity of the case.

Table 1 Numbers of participating students in cohorts, number and qualifications of observers and duration of observations in this study

In this study, it was a prerequisite that the same case was used in all student cohorts within each professional programme. Special comprehensive cases were developed for the study. Since the three professions differ in area of responsibility, the three cases were each prepared to capture profession-specific clinical reasoning skills. The OT case concerned a client-centred, occupation-based intervention, including the individual, family, community and organizations. The case had high complexity in order to challenge students throughout the programme. The SLT case addressed speech and language impairment in the school years, requiring knowledge of aetiology and diagnostics, as well as long-term educational consequences, professional delineations, and inter-professional collaboration. The MW case concerned the complexity of progress during normal labour, a central problem for the midwifery profession that might be met by any student or recently graduated midwife.


Students in all years of the three programmes were invited to take part in the study and informed that the observations concerned the groups, not individuals. In the OT programme, students from all three years were invited to participate voluntarily in the case discussions after a compulsory seminar. In the SLT programme, four cohorts of students were invited to participate, and in the MW programme, two cohorts representing the first and third semesters. In years 1–3 of the SLT programme and in the MW programme, the case method sessions were scheduled in the regular courses, since they were considered to also be learning opportunities for the students. Students who did not wish to participate could be given the case for self-study, but all invited students agreed to participate. In year four of the SLT programme, where the students were performing their degree projects, case sessions could not be scheduled; these students were instead invited to take part voluntarily in a case discussion. Table 1 includes details of the number of students and cohorts in the study.


Two or three observers observed the students (Table 1) while they took part in the discussion of a case during a case method session, as described above. In addition, a teacher well acquainted with the case method was present as a facilitator who guided the process but intervened as little as possible. Since observations of all cohorts in a single programme were carried out during the same period, progression was studied across different cohorts rather than by following one cohort over time.

A rubric to be used as a tool for observations was developed within a larger project including the faculties of engineering and economics [25]. In Sweden, there are generic learning outcomes for all higher education irrespective of discipline [26], and the rubric was developed in accordance with these, to be used when the case method was employed for teaching and learning. The rubric was tested with students from engineering; that study will be reported elsewhere.

The final version of the rubric had four dimensions:

  1. Problem-solving process (Identification of the problem; Use of data in the case; Analysis; Synthesis and decision)

  2. Disciplinary knowledge (Professional language; Prior knowledge)

  3. Character of the discussion (Theory-based; Polemical; Supportive; Perspective-shifting/metaphoric)

  4. Communication (Communication within the group; Trust within the group; Interaction with the teacher)

Points for observation were described and outlined in the rubric. All observers were instructed, but not formally trained, in using the rubric by observing the four dimensions. The rubric was used as a tool for observation of group discussions, and observers added written notes in a column under each dimension. Details about the observers are included in Table 1. The observers were all teachers at the medical faculty, well trained in observing group discussions and familiar with the case method. All students were familiar with case discussions and were informed about the study aim and that their discussions would be observed, but they were not informed about the rubric’s content. It was clarified that this was not an individual summative assessment. After the observation, the students were informed about the rubric and had the opportunity to ask questions.

Data analysis

The observers used the rubric in order to employ the same standardized procedure when observing the different cohorts and programmes. Results were compiled for each cohort and programme. The analysis started with a comparison of the written notes collected by each observer during the case method sessions. All data were reviewed for content and coded for correspondence to the four dimensions in the rubric using deductive content analysis [28]. Consensus on the description in each rubric dimension for all sessions in the respective programme was reached through discussions between the observers. Trustworthiness [29] was supported by the observers’ active participation in every phase of the analysis process, including the preparation, organization and interpretation of data.


The results from this qualitative observational study describe students’ progression in the problem-solving process, disciplinary knowledge, character of discussion and communication, observed during the case method sessions. They are summarized for each participating programme separately, both in the text below and in Table 2.

Table 2 Summary of findings presented for each programme and cohort

Occupational therapy programme

In the OT programme students were recruited from three cohorts: first, second and third year. We found clear differences in the problem-solving process and in how the students discussed the case depending on study year. Disciplinary knowledge was observed: students in the first year did not use a professional language or concepts specific to a family-centred setting. However, they used an occupation-focused approach in their discussions. It was also evident that students in the first year made an effort to understand the impairments at body function and body structure level before they started to discuss occupational performance skills. They identified the problem but needed cues from the facilitator to grasp the complexity of the case. Some obvious connections were made but, to some degree, there was a lack of important understanding of how individual, family, community and environmental factors impact on occupational performance possibilities. The students in the second year used a mature professional language and tried to use correct terminology. They identified the key roles of families, peers and communities as factors influencing occupational performance. The case was thoroughly analysed and relevant assessment tools discussed. Some appropriate interventions were mentioned, although a lack of collaboration with other professionals was evident. The character of the discussions and communication was identified; students worked in a constructive way and gave comments and feedback to each other in the groups. Students in the third year identified the key problem by using mature and flexible teamwork. They discussed in a broader context, such as autonomy, ethics and human rights. The complexity in the case was identified using competencies within the group. Primarily they addressed occupational performance and discussed challenges relevant to the problem. Key policies were considered, such as social security systems, health policies, social justice and human rights. Students in this group used a professional language and prior knowledge connected to theory. The students made reflections beyond the case presented and showed an understanding of a relevant approach to interventions.

Speech-language therapy programme

Four groups of students were recruited from the SLT programme, representing all 4 years of the programme. The groups of students differed regarding depth of knowledge and level of reasoning. First-year students showed limited disciplinary knowledge as evidenced by a high degree of colloquial language, resulting in less clearly defined concepts and requests for clarification from other students. They showed basic knowledge of scientific and clinical concepts but these were more fully mastered by second-year students, who showed a more developed problem-solving process, arriving more quickly at a common identification of the problem through hypothesis testing. The character of the discussion of the students in their third and fourth years of studies showed evidence of a deeper and more advanced level of reasoning, with students alternating between the perspectives of the SLT, patient, caregiver and school personnel. While students from all years relied heavily on previous course content to guide their analyses, first-year students also let personal experiences and anecdotal evidence influence their interpretation. In contrast, in later years, statements without clear references were questioned by fellow students. Year-four students asked provocative questions to promote the discussion, without losing a professional conversational tone. The communication of all groups showed high levels of independence from the facilitator, who only occasionally, primarily in the first year, was required to guide the discussion with questions and comments. A noteworthy transition from a more specialized competence to a team-based perspective was observed. Students in the third year quickly identified problems and the SLT role in resolving the issue. Students in the fourth year were reluctant to make similar interpretations, and were more prone to a team-based solution, acknowledging the competence of other professionals, in particular school personnel.

Midwifery programme

Two groups of students were recruited in this programme, representing the first and second years of the three-semester programme. We found progression in the problem-solving process in the way the students discussed and analysed the case: identifying the problem, using information and making decisions in the group. First-year students had difficulties in identifying the main problem and instead discussed several problems as though they were equally important. When using information in the case, first-year students read all the facts before discussing, whilst second-year students started their discussion quickly, without reading all the facts. However, they returned to the case during the discussion to gather more facts. First-year students’ decision-making varied but was mainly tentative and based on theoretical knowledge. Students in the second year made and integrated decisions based on professional knowledge and experience; evidence was not specifically alluded to. The disciplinary knowledge also clearly progressed in the use of professional language, concepts and terms and the use of prior practical experience. A striking difference was that first-year students used theory-based knowledge from theoretical courses and their experience as nurses, whereas second-year students used experience-based knowledge from midwifery practice. First-year students also exhibited some difficulties in using the problem-solving model, whereas second-year students dealt with central problems using the problem-solving model spontaneously. All students used professional concepts and terms according to the course literature, and second-year students in addition communicated in the same effective and relaxed way as professional midwives. The main difference between the two groups, when they discussed and solved the problem, concerned practical experience. When first-year students discovered that a lack of knowledge hampered progress in the discussion, they turned to the course literature for help. They used theoretical knowledge that at times was insufficient. Second-year students seemed to have a clear theoretical grounding, even though they referred not to the literature but rather to practical experiences. All groups discussed in a supportive, trusting way and listened to and considered each other’s experiences. However, second-year students shifted perspective more in the discussions. First-year students interacted more with the teacher, whereas second-year students did not seem to need the teacher in their discussion.


We found that it is possible to monitor progression by using the case method and rubrics as tools for identifying clinical reasoning. The results reveal the students’ transition over time from strictly theory-based knowledge to a reasoning ability characterized by clinical considerations and experiences when trying to solve the clinical problem. This is in line with previous findings by Wahlgren and Ahlberg [11].

In the problem-solving process, students in the early stages of the programmes focused more on the facilitator, and directed more questions to the facilitator, than students in later stages, who were more secure and confident. When identifying the relevant problem, first-year students had a somewhat fragmented approach to the case, reflected in the identification of several problems and an inability to identify the most relevant one. For example, students early in the OT programme relied more on learning from the anatomy and neurology courses in an atomistic way and had difficulties integrating this in a more holistic way. Marton et al. [30] describe atomistic learning as fragmented and deep learning as holistic, where the student strives to understand meaning, connection, context and implication. Later in their programmes, students were more confident in their application of knowledge and had the clinical experience to identify the most relevant problem(s) in the case efficiently. The transition from a fragmented approach to a holistic approach could be observed in all programmes. It has previously been shown, in a study of midwifery students’ written reflections, that their knowledge moves from a fragmented to a holistic approach during their education [31]. Perhaps the case method can inspire students and support deep learning early in the education.

A linear relationship between nursing students’ scores on a script concordance test and their experience of clinical practice was shown by Dawson et al. [13]. Using a progress test, Williams et al. [16] also found a steady increase over the study years in clinical reasoning skills among medical students. A test with many questions may have better reliability for an individual student, but observations like the ones we have used provide opportunities to study students’ development at group level.

We observed an insecurity amongst first-year students when they discovered that their knowledge was not sufficient, made evident by the use of textbooks, reliance on personal and anecdotal evidence, or looking for interaction with the teacher/facilitator. Vocabulary and use of professional concepts and terms developed over the cohorts. First-year students used layperson language influenced by textbook knowledge. Students who had clinical experience used a professional language, similar to the language used when qualified professionals communicate (as attested by the observers). Jones et al. [32] point out that standardized language is very important as an effective strategy to clarify professional nursing practice, which is equally important for all professionals. A precise vocabulary enables the formulation of precise questions, which receive focused answers, as shown in the present study by the prompt arrival at a clear identification of the relevant problems by third- and fourth-year students. In addition, fourth-year students appeared more familiar with the discussion format and asked provocative and challenging questions of fellow students in order to advance the discussion.

Multiple-choice tests to measure progress by quantitative means have shown a steady increase in the knowledge of medical students [33]. Such tests are more reliable than case-based tests mainly due to better sampling [8]. This study was performed using one single case per programme, which may have compromised reliability, due to case specificity. However, it has been shown that generic skills contribute to clinical performance [34] and the case specificity has been questioned [35]. A combination of methods is probably preferable to obtain both reliable results concerning students’ knowledge as well as assessing generic skills. To increase the reliability of observations a rubric can be used, preferably complemented with examples [36] as was done in this study. Assessor training could further have increased the reliability of marking [37]. However, experts have been shown to have a high degree of agreement on the key elements of the clinical reasoning process [38]. The characteristics of communication in all programmes were distinguished by trust, most obvious in the later years. Students in the two undergraduate programmes (OT, SLT) used a team-based approach to problem solving in their final years. SLT students in their third year of studies quickly and accurately identified the necessary contributions of the SLT in addressing the issue described in the case. Fourth-year students, in contrast, acknowledged the necessity of a team-based approach, in which the plan of action was determined in close collaboration with other health and education professionals. Second-year students in the OT programme were observed as having the personal, professional and interprofessional skills that represent an isolated specialist. In their final year, this had changed to the competence of a team member who requires the expertise of other professionals for a broader understanding of challenges described in the case. 
This was not observed for MW students, most likely reflecting the midwife’s independent professional role with only an auxiliary nurse in a small team in the given case. Communication skills are important for teamwork in the future profession of the students [39]. Developing competence in communication during education of health professionals has been identified as being of major importance for healthcare by the WHO [40] and by a recent Lancet commission [41].

The use of only theoretical knowledge in the first year students could be compared to the reliance on rules in the novice stage in the Dreyfus model [42]. Novices follow rules whereas experts through experience have developed more intuitive and holistic ways of solving problems [42, 43]. The more fluent and precise identification of the relevant problem in later years could perhaps be interpreted as the development of more intuitive thinking.

In all programmes, students’ development in problem identification and clinical reasoning was influenced by clinical training and professional reasoning. Professional reasoning includes important learning of the values, attitudes and beliefs of the profession [44]. Despite the positive effects, professional socialisation also carries a potential risk [45]: new professionals may prefer to do things the way they have always been done, rather than practise the latest evidence. It is important for both students and teachers to be aware of this challenge.

We observed a developing reliance on clinical experience in the students’ discussions and, at the same time, an integration of theoretical knowledge and the development of a holistic perspective. This is in line with the two-dimensional model of professional development suggested by Dall’Alba and Sandberg [46]. They add a second dimension to skill progression, namely embodied understanding of practice, allowing for differences between individuals’ development trajectories. The students showed a progression of problem-solving skills, but also, and perhaps more prominently, a progression in embodied understanding of practice.

The case method used in the present study shows potential to track the progression of students’ learning during education. Without the need for formal assessment, it can provide the teacher with continuous information on students’ level of knowledge and reasoning, necessary to adjust instruction and the amount of support. The importance of accord between course learning outcomes, assessment and integration of different parts of the course content for student learning has been proposed by Biggs [47] and termed constructive alignment.

Strengths of the study

Studies of this kind can easily be integrated into regular teaching, and students could learn from the observed sessions, making participation of value to the students. Using the case method enabled us to elicit and describe students’ clinical reasoning skills. Another strength is that the same rubric was used for all observations, which means that the results could easily be compared across programmes. Further, all observers except one were qualified professionals with clinical experience. Small groups of students discussing the cases were observed, as recommended by Benson et al. [48], since this allows for closer evaluation of the group process.

The strategy to enhance trustworthiness of the content analysis was to reach high intercoder reliability [28] between the observers. The rubric was developed prior to the study and all observers were familiar with the domains observed. Throughout the data collection and during the deductive content analysis observers (researchers) discussed the coding scheme used in the analysis process.

Weaknesses of the study

A weakness of the study is that the case sessions were scheduled as part of regular teaching for some cohorts and as a voluntary extracurricular activity for others. The students who volunteered may not be representative of the whole cohort. The groups of students observed were small, and we cannot exclude variation between different cohorts. The influence of individual students could be strong in such small groups, and we cannot exclude large variations between individual students’ progression. Students worked in teams, and the competence of the team might be higher than the competence of the individual student. We observed structured discussions of a single case for each profession, and the structure and context of the case could have influenced the results. All students had experience of the case method; however, students in later years were more experienced in using it, and this could also have influenced the results. Another weakness is that the observers were not formally trained in how to observe. They were all teachers who had used the case method in their teaching, but they were new to this rubric. Using a rubric may also result in other important information being lost. The use of a generic rubric (a consequence of taking part in a university-wide project) could be seen as a weakness; however, we believe that the rubric, though generic, covers relevant aspects of clinical reasoning in the health sciences well. Further, since the observers were teachers in the programmes, they knew which cohort they observed, and this may have caused bias in the observation.

This study relies exclusively on verbal expression of professional skills. These are important as such, but professional competence also involves action in practice. Students’ prior knowledge, norms and values may be both a strength and weakness.


Our observations revealed progression in several aspects of students' clinical reasoning skills at group level in their discussions of clinical cases. Observing students' discussions of professional cases could thus be used to evaluate progression and quality in health sciences education. We found it possible to monitor progression by using the case method and rubrics as tools for identifying clinical reasoning. This can be regarded as an evaluation of curriculum quality, which is important in higher education. In addition, using an already existing learning tool, the case method, for the purpose of evaluation can be considered resource-saving.





Abbreviations

OT: Occupational therapy

SLT: Speech-language therapy

WHO: World Health Organisation


References

  1. Frank JR, Mungroo R, Ahmad Y, Wang M, De Rossi S, Horsley T. Toward a definition of competency-based education in medicine: a systematic review of published definitions. Med Teach. 2010. doi:10.3109/0142159X.2010.500898.

  2. Frank J, Snell LS, Ten Cate O, Holmboe ES, Carraccio C, Swing SR, et al. Competency-based medical education: theory to practice. Med Teach. 2010. doi:10.3109/0142159X.2010.501190.

  3. Harris P, Snell L, Talbot M, Harden RM. Competency-based medical education: implications for undergraduate programs. Med Teach. 2010. doi:10.3109/0142159X.2010.500703.

  4. Bologna Declaration. The European higher education area. Joint Declaration of the European Ministers of Education; 1999. Accessed 8 Sept 2017.

  5. Morcke AM, Dornan T, Eika B. Outcome (competency) based education: an exploration of its origins, theoretical basis, and empirical evidence. Adv Health Sci Educ Theory Pract. 2013. doi:10.1007/s10459-012-9405-9.

  6. Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. The role of assessment in competency-based medical education. Med Teach. 2010. doi:10.3109/0142159X.2010.500704.

  7. McHarg J, Bradley P, Chamberlain S, Ricketts C, Searle J, McLachlan JC. Assessment of progress tests. Med Educ. 2005;39:221–7.

  8. Schuwirth LWT, Van der Vleuten CPM. The use of progress testing. Perspect Med Educ. 2012. doi:10.1007/s40037-012-0007-2.

  9. Wrigley W, Van der Vleuten CPM, Freeman A, Muijtjens A. A systemic framework for the progress test: strengths, constraints and issues: AMEE Guide No. 71. Med Teach. 2012. doi:10.3109/0142159X.2012.704437.

  10. Chen Y, Henning M, Yielder J, Jones R, Wearn A, Weller J. Progress testing in the medical curriculum: students' approaches to learning and perceived stress. BMC Med Educ. 2015. doi:10.1186/s12909-015-0426-y.

  11. Wahlgren M, Ahlberg A. Monitoring and stimulating development of integrated professional skills in university study programmes. Eur J High Educ. 2013. Accessed 23 Jan 2017.

  12. Eva KW. What every teacher needs to know about clinical reasoning. Med Educ. 2005;39:98–106.

  13. Dawson T, Comer L, Kossick MA, Neubrander J. Can script concordance testing be used in nursing education to accurately assess clinical reasoning skills? J Nurs Educ. 2014. doi:10.3928/01484834-20140321-03.

  14. Humber AJ, Miech EJ. Measuring gains in the clinical reasoning of medical students: longitudinal results from a school-wide script concordance test. Acad Med. 2014. doi:10.1097/ACM.0000000000000267.

  15. See KC, Tan KL, Lim TK. The script concordance test for clinical reasoning: re-examining its utility and potential weakness. Med Educ. 2014;48. doi:10.1111/medu.12514.

  16. Williams RG, Klamen DL, White CB, Petrusa E, Fincher RM, Whitfield CF, et al. Tracking development of clinical reasoning ability across five medical schools using a progress test. Acad Med. 2011. doi:10.1097/ACM.0b013e31822631b3.

  17. Levett-Jones T, Hoffman K, Dempsey J, Jeong SY, Noble D, Norton CA, et al. The 'five rights' of clinical reasoning: an educational model to enhance nursing students' ability to identify and manage clinically 'at risk' patients. Nurse Educ Today. 2010. doi:10.1016/j.nedt.2009.10.020.

  18. Schmidt HG. Problem-based learning: rationale and description. Med Educ. 1983;17:11–6.

  19. Diemers AD, van de Wiel MW, Scherpbier AJ, Baarveld F, Dolmans DH. Diagnostic reasoning and underlying knowledge of students with preclinical patient contacts in PBL. Med Educ. 2015;49:1229–38. doi:10.1111/medu.12886.

  20. Barnes LB, Christensen CR, Hansen AJ. Teaching and the case method. Boston, MA: Harvard Business School Press; 1994.

  21. Egidius H. Pedagogik inför 2000-talet [Pedagogics for the 21st century]. Stockholm: Natur och Kultur; 2009.

  22. Crang Svalenius E, Stjernquist M. Applying the case method for teaching within the health professions – teaching the teachers. Med Teach. 2005;27:489–92.

  23. Stjernquist M, Crang-Svalenius E. Applying the case method for teaching within the health professions – teaching the students. Educ Health [serial online]. 2007. Accessed 23 Jan 2017.

  24. Nordquist J, Sundberg K, Johansson L, Sandelin K, Nordenström J. Case-based learning in surgery: lessons learned. World J Surg. 2012. doi:10.1007/s00268-011-1396-9.

  25. Ramberg U, Edgren G, Wahlgren M. Evaluation of a method to monitor progression of professional skills using case discussions in class – a comparative study. 2017; manuscript submitted for publication.

  26. Swedish Council for Higher Education. The Higher Education Ordinance. 2014. Accessed 9 Feb 2017.

  27. Klein G, et al. Thinking & Reasoning. 1997;3(4):337–52. doi:10.1080/135467897394329.

  28. Elo S, Kääriäinen M, Kanste O, Pölkki T, Utriainen K, Kyngäs H. Qualitative content analysis: a focus on trustworthiness. SAGE Open. 2014. Accessed 8 Sept 2017.

  29. Polit DF, Beck CT. Nursing research: principles and methods. Philadelphia, PA: Lippincott Williams & Wilkins; 2012.

  30. Marton F, Dahlgren L, Svensson L, Säljö R. Inlärning och omvärldsuppfattning [Learning and conception of reality]. Stockholm: Almqvist & Wiksell; 1977.

  31. Persson EK, Kvist LJ, Ekelin M. Analysis of midwifery students' written reflections to evaluate progression in learning during clinical practice at birthing units. Nurse Educ Pract. 2015. doi:10.1016/j.nepr.2015.01.010.

  32. Jones D, Lunney M, Keenan G, Moorhead S. Standardized nursing languages: essential for the nursing workforce. Annu Rev Nurs Res. 2010;28:253–94.

  33. Verhoeven BH, Verwijnen GM, Scherpbier AJ, van der Vleuten CP. Growth of medical knowledge. Med Educ. 2002;36:711–7.

  34. Wimmers PF, Fung C-C. The impact of case specificity and generalizable skills on clinical performance: a correlated traits–correlated methods approach. Med Educ. 2008. doi:10.1111/j.1365-2923.2008.03089.x.

  35. Norman G, Bordage G, Page G, Keane D. How specific is case specificity? Med Educ. 2006;40:618–23.

  36. Jönsson A, Svingby G. The use of scoring rubrics: reliability, validity and educational consequences. Educ Res Rev. 2007. Accessed 23 Jan 2017.

  37. Bird FL, Yucel R. Improving marking reliability of scientific writing with the Developing Understanding of Assessment for Learning programme. Assess Eval High Educ. 2013. Accessed 23 Jan 2017.

  38. Gauthier G, Lajoie SP. Do expert clinical teachers have a shared understanding of what constitutes a competent reasoning performance in case-based teaching? Instr Sci. 2014. doi:10.1007/s11251-013-9290-5.

  39. Wilhelmsson M, Pelling S, Uhlin L, Dahlgren LO, Faresjö T, Forslund K. How to think about interprofessional competence: a metacognitive model. J Interprof Care. 2012. doi:10.3109/13561820.2011.644644.

  40. WHO. Framework for action on interprofessional education and collaborative practice. 2010. Accessed 8 Sept 2017.

  41. Frenk J, Chen L, Bhutta ZA, Cohen J, Crisp N, Evans T, et al. Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet. 2010. Accessed 23 Jan 2017.

  42. Dreyfus SE, Dreyfus HL. A five-stage model of the mental activities involved in directed skill acquisition. Washington, DC: Storming Media; 1980. Accessed 23 Jan 2017.

  43. Dreyfus SE. The five-stage model of adult skill acquisition. Bull Sci Technol Soc. 2004. Accessed 23 Jan 2017.

  44. Vollmer HM, Mills DL, editors. Professionalisation. Englewood Cliffs, NJ: Prentice-Hall; 1966.

  45. Parsons M, Griffiths R. The effect of professional socialisation on midwives' practice. Women Birth. 2007;20:31–4.

  46. Dall'Alba G, Sandberg J. Unveiling professional development: a critical review of stage models. Rev Educ Res. 2006. Accessed 23 Jan 2017.

  47. Biggs J. Enhancing teaching through constructive alignment. High Educ. 1996. doi:10.1007/BF00138871.

  48. Benson G, Noesgaard C, Drummond-Young M. Facilitating small group learning in problem-based learning. In: Rideout E, editor. Transforming nursing education through problem-based learning. Sudbury: Jones & Bartlett; 2001. p. 75–102.

  49. Swedish Ministry of Education and Cultural Affairs. The Act concerning the Ethical Review of Research Involving Humans (SFS 2003:460). 2003. Accessed 8 Sept 2017.



The authors wish to thank all students and observers who kindly agreed to participate in the study.


This study was funded by Lund University EQ11 grants, for educational development.

Availability of data and materials

The data are available from the corresponding author on reasonable request.

Author information




The study was designed within the larger project including the faculties of engineering and economics [25]. All authors have been active in data collection and analysis of the material and contributed to the draft of the manuscript. KO, ME, GE and EP prepared the final version of the manuscript and PH and OS have critically reviewed it. All authors have approved the final manuscript.

Corresponding author

Correspondence to Kristina Orban.

Ethics declarations

Ethics approval and consent to participate

Ethical approval was not sought for this study since, according to Swedish research regulations (the Ethical Review Act, SFS 2003:460) [49], it was not necessary; furthermore, the ethical committee does not consider applications that fall outside its specifications. The perspective of the analyses was pedagogical. Students were observed only at group level, and no notes were taken on the contributions of individual students. In some cohorts the case sessions were scheduled, but students' participation was voluntary; in other cohorts, sessions were arranged outside the schedule, in late afternoons, so there was no pressure on the students to attend. In the OT and MW programmes written informed consent was obtained, and in the SLT programme informed consent was verbal. Students may have felt some pressure to participate, but the negative consequences of participating could be considered negligible (some discomfort from being observed).

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no conflicts of interest.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.


Cite this article

Orban, K., Ekelin, M., Edgren, G. et al. Monitoring progression of clinical reasoning skills during health sciences education using the case method – a qualitative observational study. BMC Med Educ 17, 158 (2017).


Keywords

  • Clinical problem-solving
  • Professional development
  • Health sciences education
  • Qualitative content analysis