Research article · Open access

Mapping clinical reasoning literature across the health professions: a scoping review

Abstract

Background

Clinical reasoning is at the core of health professionals’ practice. A mapping of what constitutes clinical reasoning could support the teaching, development, and assessment of clinical reasoning across the health professions.

Methods

We conducted a scoping study to map the clinical reasoning literature across the health professions, in the context of a larger Best Evidence Medical Education (BEME) review on clinical reasoning assessment. Seven databases were searched using subheadings and terms relating to clinical reasoning, assessment, and Health Professions. Data analysis focused on a comprehensive analysis of bibliometric characteristics and the varied terminology used to refer to clinical reasoning.

Results

The identified literature comprised 625 papers spanning 47 years (1968–2014), published in 155 journals, written by 544 first authors, and representing 18 Health Professions. Thirty-seven percent of papers used the term clinical reasoning, and 110 other terms referring to the concept of clinical reasoning were identified. Consensus on the categorization of terms was reached for 65 terms across six categories: reasoning skills, reasoning performance, reasoning process, outcome of reasoning, context of reasoning, and purpose/goal of reasoning. The categories of terminology used differed across Health Professions and publication types.

Discussion

Many diverse terms were present and were used differently across literature contexts. These terms likely reflect different operationalisations, or conceptualizations, of clinical reasoning as well as the complex, multi-dimensional nature of this concept. We advise authors to make the intended meaning of ‘clinical reasoning’ and associated terms in their work explicit in order to facilitate teaching, assessment, and research communication.


Background

Clinical reasoning has been called the backbone of clinical practice [1, 2]. Competency frameworks across the Health Professions (e.g. Accreditation Council for Graduate Medical Education Core Competencies, the Royal College of Physicians and Surgeons of Canada’s CanMEDS framework, the General Medical Council’s Good Medical Practice, the Canadian Association of Occupational Therapists’ Profile of Practice, the Canadian Physiotherapy Association Competency Profile) [3,4,5,6,7] highlight the importance of clinical reasoning. Implementing these policy documents and frameworks in the training of health professionals requires a clear conceptualization of clinical reasoning to support its assessment and teaching.

While considered core to the practice of health professionals [1, 8], clinical reasoning has been discussed as either a multifaceted construct [9, 10] or a ‘black box’ phenomenon [11]. In broad terms, clinical reasoning reflects the thinking or reasoning that a health practitioner engages in to solve and manage a clinical problem. The field of clinical reasoning research represents a large literature rooted in early work by Elstein [12], Barrows [13, 14], Feltovich [14], Neufeld [15], Schmidt [16], and Norman [17, 18], with a heavy focus on characterizing the cognitive processes that underpin clinical reasoning. Since then, clinical reasoning has been variably described as a process or an outcome [19]; has been discussed through the lens of various frameworks [20]; and has been interpreted for multiple audiences—from scholars to clinical teachers [19]. This broad and substantive literature notwithstanding, little consensus exists regarding the definition of clinical reasoning [20].

One recent review considered clinical reasoning through a series of different conceptual lenses [20], and other recent work offered insights into how various theories of clinical reasoning may be reflected in current teaching and assessment practices [21]. These works, however, are limited to the field of medicine, and are not the result of a systematic investigation of the literature across Health Professions. Given current emphasis on interprofessional training [22], and the thread of clinical reasoning throughout health professions competency profiles [3,4,5,6,7], a careful mapping of the concept of clinical reasoning across professions is necessary to support both profession-specific and interprofessional learning, assessment, and research. Here, we report on a scoping review conducted with the support of the Best Evidence Medical Education (BEME) collaboration [23] with the purpose of answering the question “How is clinical reasoning described in the Health Professions Education (HPE) literature?”

Methods

Scoping methodology

Given the exploratory nature of this project and the breadth of the potentially relevant literature, we chose a scoping review methodology. Scoping reviews are increasingly used in Health Professions Education (HPE) to synthesize and map diverse bodies of literature in both well-defined and emerging domains; further details regarding scoping reviews in HPE can be found in Thomas et al. [24, 25]. Scoping methodology allows for the inclusion and synthesis of various types of literature (e.g. review articles, primary work, commentaries and editorials), methodological approaches (e.g. experimental designs, descriptive studies, ethnographic studies), and data analysis approaches (qualitative, quantitative, or mixed). Scoping reviews do not necessitate a formal quality appraisal of the literature [25, 26]; given the inclusion of various literature types in the current review (e.g. primary literature and commentaries) and our focus on descriptions of clinical reasoning, we judged that a quality appraisal was neither appropriate nor likely to add meaningfully to the results of our review.

Study design

Mapping is defined as a process whereby the identified literature is represented both numerically (quantitatively) and thematically (qualitatively). Our specific methods aligned with the 5-step methodological framework recommended by Arksey and O’Malley [26], and are presented below.

  • Step 1: Identification of a research question.

The question guiding this review was “How is clinical reasoning described in the Health Professions Education (HPE) literature?”

  • Step 2: Identifying relevant research studies.

The review described in this paper is one component of a larger Best Evidence Medical Education (BEME) commissioned synthesis on assessment of clinical reasoning (for information on BEME, please see: www.bemecollaboration.org). Our scoping review draws on literature identified through the larger review [27] (reflected in the search strategy in Additional file 1: Appendix 1); however, study inclusion, data extraction, and analysis were conducted independently. Between 2013 and 2014, the team worked with a librarian to design a search comprising three constructs: HPE, clinical reasoning, and assessment. Each article captured by the search included search terms or subheadings related to all three constructs (i.e. any given paper identified by the search would include a health profession, in an educational or assessment context, with some mention of the construct of clinical reasoning). The search strategy was vetted by two other academic health sciences librarians; adapted to the following databases: MEDLINE, ERIC, CINAHL, PsycINFO, Scopus, Google Scholar, and the New York Academy of Medicine (NYAM) Grey Literature Report; and restricted to English-language papers.
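
To picture how the three constructs combine, the Python sketch below assembles a simplified Boolean query. The term lists and groupings shown are hypothetical placeholders, not the actual strategy, which is reproduced in Additional file 1: Appendix 1.

    # Hypothetical sketch of the three-construct search logic; the real term
    # lists and database-specific syntax appear in Additional file 1: Appendix 1.
    clinical_reasoning = ("'clinical reasoning' OR 'diagnostic reasoning' "
                          "OR 'clinical judgment' OR 'problem solving'")
    assessment = "assess* OR examin* OR evaluat* OR test*"
    hpe = ("'health professions education' OR 'medical education' "
           "OR 'nursing education'")

    # Every record retrieved must match all three constructs (AND logic).
    query = " AND ".join(f"({c})" for c in (clinical_reasoning, assessment, hpe))
    print(query)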

  • Step 3: Study selection.

Articles identified by the search strategy were screened by the larger Assessment Review Team [27], relying primarily on title and abstract review. In addition to selecting articles relevant to the review of assessments of clinical reasoning [27], reviewers were asked to identify articles relevant for a review of the definitions of clinical reasoning; more specifically, papers that contained a definition of clinical reasoning or an associated term, or that could contribute to understanding how clinical reasoning is defined in the literature. Reviewers identified 635 articles (625 of which were in English with full-text available) as relevant to the definitional review (Fig. 1). Given the large number of papers identified by the Assessment Review Team, we engaged in an additional round of inclusion to ensure that the papers retained would contribute meaningfully to our scoping review. During this secondary round, six pairs of reviewers each reassessed 7 papers (42 papers, roughly 7% of the database) to determine whether each paper should be included based on the goals of the review. Initial agreement regarding inclusion within pairs of reviewers was unexpectedly low (ranging from 14 to 71%). We hypothesized that this lack of agreement was due in part to divergent conceptualizations of clinical reasoning within our own team [28].

Fig. 1 PRISMA flow chart of article selection [27]

In response to these findings, we paused the review process and engaged in a reflective exercise in which each team member answered questions regarding their definition of clinical reasoning and its component processes. This exercise, the findings of which are reported elsewhere [28], revealed variation within the team regarding what were considered ‘relevant’ contributors to clinical reasoning. As the purpose of the current review was to map the breadth of the literature, we proceeded with the review following team discussion and erred on the side of inclusion, extracting data from all 625 previously identified articles.
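
As a minimal sketch of the agreement figures reported above, the following Python computes percent agreement for one hypothetical reviewer pair's include/exclude decisions on their seven shared papers; the decision values are invented for illustration only.

    # Percent agreement between two reviewers' include (1) / exclude (0)
    # decisions; the values below are invented, not actual screening data.
    def percent_agreement(a, b):
        return sum(x == y for x, y in zip(a, b)) / len(a)

    reviewer_1 = [1, 1, 0, 1, 0, 1, 1]  # one pair's decisions on 7 papers
    reviewer_2 = [1, 0, 0, 1, 1, 1, 0]
    print(f"{percent_agreement(reviewer_1, reviewer_2):.0%}")  # prints 57%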

  • Step 4: Charting the data.

The data collection tool (Additional file 1: Appendix 2) used in this review was developed using a multistep iterative process, with two rounds of revision followed by usability testing. We piloted the original extraction form with the review team (n = 12 individuals working in teams of two), established reasonable agreement on co-extracted quantitative items, and refined the form based on usability ratings and team member suggestions.

A second phase of co-coding and data extraction occurred with the revised tool (Additional file 1: Appendix 2). Six pairs of reviewers extracted seven papers each, for a total of 42 papers (another 7% of the database). Given that several of the extraction items depended on the coder applying their knowledge and interpreting findings within the papers, and given the multiple perspectives within our review team [27], data were extracted using open-ended items to allow for interpretation and flexibility (Additional file 1: Appendix 2). Given the importance of diversity for our attempt to map the breadth of the literature, reaching agreement was not our aim; we therefore proceeded with single coders (n = 13) for the remainder of the database. We used DistillerSR software (Evidence Partners, Ottawa, Canada) for data extraction and database management, and Excel (Microsoft, Redmond, WA, USA) and Prism (GraphPad Software Inc., La Jolla, CA, USA) for analysis and graphic representations.

  • Step 5: Collating, summarizing and presenting findings.

Description of analytical process

We used several approaches to summarize our study findings. In this paper, we focus on a multi-dimensional description of the database that formed the foundation of this project. To characterize the articles included in this review we focused on: profession represented (e.g. nursing, medicine, physical therapy), learner level (e.g. undergraduate, postgraduate), paper type (e.g. commentary, original research, review), country of origin, the presence of the term ‘clinical reasoning’, and other terms used to refer to clinical reasoning (when appropriate).

Terminology used to refer to clinical reasoning: For each paper, team members were asked to identify whether the term clinical reasoning was used (yes/no), and whether any other term was used to refer to clinical reasoning within the text. Team members could identify up to three terms per text, relying on their content expertise to determine the relevance of a given term. Few constraints were given to the team, and team members were encouraged to apply their own conceptualizations of clinical reasoning during extraction [28]. The identified terms (n = 110) used interchangeably with clinical reasoning (e.g. diagnostic reasoning) were then iteratively coded. First, MY engaged in an inductive categorization of terms, informed by her knowledge of the clinical reasoning literature. This initial category structure was critically revised by AT and SL and adapted iteratively. Following refinement of the categories, LG and DG independently reviewed the category labels and the identified terms, and assigned each term to a single category. MY, LG and DG then discussed the process, reviewed their categorizations, and decided whether to retain or revisit the assignment of each term. This process resulted in the team agreeing on the categorization of 65 (59%) terms across 6 categories. Terms for which the team could not agree were not included in the analyses reported in this manuscript. The categorization process, including the terms for which consensus was not possible, is described in more detail elsewhere [29].
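
The consensus step can be pictured as follows: two coders independently assign each term to one category, and only terms with matching assignments are retained. The assignments below are invented to illustrate the mechanics, not actual codings from the review.

    # Hypothetical independent assignments of terms to categories;
    # only terms on which both coders agree are kept for analysis.
    coder_1 = {"diagnostic reasoning": "purpose/goal",
               "critical thinking": "skills",
               "intuition": "process",
               "shared decision making": "context"}
    coder_2 = {"diagnostic reasoning": "purpose/goal",
               "critical thinking": "performance",   # disagreement: set aside
               "intuition": "process",
               "shared decision making": "context"}

    consensus = {term: cat for term, cat in coder_1.items()
                 if coder_2[term] == cat}
    print(consensus)  # 3 of 4 terms retained in this toy example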

Exploration of terminology across publication characteristics: We used whether a publication contained the verbatim term ‘clinical reasoning’, together with the categories of terminology identified above, to explore how these categories of terms were used across the articles included in this study. Analyses examined the distribution of the categories of terms across Health Professions, across publication types, and between papers that did and did not include an assessment of clinical reasoning.
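
A cross-tabulation of this kind could be produced as sketched below, assuming one row per paper with hypothetical column names; the rows shown are invented and do not reproduce the review database.

    # Hypothetical sketch: distribution of term categories across
    # professions, with one invented row per paper.
    import pandas as pd

    papers = pd.DataFrame({
        "profession": ["medicine", "medicine", "nursing",
                       "nursing", "dentistry"],
        "term_category": ["purpose/goal", "outcome", "skills",
                          "skills", "skills"],
    })
    print(pd.crosstab(papers["profession"], papers["term_category"],
                      normalize="index"))  # row-wise proportions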

Results

Nature and distribution of the studies

The numbers of articles at each stage are shown in a PRISMA [30] flow chart in Fig. 1. Articles relevant to the definitional review were identified following title and abstract review, resulting in 635 papers in our archive. Ten papers were removed due to language (only English-language articles were included) or the inability to identify a full-text version, leaving a total of 625 studies (full list available in the Digital Supplement) spanning 47 years (1968–2014; Fig. 2), published in 155 journals, and written by 544 unique first authors. Papers from North America were dominant (Table 1), almost two thirds of papers reported original research (Table 1), and papers represented the entire HPE training continuum (Table 2). Although a total of 18 different Health Professions were represented in our archive, more than half of the articles (n = 335) were from medicine (Table 2).

Fig. 2 Distribution of papers across publication year (bin size of 5 years)

Table 1 Geographic distribution and type of papers included in our review
Table 2 Representation of Health Professions and Level of Learner

Clinical reasoning terminology

Of the 625 papers included in this study, 230 papers (36.8%) used the verbatim term ‘clinical reasoning’ within the article. We used descriptive analyses to explore the relative proportion of papers that used the term clinical reasoning across the most frequently represented Health Professions in our database (medicine, nursing, dentistry, physical therapy and occupational therapy). Thirty-eight percent of papers in medicine used the term clinical reasoning (126/335), 27% in nursing (51/192), 23% in dentistry (6/26), 83% in physical therapy (15/18), and 81% in occupational therapy (13/16).
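
The reported percentages follow directly from the counts above, as this short check shows (counts copied from the Results):

    # Proportion of papers using the verbatim term 'clinical reasoning',
    # per profession; counts are taken from the Results above.
    counts = {"medicine": (126, 335), "nursing": (51, 192),
              "dentistry": (6, 26), "physical therapy": (15, 18),
              "occupational therapy": (13, 16)}
    for profession, (used, total) in counts.items():
        print(f"{profession}: {used}/{total} = {used / total:.0%}")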

In the entire corpus of 625 papers, coders identified a total of 110 different terms used in reference to clinical reasoning. A total of six overarching categories of terminology were identified:

  1. reasoning skills referred to the abilities needed in order to reason clinically—terms such as clinical skills or cognitive skills;

  2. reasoning performance referred to aspirational goals for clinical reasoning to be attained—terms such as competency, acumen, or expertise;

  3. reasoning process focused on the ‘how’ of clinical reasoning—proposing component processes or means by which the reasoning process unfolds (e.g. analytic reasoning, intuition, heuristics);

  4. outcome of reasoning focused on ‘what’ results from a reasoning process (e.g. a diagnosis, a management plan), the quality of that outcome (e.g. accuracy, quality), and the errors or failures in reasoning (e.g. bias, error);

  5. context of reasoning included notions of ‘where’ the reasoning process occurs ‘outside’ the individual clinician’s cognition, or factors that could influence that reasoning—including notions such as participatory approaches, shared decision making, or situational awareness, which capture influences on cognition that are more situationally or contextually derived;

  6. purpose/goal of reasoning focused on the ‘why’ of clinical reasoning—for patient management, to determine a treatment, or to propose a diagnosis.

A full list of terms for which consensus was reached, and their categorization, can be found in Table 3.

Table 3 Terms used to refer to clinical reasoning and their associated categorization

Categories of terms were differentially represented across Health Professions (Fig. 3). Reasoning skills descriptions dominated in dentistry, nursing, and physical therapy, whereas medicine had a high prevalence of terminology reflecting the purpose or goal of reasoning. When examining the presence of different categories of terms across publication type (Fig. 4), terminology reflecting reasoning skills was dominant in innovation reports, theses, and review papers, whereas skills and purpose or goal of reasoning terminology were relatively balanced in original research papers and commentaries or editorials.

Fig. 3 Presence of different categories of terminology for clinical reasoning across publications in various Health Professions

Fig. 4 Presence of different categories of terminology for clinical reasoning across publication types

We explored whether papers that included an assessment of clinical reasoning used different categories of terms than those that did not (Fig. 5). Papers reporting on assessments were much more likely to describe clinical reasoning in terms of reasoning performance, purpose/goal of reasoning, and outcome of reasoning, and were less likely to use terminology reflecting the context of reasoning.

Fig. 5 Presence of each category of terminology in papers that report on an assessment of clinical reasoning, compared to those that do not report on an assessment of clinical reasoning

Discussion

This review explored how clinical reasoning is represented within the Health Professions Education (HPE) literature. A group of scholars from different professions, training backgrounds, and perspectives on clinical reasoning [28] engaged in a synthesis to explore how clinical reasoning is described in the HPE literature. We analyzed papers spanning nearly half a century, representing 18 different Health Professions, levels of learners across the continuum, and a variety of publication types. We do not claim that these papers represent the entire corpus of writing on the topic of clinical reasoning in HPE; we argue instead that they represent a broad sampling of the literature informing this topic and create a foundation for mapping different areas of focused attention and, perhaps, differing conceptualizations of clinical reasoning.

Just over one third of articles in this review contained the verbatim term ‘clinical reasoning’ within the title, abstract, or body of the article. Articles from the fields of physical and occupational therapy were the most likely to include the exact phrase ‘clinical reasoning’; this may be due, in part, to the presence of very explicit frameworks and definitions of clinical reasoning within these rehabilitation professions [2, 8, 31,32,33,34]. In occupational therapy, for example, reasoning has been described as a cognitive or metacognitive process that guides clinical practice and includes procedural, interactive, conditional, narrative, and pragmatic reasoning [33]. These explicit descriptions of clinical reasoning likely support relative uniformity in the conceptual framework underlying the term in these professions.

Beyond the term ‘clinical reasoning’ itself, we identified 110 terms used to refer to the concept, grouped into six overarching categories that appear to represent different dimensions of focus in the operationalization of clinical reasoning. More specifically, each category appears to focus on different aspects or components of clinical reasoning, with terms variously focused on the ‘why’, the ‘how’, the ‘where’, the ‘what’, or the ‘what should’ result from a reasoning process. These six categories of terms were not used uniformly across the Health Professions. Articles from medicine (dominant in our database) tended to use language associated with the purpose or goal of reasoning (e.g. diagnostic reasoning), whereas articles reporting on clinical reasoning in nursing tended to use language reflecting reasoning as a skill (e.g. critical thinking). These categories of terminology appear to prioritize different components or aspects of clinical reasoning—perhaps suggesting different conceptualizations, understandings, or operationalizations [35] of what constitutes clinical reasoning across the Health Professions.

When examining term use across publication types, we saw a relatively consistent presence of language reflecting the purpose or goal of reasoning, or reasoning as a skill, with the exception of innovation reports, where the language of reasoning as a skill dominated. This finding may indicate that in educational innovations—publications describing new approaches to teaching and learning—reasoning may be expressed as a teachable or learnable skill rather than a process or contextually-bound experience.

Finally, we examined the presence of these six categories of terminology across papers that did, or did not, include the description of an assessment. The only category of terminology less likely to be present in a paper reporting on assessment of clinical reasoning was language around the context of reasoning (e.g. participatory decision-making). This finding may suggest that either this category of language has not been broadly adopted by the assessment literature or this conceptualization of clinical reasoning may be more difficult to assess and perhaps less amenable to assessment approaches.

To summarize our findings: the literature included in this synthesis is broad and represents many different facets of the HPE literature on clinical reasoning, and a multitude of terms are used to refer to clinical reasoning. However, based on their differential representation across paper type, health profession, and the inclusion of an assessment, these terms do not appear to be used synonymously. This result suggests that clinical reasoning may be an overarching concept rather than a singularly definable entity [35]; instead, the concept appears to manifest, be operationalized, or crystallize differently depending on the context—whether across individual health professions, publication types, or assessment focus.

The purpose of this review was to provide a concrete description of the variability within the concept of clinical reasoning [36], respecting the differences across Health Professions, without creating a hierarchy of terminologies, operationalizations, or conceptualizations, and without homogenizing our findings into one universal definition of clinical reasoning across the Health Professions. Our purpose was to map the breadth of the literature and to provide an organizational framework for various understandings of clinical reasoning. While clinical reasoning has been referred to as a multi-dimensional construct [9, 10], the likely presence of multiple conceptualizations of clinical reasoning, suggested by the different terms used to label it, has important implications for teaching, assessment, and research within and across the Health Professions. One could imagine that an assessment based on a conceptualization of clinical reasoning as a contextually-bound experiential phenomenon may focus on very different dimensions of reasoning than one based on a conceptualization focused on the outcome of reasoning. Similarly, educational programs or interventions would likely take very different shapes depending on whether they focus on reasoning as a skill (i.e. transferable approaches to reasoning) or on the purpose or goal of reasoning (e.g. the justification of a diagnosis). Summarizing particular approaches to teaching or assessment that reflect these different conceptualizations of reasoning is beyond the scope of the current review, but remains an important avenue for future research.

This study has limitations. We acknowledge that the corpus of studies included in this review does not represent the full literature available on the topic of clinical reasoning, and the distribution of terminologies, use of the term ‘clinical reasoning’, and distribution of studies may not generalize to the entire literature available on clinical reasoning and related concepts. However, we believe that the breadth represented in this review allows for an initial mapping of some of the different contexts, terms, and perhaps conceptualizations of clinical reasoning present in the HPE literature. While members of our team represent a variety of expertise and experiences, our team did not include nursing as an area of expertise. Given the representation of articles from nursing within our database, that particular perspective may have been beneficial to our analytical team. Future work should include representation from a broader range of health professionals in order to better situate clinical reasoning as a potential area for interprofessional or team-based [37] education.

Several areas for consideration and educational development remain. With ‘competence’ as a final goal, explicit identification of a (or perhaps several) conceptualization(s) of clinical reasoning is required in order to describe and develop performance profiles of trainees. Further, these different dominant conceptualizations of clinical reasoning across the Health Professions will - and likely already do - inform the complex context of both Interprofessional Education (IPE) and Interprofessional Collaborative Practice. IPE competencies currently do not explicitly focus on clinical reasoning [38], yet clinical reasoning has been identified as important across Health Professions and thus may be reasonable, or even essential, for IPE to address. However, as terminology or perhaps even conceptualizations of clinical reasoning differ across Health Professions, this may prove challenging as different professions’ educational programs may reflect different understandings, operationalizations, or prioritization of different areas of focus of this multifaceted concept. Our work provides an initial structure to begin to address this complex educational and practice challenge, without proposing an interprofessional unified definition of clinical reasoning relevant to all Health Professions.

Conclusion

The variability in terminology used to describe clinical reasoning across the Health Professions Education literature may lead to unclear communication within the clinical reasoning community, and perhaps difficulty in operationalizing the concept of clinical reasoning for teaching and assessment in the Health Professions. We encourage those involved in the study, teaching, and assessment of clinical reasoning to carefully consider and make explicit their intended understanding of clinical reasoning in order to support better communication, teaching, and assessment of clinical reasoning.

Availability of data and materials

All data and materials are available from the authors upon request. Data collection materials are included in the Appendix, and all other data are drawn from published manuscripts.

Abbreviations

HPE: Health Professions Education

References

  1. Higgs J, Jones MA, Loftus S, Christensen N, editors. Clinical reasoning in the health professions. 3rd ed. Amsterdam: Elsevier Butterworth Heinemann; 2008.

  2. Mattingly C. What is clinical reasoning? Am J Occup Ther. 1991;45:979–86.

  3. Accreditation Council for Graduate Medical Education (ACGME). Common Program Requirements. (http://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/CPRs_07012016.pdf). Revised July 1, 2016. Accessed 10 June 2017.

  4. Royal College of Physicians and Surgeons of Canada. CanMEDS 2015. (http://www.royalcollege.ca/rcsite/canmeds/canmeds-framework-e). Revised 2017. Accessed 10 June 2017.

  5. General Medical Council. Good Medical Practice. (http://www.gmc-uk.org/Good_medical_practice___English_1215.pdf_51527435.pdf). Revised April 29, 2014. Accessed 10 June 2017.

  6. Canadian Association of Occupational Therapists. Profile of Practice of Occupational Therapists in Canada 2012. (https://www.caot.ca/document/3653/2012otprofile.pdf). Published October 2012. Accessed 10 June 2017.

  7. Canadian Alliance of Physiotherapy Regulators & Canadian Physiotherapy Association Competency Profile. Essential competencies for physiotherapist support workers in Canada. Toronto: The Canadian Association of Occupational Therapists; 2002.

  8. Schell BAB. Professional reasoning in practice. In: Crepeau EB, Cohn ES, Schell BAB, editors. Willard and Spackman's occupational therapy. 10th ed. Philadelphia: Lippincott Williams & Wilkins; 2009. p. 314–27.

  9. Pelaccia T, Tardif J, Triby E, Charlin B. An analysis of clinical reasoning through a recent and comprehensive approach: the dual-process theory. Med Educ Online. 2011;16:5890.

  10. Ajjawi R, Higgs J. Using hermeneutic phenomenology to investigate how experienced practitioners learn to communicate clinical reasoning. Qual Rep. 2007;12:612–38.

  11. Sandhu H, Carpenter C, Freeman K, Nabors SG, Olson A. Clinical decisionmaking: opening the black box of cognitive reasoning. Ann Emerg Med. 2006;48:713–9.

  12. Elstein AS, Shulman LS, Sprafka SA. Medical problem solving: an analysis of clinical reasoning. Cambridge: Harvard University Press; 1978. p. 64–6.

  13. Barrows HS, Norman GR, Neufeld VR, Feightner JW. The clinical reasoning of randomly selected physicians in general medical practice. Clin Invest Med. 1982;5:49–55.

  14. Barrows HS, Feltovich PJ. The clinical reasoning process. Med Educ. 1987;21:86–91.

  15. Neufeld VR, Norman GR, Barrows HS, Feightner JW. Clinical problem-solving by medical students: a longitudinal and cross-sectional analysis. Med Educ. 1981;15:315–22.

  16. Schmidt HG, Boshuizen HPA, Hobus PPM. Transitory stages in the development of medical expertise: the ‘intermediate effect’ in clinical case representation studies. In: Proceedings of the 10th annual conference of the Cognitive Science Society. Hillsdale: Erlbaum; 1988. p. 139–45.

  17. Norman GR, Tugwell P, Feightner JW, Muzzin LJ, Jacoby LL. Knowledge and clinical problem-solving. Med Educ. 1985;19:344–56.

  18. Norman GR, Young ME, Brooks LR. Non-analytic models of clinical reasoning: the role of experience. Med Educ. 2007;41:1140–5.

  19. Eva KW. What every teacher needs to know about clinical reasoning. Med Educ. 2005;39:98–106.

  20. Durning SJ, Artino AR Jr, Schuwirth L, van der Vleuten C. Clarifying assumptions to enhance our understanding and assessment of clinical reasoning. Acad Med. 2013;88(4):442–8.

  21. Young ME, Dory V, Lubarsky S, Thomas A. How different theories of clinical reasoning influence teaching and assessment. Acad Med. 2018;93(9):1415.

  22. Reeves S, Perrier L, Goldman J, Freeth D, Zwarenstein M. Interprofessional education: effects on professional practice and healthcare outcomes (update). Cochrane Database Syst Rev. 2013;3:CD002213.

  23. Best Evidence Medical Education Collaborative. https://www.bemecollaboration.org. Accessed 13 Sep 2017.

  24. Thomas A, Lubarsky S, Varpio L, Durning SJ, Young ME. Scoping reviews in health professions education: challenges, considerations and lessons learned about epistemology and methodology. Adv Health Sci Educ. 2019; e-print available.

  25. Thomas A, Lubarsky S, Durning S, Young ME. Knowledge syntheses in medical education: demystifying scoping reviews. Acad Med. 2017;92:161–6.

  26. Arksey H, O'Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. 2005;8:19–32. https://doi.org/10.1080/1364557032000119616.

  27. Daniel M, Rencic J, Durning SJ, Holmboe E, Santen SA, Lang V, et al. Clinical reasoning assessment methods: a scoping review and practical guidance. Acad Med. 2019;94(6):902–12.

  28. Young ME, Thomas A, Lubarsky S, et al. Drawing boundaries: the difficulty in defining clinical reasoning. Acad Med. 2018;93:990–5. https://doi.org/10.1097/ACM.0000000000002142.

  29. Young M, Thomas A, Gordon D, Gruppen L, Lubarsky S, Rencic J, et al. The terminology of clinical reasoning in health professions education: implications and considerations. Med Teach. 2019; e-print July 2019.

  30. Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6:e1000097. https://doi.org/10.1371/journal.pmed.1000097.

  31. Schell BA. Clinical reasoning: the basis of practice. In: Crepeau EB, Cohn ES, Schell BAB, editors. Willard and Spackman's occupational therapy. 10th ed. Philadelphia: Lippincott Williams & Wilkins; 2003. p. 131–9.

  32. Tomlin GS. Scientific reasoning. In: Schell BA, Schell JW, editors. Clinical and professional reasoning in occupational therapy. Philadelphia: Lippincott Williams & Wilkins; 2009. p. 91–125.

  33. Fleming MH. The therapist with the three-track mind. Am J Occup Ther. 1991;45:1007–14.

  34. Hamilton TB. Narrative reasoning. In: Schell BAB, Schell JW, editors. Clinical and professional reasoning in occupational therapy. Philadelphia: Lippincott Williams & Wilkins; 2008.

  35. Young ME. Crystallizations of constructs: lessons learned from a literature review. Perspect Med Educ. 2018;27:1–3.

  36. Eva KW. What’s in a name? Definitional clarity and its unintended consequences. Med Educ. 2016;51:1–2.

  37. Graber ML, Rusz D, Jones ML, et al. The new diagnostic team. Diagnosis. 2017;4:225–38.

  38. Interprofessional Education Collaborative. Core competencies for interprofessional collaborative practice: 2016 update. Washington, DC: Interprofessional Education Collaborative; 2016.


Acknowledgements

The authors would like to thank members of the BEME collaboration for their support. We would like to acknowledge those who participated in screening the articles included in our review, who were not part of the definitional review: Michelle Daniel, Carlos Estrada, Anita Hart, Brian Heist, Valerie Lang, Katherine Picho, Patrick Rendon, Sally Santen, and Dario Torre. We also would like to acknowledge the work of the academic librarians, Nancy Allee, Donna Berryman, and Elizabeth Richardson, who aided in developing the original search strategy.

Previous presentations

Components of this work have been presented at the Canadian Conference on Medical Education (2017), and the Association of Medical Educators of Europe Annual Conference (2017).

Disclaimer

The views expressed herein are those of the authors and not necessarily those of the Department of Defense or other federal agencies.

Funding

This work was partially supported by the Fonds de Recherche du Québec – Santé (FRQ-S) Junior Research Scholar program and by funds provided by the Department of Medicine, McGill University, to M. Young.

Author information


Contributions

Conception and design of the study (MY AT SL SJD), data collection and analysis (MY AT SL DG LG JR TB EH ADS TR LS VD SJD), initial drafting of the manuscript (MY AT SL SD), critical review of the manuscript (MY AT SL DG LG JR TB EH ADS TR LS VD SJD) and all authors provided final approval of the submitted manuscript.

Corresponding author

Correspondence to Meredith E. Young.

Ethics declarations

Ethics approval and consent to participate

Ethics approval was not sought for this work, as the data collected and analyzed were drawn from published literature.

Consent for publication

Not required as all images within the manuscript were created by the research team.

Competing interests

None.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1. Search strategy used for this review, including search terms for three constructs: clinical reasoning, assessment, and education.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Young, M.E., Thomas, A., Lubarsky, S. et al. Mapping clinical reasoning literature across the health professions: a scoping review. BMC Med Educ 20, 107 (2020). https://doi.org/10.1186/s12909-020-02012-9
