Assessing competency in Evidence Based Practice: strengths and limitations of current tools in practice

Abstract

Background

Evidence Based Practice (EBP) involves making clinical decisions informed by the most relevant and valid evidence available. Competence can broadly be defined as a concept that incorporates a variety of domains including knowledge, skills and attitudes. Adopting an evidence-based approach to practice requires differing competencies across various domains including literature searching, critical appraisal and communication. This paper examines the current tools available to assess EBP competence and compares their applicability to existing assessment techniques used in medicine, nursing and health sciences.

Discussion

Only two validated assessment tools have been developed to specifically assess all aspects of EBP competence. Of the two (the Berlin and Fresno tools), only the Fresno tool comprehensively assesses EBP competency across all relevant domains. However, both tools focus on assessing EBP competency in medical students; therefore neither can be used for assessing EBP competency across different health disciplines. The Objective Structured Clinical Exam (OSCE) has been demonstrated to be a reliable and versatile tool for assessing clinical competencies and practical and communication skills. The OSCE has scope as an alternative method for assessing EBP competency, since it combines assessment of cognitive skills including knowledge, reasoning and communication. However, further research is needed to develop the OSCE as a viable method for assessing EBP competency.

Summary

Demonstrating EBP competence is a complex task – therefore no single assessment method can adequately provide all of the necessary data to assess complete EBP competence. There is a need for further research to explore how EBP competence is best assessed; be it in written formats, such as the Fresno tool, or another format, such as the OSCE. Future tools must also incorporate measures of assessing how EBP competence affects clinician behaviour and attitudes as well as clinical outcomes in real-time situations. This research should also be conducted across a variety of health disciplines to best inform practice.

Background

What is competence and why is it important?

Competence can broadly be defined as a concept that incorporates a variety of domains including knowledge, skills and attitudes. [1] Health professionals may demonstrate overall competence in their relevant discipline via a four-step process: (1) knowledge, (2) competence (specific to the task), (3) performance, and (4) action. [2] Apart from knowledge, skills and attitudes, competence also incorporates a health professional's problem solving skills (e.g. ability to think critically and apply clinical reasoning) and ability to work as a team member and communicate effectively, both in written and verbal formats. [3] Assessing competence can focus on any one of these domains.

What is Evidence Based Practice (EBP) competence?

Evidence based practice (EBP) involves making clinical decisions informed by the most relevant and valid evidence available. [4] EBP has been described as the integration of clinical expertise and patient values with the best available research evidence. [4] Clinical expertise draws on the health professional's clinical skills and past experience to identify and treat each patient's individual circumstance. Patient values encompass the personal concerns, expectations, cultural influences and characteristics of individuals during the clinical encounter. The best research evidence draws on the highest quality of clinically related research. The integration of these three elements increases the potential for positive health outcomes.

EBP requires the health professional to apply the best available evidence to assist with their clinical decision making. [4] The practice of EBP consists of the following key steps:

  1. Converting clinical scenarios into a structured, answerable question,

  2. Searching the literature to identify the best available evidence to answer the question,

  3. Critically appraising the evidence for its validity and applicability,

  4. Applying the results of the appraisal into clinical practice, and

  5. Evaluating/assessing the EBP process. [4]

Each step of the EBP process requires a different level of knowledge and skill (i.e. competence).

  • Step 1 requires knowledge to construct a question using the PICO (Patient/Population, Intervention, Comparison, Outcome) mnemonic,

  • Step 2 requires the acquisition and application of literature searching skills across a variety of databases,

  • Step 3 requires a certain level of expertise in epidemiology and biostatistics,

  • Step 4 requires an ability to synthesise and communicate the results to relevant parties (i.e. health professionals, patients), and

  • Step 5 requires the health professional to evaluate the EBP process and assess its impact within the clinical context in which it was implemented. [5]

Has assessment of EBP competence previously been investigated?

A systematic review published in 2001 investigated the effects of teaching EBP skills to health professionals with respect to their EBP competence. [6] It identified one randomised controlled trial (RCT), which concluded that teaching EBP skills in a post-graduate environment increased participants' EBP knowledge and skills. However, a limitation of that RCT was that no validated assessment tool was applied to distinguish the effects of pre/post training in EBP. Rather, participants were asked to complete a 'self-assessment' of their EBP competencies. Self-review is a subjective form of assessment, with participants often factoring in other variables that they perceive may have influenced their performance, thereby skewing the reported performance and outcome. [7] Participants are also prone to recall bias, whereby they may believe that their baseline ability was much poorer than it actually was, thereby inflating their perceived improvement following the training intervention.

Discussion

Few validated assessment tools have been developed to assess EBP competence. Assessment tools used to assess EBP competence have primarily focussed on medical students and graduates. [8] The majority of these assessment tools have been self-reports and learner satisfaction questionnaires – both of which are limited in their use in assessing EBP competence, as previously explained. [9–11] A recent systematic review appraised instruments for evaluating EBP teaching. [12] It identified 104 unique instruments, most of which were administered to medical students and postgraduate trainees. These instruments aim to evaluate the EBP competence of students/trainees, the effectiveness of EBP curricula and student/trainee behaviour. The review identified that the majority of instruments focused on only one aspect of EBP (critical appraisal). The Fresno and Berlin assessment tools were the only instruments to evaluate all steps of the EBP process, with both having measured psychometric properties, objectively measured outcomes and established validity and reliability references for individuals. [12–14]

The Berlin assessment tool

The Berlin assessment tool measures medical professionals' EBP competence (skills and knowledge). [14] It was constructed by a panel of EBP experts and validated in a group of medical health professionals attending a course on EBP. The Berlin tool consists of 15 multiple choice questions (MCQs), which primarily focus on assessing the participant's epidemiological skills and knowledge. The EBP competencies of participants were compared with those of a 'control' group of medical professionals at the conclusion of the course. The Berlin tool was able to reliably distinguish expertise between the two groups. Although the Berlin tool is described as a tool that can assess EBP competence, it only assesses one component of EBP ('Step 3' of the EBP process). It contains no assessment of the other three key steps needed to demonstrate complete competence in EBP. Similarly, it has only been designed to assess EBP competence in medicine – it does not assess EBP competence across other health disciplines (e.g. nursing, allied health). Therefore, the Berlin tool can, at best, truly assess only one component of EBP competence.

The Fresno assessment tool

The Fresno assessment tool also measures medical professionals' EBP competence (skills and knowledge). [13] The Fresno tool consists of two clinical scenarios with open-ended questions. Participants are required to complete the four key steps of the EBP process in order to adequately answer the open-ended questions relating to the clinical scenarios. The Fresno tool has been validated with medical residents and has been shown to have good inter-rater reliability, which is important given that expert knowledge is required to assess the open-ended answers. The Fresno tool is the only standardised, objective measure of EBP competence currently available, since it measures the participant's knowledge and skill across all four key EBP steps. It requires the participant to demonstrate their knowledge, competence, performance and action across all four components to successfully demonstrate EBP competence. [2] Although the Fresno tool assesses complete EBP competence, it is limited in its application as it has only been developed for use in medicine. Therefore, it cannot be used to assess EBP competence in other health disciplines (e.g. nursing, allied health).

What other assessment tools could be used to assess EBP competence?

Written items, such as Extended Matching Questions (EMQs) and MCQs, are best utilised to assess the learner's core clinical knowledge. [15] These assessment tools may be useful in assessing one aspect of EBP competence (e.g. step 3), as is the case with the Berlin tool; however, they are not suitable if competence across all four EBP domains is sought. Although several papers provide MCQs and EMQs to assess EBP knowledge, [16] no literature currently explores the validity of using MCQs or EMQs as an assessment tool for EBP competence. Several self-directed, continuing education exercises, such as the PEARLS (Presentations of Evidence Abstracted from the Research Literature for the Solution of Real Individuals' Clinical Problems) exercise, provide clinicians with a formative method of assessing their EBP competence. [17, 18] These exercises also provide practising clinicians with the opportunity to integrate the principles of EBP into their daily clinical environment. However, little research has been conducted to ascertain the psychometric properties, validity and reliability of such tools.

The Objective Structured Clinical Exam (OSCE) has been demonstrated to be a reliable and versatile tool for assessing students' clinical competencies, practical skills and communication skills. [19–22] Assessing competence in EBP can be difficult due to the various cognitive skills and knowledge that must be demonstrated. However, the OSCE has considerable scope to adequately test student competency, since it simulates 'real-life' situations that the student may encounter in the clinical environment. Recently, several studies have published preliminary results exploring the value of assessing EBP competency via OSCEs. [23–27] All of the studies reported very good construct validity and inter-rater reliability. However, few assessed all four components within the OSCE framework, and all studies were conducted with undergraduate medical students, thereby limiting the generalisability of the results. Additionally, none of the studies incorporated a validated tool for assessing EBP competence; participants were instead assessed according to a pre-determined checklist or on a Likert scale. [23–27]

How is EBP competence assessed across health disciplines?

Due to the lack of data in the current literature, it is not possible to compare how EBP competence is assessed across different health disciplines. The Berlin and Fresno tools have both been validated as tools to assess EBP competence within medicine. Another version of the Fresno tool is currently being developed to assess EBP competence in other health disciplines. However, apart from that development, no other assessment tools, or studies investigating the assessment of EBP competence in disciplines other than medicine, have been published.

It has been little more than a decade since the notion of integrating evidence into clinical decision making was proposed and the field of EBP first developed. [4] The past decade has seen tremendous growth in the field, with institutions such as the Cochrane Collaboration, and methodologies such as the systematic review, now widely accepted. Whilst tremendous effort has been put into developing EBP methodologies and teaching EBP competencies, relatively little research has been performed on evaluating these efforts. [8]

There is a dearth of literature exploring how EBP should best be taught (e.g. lectures, tutorials, case based presentations and journal clubs) – therefore it is difficult to ascertain how EBP competencies should be assessed. Many of the studies published on teaching EBP have focussed on methods to impart new knowledge. In doing so, researchers have struggled to define the specific changes they wish to achieve in implementing their EBP teaching interventions (e.g. knowledge, behaviour, or both). Until recently, these studies have relied on self-reports and ad-hoc evaluations, rather than validated assessment tools such as the Berlin and Fresno tools, to assess EBP-related competencies. Even with the advent of the Berlin and Fresno tools, no new studies, apart from the original papers, have adopted them for assessing EBP competency.

Avenues for future research

Further research needs to be conducted across a variety of areas to comprehensively explore the assessment of EBP competence. Further development of specific assessment tools, such as the Fresno tool, is required so that they can be applied across various health disciplines. Developing an OSCE version of the Fresno tool would further enhance its ability to assess participants' communication skills in EBP. An OSCE version of this tool would provide greater scope to assess specific EBP competencies (such as searching the literature online) in a restricted, timed environment that mimics the real-time situations that most clinicians will experience.

Impact on clinical behaviour and outcomes is an important measure of EBP. The Fresno tool assesses all four domains of EBP; however, neither the Fresno nor the Berlin tool assesses the fifth element of EBP – evaluating/assessing the effectiveness of the EBP process. One method proposed for evaluating the EBP process is conducting an audit of clinical processes and outcomes. Such a process would entail comparing actual practice, as a result of adopting an EBP approach, to a standard of practice. [28] A simple method of conducting such an audit may include incorporating an activity diary to document activities directly related to EBP, such as online searching or critical appraisal. [29] Few current instruments assess changes in behaviour and attitudes in great depth, and none exhibits acceptable levels of validity. [12] The use of such diaries should also be explored as a method for evaluating any changes in attitude and/or behaviour directly related to EBP.

Whilst achieving a high level of EBP competence might be desirable for many health professionals, others might prefer to achieve a high level in only certain domains of EBP. This divergence in needs has lent support to adopting a framework for evaluating teaching methods for EBP. [30] Such a framework considers evaluating EBP competence according to the needs of the learner. A busy clinician may only wish to utilise pre-appraised information, hence it may be appropriate not to evaluate step 3 of the EBP process. Further research is needed to identify whether the Fresno and Berlin tools can be modified to integrate other essential aspects of evaluation, such as clinical audit and EBP competence according to need. It is also necessary to explore how pragmatic such an integrated approach would be across several health disciplines.

Summary

  • There is currently a dearth of evidence exploring the best methods of assessing EBP competence.

  • The Fresno tool is currently the most appropriate tool for assessing EBP competence.

  • Further development of the Fresno tool is needed to accommodate assessment of EBP competence across a variety of health disciplines.

  • Demonstrating EBP competence is a complex task – no single assessment method can adequately provide all of the necessary data to assess complete EBP competence.

  • Future tools must incorporate measures of assessing how EBP competence affects clinician behaviour and attitudes as well as clinical outcomes in real-time situations.

About the Author

DI is a Senior Lecturer in Evidence Based Clinical Practice at the School of Public Health & Preventive Medicine at Monash University. He co-ordinates the teaching of EBP at undergraduate and graduate levels.

References

  1. Holmboe E, Hawkins R: Methods for Evaluating the Clinical Competence of Residents in Internal Medicine: A Review. Annals of Internal Medicine. 1998, 129: 42-48.

  2. Miller G: The assessment of clinical skills/competence/performance. Academic Medicine. 1990, 65: S63-67. 10.1097/00001888-199009000-00045.

  3. McAllister M: Competency standards: clarifying the issues. Contemporary Nurse. 1998, 7: 131-137.

  4. Sackett D, Straus S, Richardson W, Rosenberg W, Haynes RB: Evidence-based medicine. 2000, Edinburgh: Churchill Livingstone

  5. Rosenberg W, Donald A: Evidence based medicine: an approach to clinical problem-solving. BMJ. 1995, 310: 1122-1126.

  6. Parkes J, Hyde C, Deeks J, Milne R: Teaching critical appraisal skills in health care settings. Cochrane Database of Systematic Reviews. 2001, 3: CD001270.

  7. Loza W, Green K: The self-appraisal questionnaire a self-report measure for predicting recidivism versus clinician-administered measures: A 5-year follow-up study. J Interpers Violence. 2003, 18 (7): 781-97. 10.1177/0886260503253240.

  8. Hatala R, Guyatt G: Evaluating the teaching of evidence-based medicine. JAMA. 2002, 288: 1110-1113. 10.1001/jama.288.9.1110.

  9. Green M: Graduate medical education training in clinical epidemiology, critical appraisal, and evidence-based medicine: a critical review of curricula. Academic Medicine. 1999, 74: 686-694. 10.1097/00001888-199906000-00017.

  10. Taylor R, Reeves B, Ewings P: A systematic review of the effectiveness of critical appraisal skills training for clinicians. Medical Education. 2000, 34: 120-125. 10.1046/j.1365-2923.2000.00574.x.

  11. Norman G, Shannon S: Effectiveness of instruction in critical appraisal (evidence based medicine) skills: a critical appraisal. Canadian Medical Association Journal. 1998, 158: 177-181.

  12. Shaneyfelt T, Baum K, Bell D, Feldstein D, Houston T, Kaatz S, Whelan C, Green M: Instruments for evaluating education in evidence-based practice. JAMA. 2006, 296: 1116-1127. 10.1001/jama.296.9.1116.

  13. Ramos K, Schafer S, Tracz S: Validation of the Fresno test of competence in evidence based medicine. BMJ. 2003, 326: 319-321. 10.1136/bmj.326.7384.319.

  14. Fritsche L, Greenhalgh T, Falck-Ytter Y, Neumayer H, Kunz R: Do short courses in evidence based medicine improve knowledge and skills? Validation of Berlin questionnaire and before and after study of courses in evidence based medicine. BMJ. 2002, 325: 1338-1341. 10.1136/bmj.325.7376.1338.

  15. Wood E: What are extended matching questions?. Bioscience Education Electronic Journal. 2003, 1: 1-8.

  16. Leung W: Multiple choice questions in evidence based medicine. Postgrad Med J. 2000, 76 (899): 594-5. 10.1136/pmj.76.899.594.

  17. The College of Family Physicians of Canada: Pearls. [http://www.cfpc.ca/English/cfpc/cme/pearls/default.asp?s=1]

  18. University of Sydney (Sydney Medical Program): EBM PEARLS. [http://www.gmp.usyd.edu.au]

  19. Anderson M, Stickley T: Finding reality: the use of objective structured clinical examination (OSCE) in the assessment of mental health nursing students' interpersonal skills. Nurse Education in Practice. 2002, 2: 160-168. 10.1054/nepr.2002.0067.

  20. Jolly B, Grant J: The good assessment guide: a practical guide to assessment and appraisal for higher specialist training. 1997, London: Joint Centre for Education in Medicine

  21. Wallace J, Rao R, Haslam R: Simulated patients and objective structured clinical examinations: review of their use in medical education. Advances in Psychiatric Treatment. 2002, 8: 342-350. 10.1192/apt.8.5.342.

  22. Sanson-Fisher R, Poole A: Simulated patients and the assessment of medical students' interpersonal skills. Medical Education. 1980, 14: 249-253. 10.1111/j.1365-2923.1980.tb02269.x.

  23. Frohna JG, Gruppen LD, Fliegel JE, Mangrulkar RS: Development of an evaluation of medical student competence in evidence-based medicine using a computer-based OSCE station. Teach Learn Med. 2006, 18 (3): 267-72. 10.1207/s15328015tlm1803_13.

  24. Burrows SC, Tylman V: Evaluating medical student searches of MEDLINE for evidence-based information: process and application of results. Bulletin of the Medical Library Association. 1999, 87: 471-476.

  25. Bradley P, Humphris G: Assessing the ability of medical students to apply evidence in practice: the potential of the OSCE. Medical Education. 1999, 33: 815-817. 10.1046/j.1365-2923.1999.00466.x.

  26. Fliegel JE, Frohna JG, Mangrulkar RS: A computer-based OSCE station to measure competence in evidence-based medicine skills in medical students. Academic Medicine. 2002, 77: 1157-1158. 10.1097/00001888-200211000-00022.

  27. Tudiver F, Rose D, Banks B, Pfortmiller D: Reliability and validity of testing an evidence-based medicine OSCE station. Family Medicine. 2009, 41: 89-91.

  28. Seddon M, Buchanan J, EPIQ group: Quality improvement in New Zealand healthcare. Part 3: achieving effective care through clinical audit. The New Zealand Medical Journal. 2006, 119: U2108.

  29. McCluskey A, Lovarini M: Providing education on evidence-based practice improved knowledge but did not change behaviour: a before and after study. BMC Medical Education. 2005, 5: 40. 10.1186/1472-6920-5-40.

  30. Straus S, Green M, Bell D, Badgett R, Davis D, Gerrity D, Ortiz E, Shaneyfelt T, Whelan C, Mangrulkar R: Evaluating the teaching of evidence based medicine: conceptual framework. BMJ. 2004, 329: 1029-1032. 10.1136/bmj.329.7473.1029.

Author information

Corresponding author

Correspondence to Dragan Ilic.

Additional information

Competing interests

The author declares that they have no competing interests.

Authors' contributions

DI conceived and drafted the manuscript.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite this article

Ilic, D. Assessing competency in Evidence Based Practice: strengths and limitations of current tools in practice. BMC Med Educ 9, 53 (2009). https://doi.org/10.1186/1472-6920-9-53
