- Research article
- Open Access
Examining aptitude and barriers to evidence-based medicine among trainees at an ACGME-I accredited program
BMC Medical Education volume 20, Article number: 414 (2020)
- The Correction to this article has been published in BMC Medical Education 2020 20:478
The aims of Evidence-Based Medicine (EBM) are to promote critical thinking and produce better patient outcomes (Profetto-McGrath J, J Prof Nurs Off J Am Assoc Coll Nurs 21:364-371, 2005). Accreditation Council for Graduate Medical Education (ACGME) competencies require trainees to locate, appraise and apply clinical evidence to patient care. Despite the emphasis that the ACGME places on EBM, few organizations provide adequate training in EBM. This is even more critical in regions where medical trainees matriculate from diverse backgrounds of undergraduate medical education, where EBM may not be emphasized or taught at all. While EBM practice has a long history of research in the West, it has not been widely studied in the Middle East.
Clinicians and trainees at Hamad Medical Corporation (HMC) matriculate from many countries in the Middle East and North Africa (MENA) and Asia. Because trainees in Graduate Medical Education (GME) come to HMC from a variety of geographic backgrounds, it is assumed that they also have a variety of experiences and aptitudes in EBM. To assess trainees' EBM attitudes and knowledge in the Internal Medicine department at HMC in Doha, Qatar, the authors surveyed residents and fellows using a two-part survey. The first part was adapted from the evidence-based practice inventory by Kaper et al. to assess trainees' attitudes and perceptions of EBM. Trainees were also asked to complete the Assessing Competency in Evidence Based Medicine (ACE) tool to evaluate their aptitude in different elements of EBM. The results from the two parts were analyzed.
The average score on the ACE tool among the participants was 8.9 (±1.6). Most participants rated themselves as beginners or intermediate in their EBM capabilities. Higher ACE scores were observed among participants with an educational background from South Asia and among those with more favorable attitudes towards EBM. There was no clear pattern indicating that early incorporation of EBM into practice results in a better ACE score. Participants also reported reasonable abilities in EBM tasks and a favorable work atmosphere for EBM implementation. Lack of knowledge, resources, and time were the most reported barriers to utilizing EBM.
While it is clear that participants are enthusiastic about EBM and see it as a useful method for clinical decision making, their aptitude in EBM is not optimal, and gaps and barriers to their practice of EBM remain.
Evidence-Based Medicine (EBM) is described as the “integration of the best research evidence with our clinical expertise and our patient’s unique values and circumstances.” EBM is an essential component of the clinical decision-making process and of continuing education for clinicians. The core competencies of the Accreditation Council for Graduate Medical Education International (ACGME-I), adopted by many institutions in Qatar and the Middle East region to improve educational outcomes and to matriculate residents to fellowship programs in the United States, require training in EBM skills within the Practice-Based Learning and Improvement competency. Specifically, these requirements state that “Residents must demonstrate the ability to investigate and evaluate their care of patients, to appraise and assimilate scientific evidence, and to continuously improve patient care based on constant self-evaluation and lifelong learning.” Despite EBM’s accepted position in clinical practice, many graduate medical education programs struggle to find methods for instructing trainees and incorporating EBM into the daily routines of learners. Many residency programs utilize journal clubs or workshop formats, but there are not sufficient data to suggest that any one method is superior [4,5,6,7,8].
While EBM is generally established as a component of clinical education in the United States, Canada and Europe, it is more novel to many clinicians in the Middle East and North Africa (MENA) region. Research on EBM practice and utilization in the MENA region is limited [9,10,11,12]. Previous studies focusing on EBM in the region have noted that many clinicians have misconceptions about the fundamentals and applications of EBM. Mortada conducted a study in Egypt in which many clinicians were found to lack EBM proficiency despite asserting that they utilized EBM; many lacked knowledge and aptitude in fundamental EBM concepts. Other studies, in Kuwait by Buabbas et al. and in Saudi Arabia by Baig et al., found that while clinicians have a very positive attitude towards EBM, they have low proficiency in applying elements of EBM to patient care [14, 15]. Al Wahaibi conducted a study in Oman in which many clinicians indicated that barriers to practicing EBM, such as limited access to evidence resources or time constraints, prevented them from fully applying EBM in clinical practice.
The authors are unaware of any studies that have sought to evaluate EBM capabilities of medical trainees in the State of Qatar.
The objectives of this study were to examine trainees’ self-reported background knowledge, attitudes, use and training in EBM, to test their aptitude for EBM using a validated tool, and to look for associations between background variables and aptitude. The results of this study will ultimately help assess potential gaps in EBM training and potential areas for future improvement.
This study was conducted among Internal Medicine (IM) trainees at Hamad Medical Corporation (HMC). HMC is the main healthcare provider in the State of Qatar, comprising 12 hospitals that provide all levels of care. HMC hospitals serve as the main teaching hospitals for undergraduate training, and HMC is the only institution in Qatar that offers graduate medical education. HMC is the main affiliate of Weill Cornell Medicine in Qatar (WCM-Q), where students receive their clinical training and clinical faculty maintain their practices. Many HMC consultants hold affiliate faculty appointments at WCM-Q. HMC received ACGME International (ACGME-I) accreditation in 2011. The Internal Medicine residency program received accreditation in 2013, making it one of the first accredited programs in the Middle East. ACGME-I accreditation is based on the standards for ACGME accreditation of teaching hospitals and medical centers in the United States, requiring base standards for training programs, including medical knowledge, medical skills, communication, practice-based learning, systems-based learning and professionalism. Other residency programs in the Gulf Region have adopted the CanMEDS framework, which was developed by the Royal College in Canada. The four-year residency training requires an internship in addition to the standard enrollment requirements (a standardized exam and interviews). HMC’s IM residency program attracts trainees from the MENA region with diverse educational and cultural experiences.
This cross-sectional study utilized two survey instruments. The first instrument was used to collect participants’ demographics, educational background, utilization of EBM, and attitudes regarding EBM. This instrument was adapted from the evidence-based practice inventory developed by Kaper et al. and adjusted to fit the first aim of this study. Affective elements from Kaper et al.’s survey were incorporated into this study with consideration for keeping the survey brief. Additional survey questions were added to assist in understanding how trainees’ access to, and aptitude with, information resources affected EBM, since this was perceived to be a potentially significant factor in this study. After the authors reached consensus, the survey was piloted on five clinicians who were asked to give feedback on its content and wording. Minor adjustments were then made.
To assess EBM aptitude, the authors reviewed several tools, ultimately selecting the Assessing Competency in Evidence Based Medicine (ACE) tool developed by Ilic et al. for its established validity and ease of administration. The ACE tool consists of a sample scenario, a search, a hypothetical article, and 15 binary (yes/no) questions. Each question is worth one point (maximum 15 points), producing a final quantitative score for the assessment of EBM knowledge and aptitude.
Execution of this cross-sectional study took place between July 2018 and April 2019 at HMC in Doha, Qatar. The study population comprised interns, residents, and fellows within the Internal Medicine department. A pool of 185 potential participants was available for inclusion in this study, representing the total number of IM trainees in the department during the data-collection period. Participation in this study was requested at education activities, via email, and through snowball methods. Participants were asked to complete the survey (26 total questions) addressing their educational background, use, and attitudes regarding EBM. Self-reported comfort levels for each EBM component were obtained using a five-point Likert scale, with 1 representing “least capable” and 5 representing “most capable.” Participants were asked to self-rate their overall EBM ability as beginner, intermediate, or advanced. In addition, self-reported perceptions of institutional attitudes and barriers to implementing EBM were obtained. These data points were also derived from Likert scales, with 1 representing the most negative view and 5 the most positive. Individual attitudes towards implementing EBM in clinical care were also elicited on Likert scales, with 1 indicating the most negative attitude and 5 the most positive. Participants’ preferences for information resources when searching for clinical evidence as part of the EBM process were also gathered; participants could indicate multiple answers on those final questions. Following the survey on attitudes and self-ratings, participants completed the ACE tool to gauge their EBM aptitude. Participants completed the survey and ACE tool through either paper-based or online submissions; the Qualtrics survey tool was used to collect data.
Online submissions were entered directly into the survey tool, while paper-based submissions were subsequently entered into the online survey tool by a member of the research team.
Prior to survey execution, this research, including protocol, survey and recruitment materials, was approved by the institutional review boards of both Weill Cornell Medicine – Qatar and Hamad Medical Corporation. Consent was obtained, either in writing or electronically, from each study participant before initiating the survey and ACE tool.
Demographics and work-related variables are summarized using frequency distributions. Similar summary statistics are used for variables related to EBM, including education, practice, attitudes, and self-perceived ability.
For each participant, the ACE score is computed as the number of correct answers out of the 15 ACE questions. Some participants (n = 10) did not answer the ACE tool in its entirety: 6 participants completed only 14 of 15 questions and 4 participants completed 13 of 15 questions. For those participants, the unanswered questions were counted as wrong answers.
To assess the potential association between the ACE score and other variables in the study, such as demographic variables, self-rated abilities in EBM, and time incorporating EBM into practice, the mean ACE score and standard deviation were computed for each category within those variables.
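The scoring and summary steps described above can be sketched as follows. This is an illustrative Python sketch only (the study used its own tabulation); the answer key, response encoding, and function names are assumptions, with `None` standing in for an unanswered item, which is counted as wrong.

```python
from statistics import mean, stdev

# Hypothetical answer key; the real ACE key varies question by question.
ANSWER_KEY = ["yes"] * 15

def ace_score(responses, key=ANSWER_KEY):
    """Number of correct answers out of 15; unanswered items (None) count as wrong."""
    return sum(1 for given, correct in zip(responses, key)
               if given is not None and given == correct)

def group_summary(scores_by_category):
    """Mean and standard deviation of ACE scores for each category of a variable."""
    return {cat: (round(mean(s), 1), round(stdev(s), 1) if len(s) > 1 else 0.0)
            for cat, s in scores_by_category.items()}

# Example: a participant who answered 14 of 15 questions
resp = ["yes"] * 9 + ["no"] * 5 + [None]
print(ace_score(resp))  # 9: nine correct answers, the missing item scores 0
```

Treating unanswered items as wrong (rather than dropping them) keeps every participant's denominator at 15, so scores remain comparable across the sample.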
A total of 94 trainees submitted a response to the study instrument. However, 14 respondents did not go beyond the demographic questions and were excluded from the analysis, since they contributed no analyzable data. The final sample size included 80 trainees (response rate = 43.2%).
The majority of the participants were age 20–29 (83.8%) and male (65.8%). The majority obtained their medical education in the Middle East Region (60.6%). Interns accounted for 11.3% of participants and fellows accounted for 8.8% (Table 1).
Overall, participants’ scores on the ACE tool ranged from 5.0 to 12.0 (out of 15), with an average of 8.9 ± 1.6 and a median of 9.0. This means that participants were able to correctly answer, on average, only 59.3% of the ACE questions. There were no apparent differences in the ACE scores between the two age groups or between genders. Although interns had the lowest average ACE score of all groups, there was no major pattern indicating, for example, that an increase in work experience might have a positive impact on the ACE score. For example, average ACE scores for PGY 2 (10.0 ± 1.3) and PGY 4 (10.0 ± 1.8) were slightly higher than that of the fellows group (8.7 ± 1.6). Those with a medical educational background from South Asia scored higher on the ACE tool than those graduating from other regions. The biggest difference was between South Asian graduates and those graduating from North Africa, with a difference of 1.7 points out of 15 questions (see Table 1).
Table 2 includes information about participants’ EBM education and its incorporation into practice. Most of the participants learned about EBM in their undergraduate medical education (63.7%), mainly through a mix of face-to-face and online learning modalities or through face-to-face instruction alone (86.3%). The majority started incorporating EBM into their clinical decision-making processes during residency (61.3%). Those who learned about EBM during undergraduate medical education or during residency scored about 1.75 points higher on the ACE tool than those who reported that they had not learned about EBM. There was no clear pattern indicating that early incorporation of EBM into practice results in a better ACE score. For example, those who reported not incorporating EBM into their practice had an average ACE score 1.1 points higher than those who incorporated it after residency. There was no clear indication that the type of instructional setting has a major impact on the ACE score (see Table 2).
The self-reported comfort levels for each EBM component and for overall EBM ability are presented in Table 3. For most categories, participants rated themselves as either 3 or 4 on a scale of 1 to 5, with 1 indicating least capable and 5 indicating most capable. In all of those components, with the exception of applying EBM to a clinical decision, the percentage of participants who indicated that they were most capable did not exceed 12.5%, or 1 in 8. Likewise, the vast majority of the participants rated themselves as beginner or intermediate (89.9%) in their overall EBM abilities (see Table 3).
There was a trend of increasing average ACE scores with increasing self-ratings for applying EBM principles in clinical decision making, translating research evidence to the care of patients, critically appraising research evidence from the literature, and overall ability in EBM. On those questions, the difference between those who reported being least capable (or beginner) and those who reported being most capable (or advanced) ranged between 1.1 and 2.2 points. For the other questions, the increasing trend held except for a decrease in the score for the group who rated themselves as most capable (Table 3).
Participants were asked to rate their perceptions of institutional attitudes and barriers to implementing EBM (Table 4). The majority of participants gave the two highest possible scores (on a Likert scale from 1 to 5) for the questions indicating a general level of encouragement to apply EBM (59.2%), attention to EBM application in clinical decision making (72.6%), a strong level of support from supervisors to apply EBM within clinical decisions (77.2%), and a general atmosphere of frequent discussion of research evidence (57.6%).
When questioned about their attitudes towards implementing EBM in clinical care (Table 5), almost all participants gave the two highest scores for EBM’s usefulness in improving patient outcomes (96.3%), for improving their clinical decisions (93.7%), and for feeling that there is a synergy between EBM and their own clinical experience (87.3%). Finally, most participants identified their view of EBM’s most significant limitation, with 51.2% reporting not knowing how to practice EBM, 36.3% citing lack of available resources, 28.7% identifying time limitations, 16.3% reporting lack of support from colleagues, and 3.8% reporting lack of support from administration. Except for a single instance with one participant, those who reported the two highest levels of positive attitudes towards EBM had, on average, higher scores on the ACE tool. Again, the maximum difference between either of those two latter groups and the other groups did not exceed 2.1 points on the ACE score.
The top four reported resources for searching for clinical evidence as part of the EBM process were PubMed (82.5%), Google (55%), Google Scholar (40%) and Wikipedia (30%). The most reported reasons for selecting the resources of choice were ease of use (82.5%) and availability of articles (52.5%) (see Appendix).
The increasing emphasis on ACGME-I standards in the region warrants more attention to the incorporation of evidence into the clinical decision-making process. Although the IM residency training at HMC encourages EBM through the implementation of regular journal clubs, the diverse and varied educational backgrounds of trainees make it essential to assess trainees’ incoming level of EBM knowledge and tailor EBM instruction to match trainees’ needs. Participants’ knowledge, as assessed by the ACE tool, showed an average score of 8.9 out of 15, indicating that on average participants correctly answered 59.3% of the questions. There were no clear associations between demographic variables and the ACE score, and no clear pattern that early incorporation of EBM into practice results in a better ACE score. The ACE score exhibited increasing trends with some of the variables, especially the self-rated EBM capabilities and positive attitudes toward EBM, but the differences did not exceed 2.1 points (14%) out of 15.
Participants reported reasonable capabilities in practicing EBM, with only 10.1% rating themselves as experienced in EBM. Participants also reported a favorable atmosphere in their work environment for EBM implementation. Lack of knowledge, resources and time were the most reported barriers to practicing EBM. In some instances, the trend of increasing ACE scores with increasing self-rated EBM capabilities or with more favorable attitudes toward EBM was not consistent: we observed a dip in the ACE score among those with the highest self-ratings and the most favorable attitudes. This might be due to selection bias and the low number of participants overall, particularly in some of the categories defined by the self-ratings or attitudes.
In comparison with the Australian study that validated the ACE tool, the average ACE score of trainees in the Qatari sample fell between the means of the participants with novice and intermediate levels of experience, defined by the authors as having 2 and 3 years of EBM training, respectively. In that study of Australian medical trainees, novice and intermediate trainees scored lower (mean scores of 8.6 and 9.5, respectively) than the advanced EBM group (mean = 10.5). This might be due to the fact that about 40% of the participants in Qatar did not formally encounter EBM until residency and thus did not have the three to four years of EBM training that the advanced group in Australia had.
Although it may seem obvious that learning about EBM earlier increases aptitude, there was no solid trend showing that early incorporation of EBM within clinical decision making increases aptitude. This could be due to participants’ diversity in educational backgrounds, the unknown frequency and details of such incorporation, or the potential confounding effect of other variables that the study could not control for due to the small sample size.
This is even more pronounced when comparing the number of participants who incorporated EBM within their clinical decisions during undergraduate education (20%) with those whose first instructional contact with EBM was during graduate education (64%). This gap would seem to indicate that a large proportion of participants were exposed to EBM education in a nominal or uncontextualized way. This is congruent with a systematic review indicating that standalone teaching was not as effective as clinically based teaching in improving residents’ skills, attitudes and behaviors. It is also consistent with the literature, which indicates that, in general, EBM instruction increases knowledge and skills but does not by itself impact physician behavior or clinical practice [21,22,23,24,25].
Most of the participants in this study reported positive to very positive attitudes, both collegially and individually. This is congruent with other studies in the region showing that clinicians generally have a favorable view of EBM [13, 26, 27]. These studies also noted that positive attitudes towards EBM did not necessarily translate into aptitude or knowledge. This was also the case in our study: participants’ attitudes were positive to very positive, but the average score on the ACE tool was 8.9 out of 15. Additionally, the difference between those with the highest positive attitudes and those with lower positive attitudes was minimal. Besides a lack of experience, possible reasons include the need to guide clinicians towards appropriate resources for identifying research evidence and to provide them with protected time to learn. This is evident from participants’ responses to the questions about EBM’s limitations.
Strengths and limitations
This study has several limitations. It is a cross-sectional study conducted at a finite point in time, and as such it lacks the depth that a longitudinal study would afford. In future considerations of this research, it would be helpful to examine not only the EBM aptitude levels of residents at a particular point in time, but also how their knowledge changes over the course of their graduate medical education. Additionally, it would be valuable to study which instructional methods work best at increasing residents’ knowledge and application of EBM within the clinical setting. The small sample size of 80 limited the depth of the analysis, such as assessing the effect of potential confounding variables on the results. Moreover, since participation was voluntary and the response rate was 43.2%, the results of this study should be interpreted with caution and might not be generalizable to the whole population. Finally, since many items were self-reported ratings, attitudes and behaviors, respondents might have interpreted them in different ways, a limitation over which the authors had no control.
However, this study has several strengths. To our knowledge, this is the first study in the State of Qatar, and one of very few in the Middle Eastern / Arab region, to look into the perceptions and attitudes of trainees towards EBM at an ACGME-I program. The use of the validated ACE tool helps ensure that participants’ actual ability to practice EBM was measured accurately. This was evident in the several increasing trends observed in ACE scores for some of the self-reported EBM capabilities.
Although the results of such a study should be interpreted with caution due to the limitations described above, this study still offers an interesting insight into perceptions, attitudes and aptitudes among trainees in the State of Qatar. While it is clear that participants are enthusiastic about EBM and see it as a useful method for clinical decision making, their aptitude in EBM is not optimal, and gaps and barriers to their practice of EBM remain. Since healthcare trainees in Qatar come from diverse cultural and educational backgrounds, assessments of EBM abilities, and support to improve such capabilities, should be in place during their time in Qatar. As such, there should be more emphasis on identifying gaps in individual learners’ knowledge through assessments at entry, and on providing the time and resources to advance them to a standard level. Graduate medical education institutions can play an important role in identifying best practices for educating residents about EBM and can help test such potential interventions.
Availability of data and materials
The datasets generated and/or analyzed during the current study are not publicly available due to institutional guidelines, but are available from the corresponding author on reasonable request.
EBM: Evidence-Based Medicine
ACGME: Accreditation Council for Graduate Medical Education
PGY: Post Graduate Year
HMC: Hamad Medical Corporation
Straus SE. Evidence-based medicine: how to practice and teach it. 4th ed: Elsevier Churchill Livingstone; 2011.
Drummond R. Users’ guides to the medical literature: a manual for evidence-based clinical practice. 2nd ed: McGraw Hill Medical; 2008. Accessed November 14, 2019. http://www.dawsonera.com/depp/reader/protected/external/AbstractView/S9780071590396.
ACGME. Program requirements for graduate medical education in internal medicine; 2020. https://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/140_InternalMedicine_2020.pdf.
Coomarasamy A, Khan KS. What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. BMJ. 2004;329(7473):1017. https://doi.org/10.1136/bmj.329.7473.1017.
Flores-Mateo G, Argimon JM. Evidence based practice in postgraduate healthcare education: a systematic review. BMC Health Serv Res. 2007;7:119. https://doi.org/10.1186/1472-6963-7-119.
Thor J, Olsson D, Nordenström J. The design, fate and impact of a hospital-wide training program in evidence-based medicine for physicians – an observational study. BMC Med Educ. 2016;16. https://doi.org/10.1186/s12909-016-0601-9.
Ahmadi N, McKenzie ME, MacLean A, Brown CJ, Mastracci T, McLeod RS. Teaching evidence based medicine to surgery residents-is journal Club the best format? A systematic review of the literature. J Surg Educ. 2012;69(1):91–100. https://doi.org/10.1016/j.jsurg.2011.07.004.
Bednarczyk J, Pauls M, Fridfinnson J, Weldon E. Characteristics of evidence-based medicine training in Royal College of Physicians and surgeons of Canada emergency medicine residencies - a national survey of program directors. BMC Med Educ. 2014;14:57. https://doi.org/10.1186/1472-6920-14-57.
Al-Jazairi AS, Alharbi R. Assessment of evidence-based practice among hospital pharmacists in Saudi Arabia: attitude, awareness, and practice. Int J Clin Pharm. 2017;39(4):712–21. https://doi.org/10.1007/s11096-017-0430-x.
Alahdab F, Firwana B, Hasan R, et al. Undergraduate medical students’ perceptions, attitudes, and competencies in evidence-based medicine (EBM), and their understanding of EBM reality in Syria. BMC Res Notes. 2012;5:431. https://doi.org/10.1186/1756-0500-5-431.
Albarrak AI, Ali Abbdulrahim SA, Mohammed R. Evaluating factors affecting the implementation of evidence based medicine in primary healthcare centers in Dubai. Saudi Pharm J SPJ Off Publ Saudi Pharm Soc. 2014;22(3):207–12. https://doi.org/10.1016/j.jsps.2013.05.003.
Alshehri AA, Al-Khowailed MS, Alnuaymah FM, et al. Knowledge, attitude, and practice toward evidence-based medicine among hospital physicians in Qassim region, Saudi Arabia. Int J Health Sci. 2018;12(2):9–15.
Mortada EM. Clinicians’ self-perceived competency in evidence-based practice at Zagazig University hospitals, Egypt. J Egypt Public Health Assoc. 2013;88(3):121–9. https://doi.org/10.1097/01.EPX.0000436478.40699.a5.
Buabbas AJ, Alsaleh FM, Al-Shawaf HM, Abdullah A, Almajran A. The readiness of hospital pharmacists in Kuwait to practise evidence-based medicine: a cross-sectional study. BMC Med Inform Decis Mak. 2018;18(1):4. https://doi.org/10.1186/s12911-018-0585-y.
Baig M, Sayedalamin Z, Almouteri O, Algarni M, Allam H. Perceptions, perceived barriers, and practices of physicians’ towards evidence-based medicine. Pak J Med Sci. 2016;32(1):49–54. https://doi.org/10.12669/pjms.321.8841.
Al Wahaibi A, Adawi SA, Shehhi WA, et al. Knowledge and attitudes of Oman medical specialty board residents towards evidence-based medicine. Oman Med J. 2014;29(3):178–84. https://doi.org/10.5001/omj.2014.45.
ACGME International > Home. Accessed October 5, 2020. https://www.acgme-i.org/.
The Royal College of Physicians and Surgeons of Canada: CanMEDS. Accessed October 6, 2020. http://www.royalcollege.ca/rcsite/canmeds-e.
Kaper NM, Swennen MHJ, van Wijk AJ, et al. The “evidence-based practice inventory”: reliability and validity was demonstrated for a novel instrument to identify barriers and facilitators for evidence based practice in health care. J Clin Epidemiol. 2015;68(11):1261–9. https://doi.org/10.1016/j.jclinepi.2015.06.002.
Ilic D, Nordin RB, Glasziou P, Tilson JK, Villanueva E. Development and validation of the ACE tool: assessing medical trainees’ competency in evidence based medicine. BMC Med Educ. 2014;14:114. https://doi.org/10.1186/1472-6920-14-114.
Kortekaas MF, Bartelink MEL, Zuithoff NPA, van der Heijden GJMG, de Wit NJ, Hoes AW. Does integrated training in evidence-based medicine (EBM) in the general practice (GP) specialty training improve EBM behaviour in daily clinical practice? A cluster randomised controlled trial. BMJ Open. 2016;6(9):e010537. https://doi.org/10.1136/bmjopen-2015-010537.
McCluskey A, Lovarini M. Providing education on evidence-based practice improved knowledge but did not change behaviour: a before and after study. BMC Med Educ. 2005;5(1):40. https://doi.org/10.1186/1472-6920-5-40.
Rademaker MM, Smit AL, Kortekaas MF, van Benthem PPG, Stegeman I. Attitude and behaviour of Dutch Otorhinolaryngologists to evidence based medicine. PLoS One. 2019;14(12):e0226743. https://doi.org/10.1371/journal.pone.0226743.
Shuval K, Berkovits E, Netzer D, et al. Evaluating the impact of an evidence-based medicine educational intervention on primary care doctors’ attitudes, knowledge and clinical behaviour: a controlled trial and before and after study. J Eval Clin Pract. 2007;13(4):581–98. https://doi.org/10.1111/j.1365-2753.2007.00859.x.
Simons MR, Zurynski Y, Cullis J, Morgan MK, Davidson A. Does evidence-based medicine training improve doctors’ knowledge, practice and patient outcomes? A systematic review of the evidence. Med Teach. 2019;41(5):532–8.
Abdel-Kareem A, Kabbash I, Saied S, Al-Deeb A. Knowledge, practices and attitudes of physicians towards evidence-based medicine in Egypt. East Mediterr Health J Rev Sante Mediterr Orient Al-Majallah Al-Sihhiyah Li-Sharq Al-Mutawassit. 2019;25(2):82–9. https://doi.org/10.26719/emhj.18.010.
Bahammam MA, Linjawi AI. Knowledge, attitude, and barriers towards the use of evidence based practice among senior dental and medical students in western Saudi Arabia. Saudi Med J. 2014;35(10):1250–6.
The authors would like to acknowledge all participants in this study for their time and for information that will hopefully help improve the practice of evidence-based medicine in Qatar and the region.
Ethics approval and consent to participate
Ethical approval for this study was applied for at and obtained from the IRB of both Weill Cornell Medicine – Qatar (17–00024) and Hamad Medical Corporation (MRC-01-18-025). Informed written or electronic consent was obtained from all participants before completing surveys.
Competing interests
The authors declare that they have no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
The original version of this article was revised: the authors identified an error in the author name of Antoun Kamel. The incorrect author name is: Antoun Kamal. The correct author name is: Antoun Kamel.
Cite this article
Mahmoud, M.A., Laws, S., Kamel, A. et al. Examining aptitude and barriers to evidence-based medicine among trainees at an ACGME-I accredited program. BMC Med Educ 20, 414 (2020). https://doi.org/10.1186/s12909-020-02341-9
- EBM practice
- Trainees’ knowledge and attitude about EBM