
What drives junior doctors to use clinical practice guidelines? A national cross-sectional survey of foundation doctors in England & Wales

  • The Erratum to this article has been published in BMC Medical Education 2016 16:50



Background

Clinical practice guidelines (CPGs) aim to improve patient care, but their use remains variable. We explored attitudes that influence CPG use amongst newly qualified doctors.


Methods

A self-completed, anonymous questionnaire was sent to all Foundation Doctors in England and Wales between December 2012 and May 2013. We included questions designed to measure the 11 domains of the validated Theoretical Domains Framework (TDF). We correlated these responses to questions assessing current and future intention to use CPGs.


Results

A total of 13,138 doctors were invited, of whom 1693 (12.9 %) responded. 1035 (62.5 %) reported regular CPG use, with 575 (34.4 %) applying CPGs 2–3 times per week. A significant minority of 606 (36.6 %) declared an inability to critically appraise evidence.

Despite efforts to design a questionnaire that captured the domains of the TDF, the domain scales created had low internal reliability. Using previously published studies and input from an expert statistical group, an alternative model was sought using exploratory factor analysis. Five alternative domains were identified. These were judged to represent: “confidence”, “familiarity”, “commitment and duty”, “time” and “perceived benefits”.

Using regression analyses, the first three were identified as consistent predictors of both current and future intentions to use CPGs, in decreasing order of strength.


Conclusions

In this large survey of newly qualified doctors, “confidence”, “familiarity” and “commitment and duty” were identified as domains that influence use of CPGs in frontline practice. Additionally, a significant minority were not confident in critically appraising evidence.

Our findings suggest a number of approaches that may be taken to improve junior doctors’ commitment to CPGs through processes that increase their confidence and familiarity in using CPGs.

Despite the limitations of a self-reported survey and potential non-response bias, these findings come from a large, representative sample; on this basis, a review of existing implementation strategies may be warranted.



Background

The aim of evidence-based clinical practice guidelines (CPGs) issued by the National Institute for Health and Care Excellence (NICE) is to improve and standardise quality of care delivered in the National Health Service (NHS) in England and Wales [1]. In the NHS, consultants and general practitioners (GPs) are responsible for leading the interpretation and implementation of NICE guidelines in clinical practice amongst doctors [2]. In line with the General Medical Council (GMC) publication “Tomorrow’s Doctors”, the current standard for United Kingdom (UK) medical education, there is an expectation that UK trained junior doctors both apply clinical practice guidelines and deliver evidence-based care [3].

However, despite this impetus, it is unclear whether there has been much change since the evidence indicating variable application of CPGs amongst senior clinicians [4]. A recent report from the GMC described factors that influence whether clinicians act in accordance with good practice [5]. A national survey of public health directors, from many years ago, noted the variable implementation of NICE CPGs, with their full benefits remaining unrealised [6]. Finally, a comprehensive systematic review of responses from 11,611 clinicians, published 13 years ago, noted that more than a third of clinicians considered CPGs to be impractical, to reduce physician autonomy and to increase the risk of litigation [7]. Anecdotally, these attitudes persist.

These opinions appear to be shared by future doctors: a national survey of medical students in England and Wales identified views such as CPGs having a negative influence on patient choice and decreasing practice autonomy. Furthermore, marked deficits in students' knowledge of CPG development were identified [8]. In addition to the documented variation in the methods and content of evidence-based medicine (EBM) curricula amongst UK medical schools, these perceptions are likely to be influenced by multiple factors such as supervising clinicians’ opinions, media reports and students' understanding of published material [9, 10].

As recent medical school graduates, it is possible that UK Foundation Doctors may share similar views. Recent research has identified national variation in the level of preparedness for clinical practice by Foundation Doctors [11]. It is therefore crucial to understand their use of CPGs and EBM in frontline clinical practice together with identifying barriers and enablers that may influence their implementation.

These questions tie in with a national report from the High Level Working Group on Evidence Based Clinical Effectiveness, which emphasised the need for primary research incorporating behavioural theories, across undergraduate students and practising physicians, to understand the drivers that influence evidence-based practice (EBP) [2]. Building on our earlier national survey of medical students, we sought to investigate the extent of use of CPGs and EBM using the Theoretical Domains Framework (TDF).

The TDF is a validated framework to both assess healthcare professionals’ behaviours and inform interventions to change them [12]. It is commonly used within a four-step approach to designing a theory-informed implementation intervention. These steps consist of: (1) identifying the problem (who needs to do what differently), (2) assessing the problem (using the TDF to identify which barriers and enablers need to be addressed), (3) forming possible solutions (which intervention components could overcome the modifiable barriers and enhance the enablers) and (4) evaluating the selected intervention (how can behaviour change be measured and understood) [13, 14].

For example, to improve adherence to a CPG in acute low back pain management in primary care, the TDF was utilised to identify the barriers and enablers to the uptake of evidence into practice [15]. These were then categorised according to the behavioural domain in which they operated. Intervention components (behaviour change techniques and modes of delivery) were then selected to overcome the modifiable barriers and enhance the enablers. These components were then combined into a cohesive intervention using the UK Medical Research Council (MRC) framework [15].

The TDF’s strengths lie in enabling a systematic approach: ascertaining target behaviours, identifying theoretical domains, matching behaviour change techniques, and finally designing an implementation intervention to address the behaviours [16]. However, it has been criticised for its subjectivity, its focus on behaviour rather than attitudes and the considerable time and resources required for intervention development [13]. Despite these criticisms, there is a rapidly growing evidence base for its use in developing effective theory-informed implementation interventions [12–16].

Our study had three aims. First, we aimed to investigate the use of CPGs and EBM in frontline clinical practice amongst Foundation Doctors across different locations and specialties in England & Wales. Second, using the TDF, we sought to understand which behavioural domains correlated most strongly with current and future intentions to use CPGs [17]. Third, we wanted to test the usefulness of the TDF through our study of self-reported behaviours.


Methods

Survey implementation

We conducted a self-reported, anonymous cross-sectional survey. Participants were Foundation Doctors based in the NHS in England and Wales (see box) working across a range of specialties (total population n = 13,138). The survey was undertaken between December 2012 and May 2013. A third party survey provider (SurveyMonkey) was used to disseminate the survey via a hyperlink to medical workforce deaneries and local education coordinators [18]. Participants were also given the option to complete a paper version of the questionnaire if preferred.

Three methods were utilised to maximise the response rate. First, all 14 regional Medical Workforce Deaneries in England & Wales were asked to forward an invitation e-mail to their doctors. As non-responders could not be identified, a reminder e-mail was sent to all Deaneries a month later to be cascaded. Second, every NHS hospital trust in England and every Health Board in Wales (n = 167 in total) was contacted.

In each body, the postgraduate education programme coordinator was identified and asked to forward the survey invitation to Foundation Doctors employed by their local NHS hospital trust or Board. Third, through the above two invitations, Foundation Doctors (n = 64) were recruited to act as local survey promoters by disseminating questionnaires at local teaching sessions.

Foundation Doctors
In the UK, all newly qualified doctors undertake the Foundation Programme, a mandatory two-year programme of general postgraduate medical training which forms the bridge between medical school and specialist/general practice training [19].
Foundation doctors rotate in six placements, each of four months in duration, through various specialties across a region, and through hospitals ranging from large university to small county hospitals.
Over the programme, they are expected to build on their undergraduate medical education by gaining a breadth of experience on the use of EBM and CPGs in a variety of healthcare settings and to develop the skills necessary to appraise current evidence, in accordance with the UK GMC “Tomorrow’s Doctors” syllabus [3].

Questionnaire development

The questionnaire, which we validated and published in the national medical student survey, was originally based on published aggregated survey responses to CPGs [6, 8]. As the questionnaire did not formally utilise the TDF, the questions asked could only be mapped retrospectively to the TDF behavioural domains “Knowledge” and “Social/Professional Role and Identity”. It was therefore necessary to expand the questionnaire to assess the 11 behavioural domains identified by the TDF (fully listed in Table 3 below) [12].

From our experience in developing the questionnaire utilised in the national medical student survey, five-point Likert scales (responses were: 1 ‘Strongly disagree’, 2 ‘Disagree’, 3 ‘Unsure’, 4 ‘Agree’, and 5 ‘Strongly agree’), together with guidance on their use, were deemed most appropriate for inclusion in the questionnaire.

Questions from a previously published survey were used to assess current frequency of CPG use and future intention to use CPGs, utilising eight- and four-point Likert scales respectively [20]. These were not converted to five-point Likert scales because the instrument had been validated using the existing scales.

Questions were randomised, with approximately half assessing positive views and the corresponding half assessing negative views towards EBM and CPGs. This was undertaken to minimise social desirability bias (i.e. the tendency of survey respondents to answer questions in a manner viewed favourably by others), which has been reported in numerous other studies, including a recent TDF-based study investigating midwives’ roles in smoking cessation during pregnancy [16].

The questionnaire also collected demographic information related to age, foundation year, ethnicity, gender, country, medical school of qualification, place of work, and current and future specialty of choice. Questionnaire development was undertaken in consultation with a health psychologist throughout the entire process.

Questionnaire piloting

Several development iterations were undertaken during two rounds of piloting to produce a questionnaire that contained both a sufficient number of positive and negative items to minimise response bias and items to calculate a Cronbach’s alpha (measure of the internal reliability of a composite measure) for each domain. Respondent fatigue was assessed during each piloting round.

Piloting was undertaken by two groups of 19 Foundation Doctors (38 in total: 23 Foundation Year One, 6 Foundation Year Two and 9 unspecified) drawn from 18 subspecialties across 7 Workforce Deaneries. Cronbach’s alpha was computed for each of the 11 domains.

Weak to moderate reliability scores were identified in the first pilot, so a second pilot was undertaken utilising different questionnaire versions (i.e. positive and negative wording). Cronbach’s alpha values remained low.

Despite the low values, a decision was made to retain the same number of positively and negatively worded questions from the second pilot in the final survey. There was a possibility that the relatively small pilot samples of doctors had been unrepresentative, with literature noting that alpha scores may change when larger samples of participants are utilised [21]. There was also a desire for flexibility, so that the composition of the scales could be adjusted if necessary to raise the alpha levels.

Interim versions consisted of a maximum of 53 questions with 5 questions per domain. The final version consisted of 39 questions covering 11 domains, an average of approximately three questions per domain. The Supplement 1 file contains the final questionnaire used.

Statistical analysis

Online responses, records and transcribed paper questionnaires were converted to International Business Machines (IBM) Statistical Package for Social Sciences (SPSS) Version 21.0 [22] for analysis. Items that had been negatively worded were reverse-coded prior to data analysis.
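On a five-point scale, reverse-coding maps each response x to 6 − x, so that 1 (‘Strongly disagree’) on a negatively worded item scores the same as 5 (‘Strongly agree’) on its positive counterpart. The study used SPSS; the sketch below is a language-neutral illustration in Python, with invented responses:

```python
# Reverse-code negatively worded Likert items.
# On a k-point scale, the mapping is x -> (k + 1) - x.
def reverse_code(responses, scale_max=5):
    """Return reverse-coded scores for a list of Likert responses."""
    return [scale_max + 1 - x for x in responses]

negative_item = [1, 2, 3, 4, 5]     # hypothetical raw responses
print(reverse_code(negative_item))  # [5, 4, 3, 2, 1]
```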

The responses are described through tabulation and proportions. Cronbach’s alpha scores were used to assess the internal reliability of the domains of the original TDF and of the newly derived domains. A protocol deviation was undertaken when satisfactory Cronbach’s alpha values were not attained within the pre-specified TDF domains. In consultation with an expert statistical group, and drawing on previously published studies, an exploratory (varimax) factor analysis was undertaken to derive alternative domains [23].
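For a k-item scale, Cronbach’s alpha is k/(k − 1) · (1 − Σ item variances / variance of the summed scale). A minimal sketch, using invented data rather than the study’s responses:

```python
# Cronbach's alpha from first principles (sample variances throughout).
def cronbach_alpha(items):
    """items: one list of scores per questionnaire item (equal lengths)."""
    k, n = len(items), len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(col[i] for col in items) for i in range(n)]  # scale scores
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Three perfectly consistent items give an alpha of 1.0.
print(cronbach_alpha([[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]]))
```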

Multiple linear regression analysis was undertaken to explore which of the final domains correlated with self-reported behaviour on current frequency of CPG use and future use of CPGs. An analysis was also undertaken of the influence of demographic variables such as age, gender, time in practice, current specialty, intended future specialty and place of medical school graduation.
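The regression step amounts to an ordinary least-squares fit of the domain scores against a CPG-use score. The example below is illustrative only, with simulated data and invented coefficients, not the study’s dataset:

```python
import numpy as np

# Simulated data: five domain scores per respondent, and a CPG-use outcome
# generated from three "active" domains (hypothetical effect sizes).
rng = np.random.default_rng(0)
n = 200
domains = rng.normal(size=(n, 5))
true_beta = np.array([0.6, 0.4, 0.3, 0.0, 0.0])
use = domains @ true_beta + rng.normal(scale=0.5, size=n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), domains])
beta, *_ = np.linalg.lstsq(X, use, rcond=None)
print(np.round(beta[1:], 2))  # estimated coefficients for the five domains
```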


This was an anonymised, self-administered questionnaire study with no features that could identify participants. Ethical approval was granted by the National Research Ethics Service (REF: 04/26/31).


Results

Characteristics of the respondents

The overall response rate was poor at 12.9 % (n = 1693). Of these respondents, 50.6 % (n = 855) were working as Foundation Year One doctors, with most (93 %, n = 1509) having graduated from a UK medical school. To assess the representativeness of the survey respondents against the national cohort of Foundation Doctors, a comparison of current specialties between respondents and national data (UK National Training Survey 2013) [24] was undertaken using chi-squared tests (Table 1). These noted no significant differences amongst Foundation Year One (χ2 = 7.390, df = 8, p = .495) or Foundation Year Two doctors (χ2 = 7.714, df = 11, p = .738).

Table 1 Specialty comparison between survey respondents and national data

Domain creation prior to analysis

Our pre-specified analyses assumed that influences on CPG use amongst respondents would correlate with the TDF behavioural domains. However, relatively low Cronbach’s alphas (Table 2) were obtained for many of the domains (0.195 to 0.713). The TDF did not appear to be a good fit for our survey respondents.

Table 2 Factor loadings from exploratory factor analysis (varimax rotation). Values in bold correspond to the most significant results which were used to group the new domains.

In consultation with the King's College London (KCL) Department of Primary Care and Public Health Sciences Statistical Group, an exploratory factor analysis (varimax rotation) was undertaken to identify response patterns which could be categorised into alternative domains. Table 2 below contains details of factor loadings.

These factor loadings were considered by the research team in conjunction with scree plots. The derived factors were not the result of pre-specified cut-off loadings, but instead were selected on the basis of easily interpretable domains with satisfactory Cronbach alpha values. Five domains were identified and labelled as: “confidence”, “familiarity”, “commitment and duty”, “time” and “perceived benefits”.

The Cronbach’s alpha values for these domains (Table 3) ranged from 0.58 (“confidence”) to 0.79 (“commitment and duty”). Not all domain scales could be created using a mixture of positive and negative items: the first and third domains consisted entirely of positive items, and the fourth domain comprised five negative items and only one positive item. Factor analysis with an oblique rotation was also undertaken; however, this did not identify any interpretable response patterns.
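For readers unfamiliar with the rotation step: varimax is an orthogonal rotation that maximises the variance of the squared loadings, so each item loads strongly on as few factors as possible. A minimal sketch of Kaiser’s algorithm, applied to an invented loading matrix (not the study’s Table 2):

```python
import numpy as np

# Kaiser's varimax rotation (orthogonal), via repeated SVD updates.
def varimax(loadings, max_iter=100, tol=1e-8):
    """Rotate a loading matrix to maximise the variance of squared loadings."""
    p, k = loadings.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        Lr = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (Lr ** 3 - (1.0 / p) * Lr @ np.diag((Lr ** 2).sum(axis=0)))
        )
        R = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):  # converged
            break
        d = d_new
    return loadings @ R

# Two clean factors, mixed by a 45-degree rotation, then recovered.
simple = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                   [0.1, 0.8], [0.0, 0.7], [0.0, 0.9]])
angle = np.pi / 4
mix = np.array([[np.cos(angle), -np.sin(angle)],
                [np.sin(angle),  np.cos(angle)]])
mixed = simple @ mix
rotated = varimax(mixed)  # loadings close to the simple structure again
```

Because the rotation is orthogonal, each item’s communality (its row sum of squared loadings) is unchanged; only how the loading is split across factors changes.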

Table 3 Comparison between TDF and factor analysis-derived domains

In the varimax rotation-derived results, the questions forming each newly derived domain were aggregated through simple addition, producing domain scores that ranged from 7 to 35 in the case of the largest (seven-item) domain. Of the original 39 potential “domain” questions in the questionnaire, 11 were not selected for use in any scale.

Use of CPGs and EBM in frontline clinical practice amongst Foundation Doctors

A majority (62.5 ± 2.2 %, 1035 out of 1656) of respondents reported consistent use of CPGs for most patients. The commonest frequency of CPG use was 2–3 times per week, reported by 34.4 ± 2.1 % (575 out of 1672). Notably, a minority utilised CPGs less than once per week (18.6 ± 1.7 %, 311 out of 1672). A total of 63.4 ± 2.1 % (1050 out of 1656) of doctors reported being able to critically appraise evidence relevant to their clinical practice (i.e. strongly agreed or agreed with the relevant question).

The majority of doctors (76.4 ± 1.9 %, 1280 out of 1676) were influenced by senior advice, with 49.6 ± 2.2 % (831 out of 1675) influenced by handbooks rather than CPGs. 86.0 ± 1.5 % (1437 out of 1670) reported that using CPGs reduced the risk of litigation, whilst 29.4 ± 2.0 % (489 out of 1664) noted that CPG use reduced patient choice. A minority (22.9 ± 1.9 %, 382 out of 1670) believed that CPGs are inapplicable due to the complex circumstances of patients, with 25.5 ± 1.9 % (425 out of 1668) noting that CPGs increased clinical practice complexity.

A proportion of doctors (43.9 ± 2.2 %, 736 out of 1674) reported being scared of colleagues’ reactions if they followed CPGs instead of senior advice; 76.6 ± 1.9 % (1277 out of 1668) stated they would feel guilty if a patient’s care was compromised as a result of CPG non-adherence. Whilst a majority of doctors (86.6 ± 1.5 %, 1448 out of 1673) were committed to EBP, a minority (18.7 ± 1.7 %, 313 out of 1670) found it difficult to practise. An even smaller minority (7.8 ± 1.5 %, 131 out of 1675) stated that senior clinicians discouraged EBP.

Subgroup analyses by demographic variables were undertaken to ascertain the presence of any variation in the domain scores, using t-tests and correlation coefficients. No differences by gender, time in practice (i.e. Foundation Year One or Two), current specialty, intended future specialty or place of medical training were identified. A small but statistically significant association between “confidence” and age (r = .115, p < .0001, n = 1582) was noted; given the number of subgroup analyses undertaken, this is likely a Type 1 error.

Domains that correlated most strongly with CPGs and EBM use

Two multiple regression analyses were subsequently undertaken, with the domains set as independent variables and current and future use of CPGs as dependent variables. The standard regression assumptions were checked.

The findings of the regression analyses are summarised in Tables 4 and 5. These suggest that “confidence”, “familiarity” and “commitment and duty” had the strongest and most consistent associations with both current and future use of CPGs, in decreasing order of strength. “Time” correlated significantly with future use only.

Table 4 Regression of current CPG use variable against the five domains
Table 5 Regression of intended future CPG use variable (three point collapsed version) against the five domains

No association was noted between the “perceived benefits” domain and current use; however, a small but significant negative correlation was noted with future intended use.


Discussion

Principal findings

Firstly, we identified a generally high level of regard for EBP among Foundation Doctors in England and Wales. Both internationally and nationally, this is in keeping with existing research that notes favourable attitudes towards EBP irrespective of specialty and seniority [25]. This is in line with the UK Foundation Programme’s Curriculum on maintaining good medical practice [24].

A majority of doctors reported that CPGs reduced litigation and patient choice. This mirrors findings from our national medical student survey and may similarly reflect deficits in their knowledge of CPG development. A significant minority of doctors report low confidence in critically appraising evidence. This is in spite of the Foundation Programme Curriculum specifying competencies in CPGs and EBM that must be achieved in order to satisfactorily complete the 2-year programme [24].

Interestingly, we identified that respondents reported fear or guilt if CPG use was in conflict with senior advice or when patient care suffered as a result. This stands in contrast to a TDF-based interview study of Foundation Doctors investigating prescribing errors and existing literature on fear of medical litigation due to non-adherence to CPGs [26, 27]. Whilst a systematic review of GP attitudes towards CPGs did identify a fear of misdiagnosis, it was deemed to be unrelated to CPG use (i.e. all the participating physicians indicated that it did not matter whether they followed a CPG as long as they did not miss a diagnosis) [28].

Finally, we identified five behavioural domains (“confidence”, “familiarity”, “commitment and duty”, “time” and “perceived benefits”), of which “confidence”, “familiarity” and “commitment and duty” correlated with both current and future intention to use CPGs, with “time” correlating with future intentions only. Over the past decade, the field of implementation science has sought to explore the influence of behavioural predictors beyond conventional barriers, such as time, in CPG and EBM use [29]. Amongst other frameworks, this led to the development and subsequent validation of the TDF [29]. Ours was therefore a surprising finding.

However, it was in keeping with a systematic review published in 1999, covering 76 articles that included 120 different surveys investigating 293 potential barriers to physician CPG adherence [30]. That review identified poor familiarity, time and confidence as key barriers. Our findings therefore confirm both the importance and the persistence of traditionally researched barriers to improving the uptake of CPGs.

We found that the TDF model was not useful in capturing the behavioural domains that influence implementation among our survey respondents. This was unexpected and stands in contrast to several other cross-sectional and qualitative studies that have successfully used the TDF to assess implementation difficulties [13–16]. However, it was in keeping with a cross-sectional study of dental healthcare providers, which noted similar difficulties in obtaining satisfactory Cronbach’s alpha scores and likewise required exploratory factor analyses [31].


Strengths

To date, this is the largest study to quantify the relative importance of various factors in implementing EBM and CPGs in frontline clinical practice by junior doctors. This follows a qualitative interview study of 22 Foundation Doctors that identified seven behavioural domains from the TDF that could potentially be targeted to reduce prescribing errors [26]. Whilst we were unable to map our survey respondents to the TDF, by utilising the TDF to expand and develop our questionnaire, a wide breadth of factors influencing CPGs beyond traditionally researched barriers such as “knowledge” and “time” were explored [29, 32].


Limitations

Despite an aggressive recruitment strategy of contacting (1) all Workforce Deaneries (n = 14), (2) hospitals (n = 167) in England & Wales and (3) recruiting Foundation Doctors as local champions, the response rate was poor at 12.9 % (n = 1693). This is in keeping with existing research that notes doctors as a group from which it is often difficult to obtain high response rates [33].

Given the low response rate, our study may have received responses from a biased sample of doctors, which limits the generalisability of our findings to the national cohort of Foundation Doctors. In addition, we were unable to assess social desirability bias amongst our survey respondents (i.e. respondents overstating their actual use of CPGs).

Furthermore, owing to the cross-sectional study design, it remains unknown whether respondents’ behavioural structure is stable over time. For example, research has noted less favourable attitudes towards CPGs with increasing seniority of clinical practice and across different specialties [34, 35].

Nevertheless, the large sample size was sufficient to produce relatively accurate whole-population estimates. For example, for the questions on the proportion of respondents who use CPGs, the achieved sample size gives a 95 % confidence interval (CI) of plus or minus 2.2 %. Future studies should consider the use of unconditional incentives to boost response rates [36].
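The quoted precision follows from the normal approximation for a proportion, whose 95 % CI half-width is 1.96·√(p(1 − p)/n). A quick check using the reported CPG-use figures (62.5 %, n = 1656; the exact n behind the ±2.2 % figure is an assumption here):

```python
import math

# 95% CI half-width for a proportion under the normal approximation.
def ci_half_width(p, n, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

hw = ci_half_width(0.625, 1656)
print(f"±{100 * hw:.1f} %")  # close to the reported ±2.2 %
```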

Implications for further research

Whilst the TDF was developed to encompass a broad range of theories and thereby provide a comprehensive framework incorporating the main theoretical explanations for behaviour, it is not exhaustive [12]. Unlike other studies that have successfully utilised the TDF to study the implementation of a defined guideline in a distinct healthcare professional group, we surveyed a large number of doctors from all specialties, each of which has a different approach towards both the development and use of CPGs.

One can therefore postulate that the TDF should not be utilised to investigate collective attitudes through a questionnaire study using closed-questions when a large response spectrum is anticipated. For example, the TDF was successfully used in a semi-structured interview study of 22 Foundation Doctors through use of an open-ended TDF-based topic guide that allowed interviewers to prompt participants to discuss their beliefs using general behavioural descriptions [26].

Another possible explanation is that, while the questions included in the questionnaire were deliberately designed to “evoke” attitudes and feelings on the TDF domains among the participants, they may have been inadvertently worded in a way that stimulated and tapped into alternative attitudinal structures more strongly than anticipated.

From our experience, caution is advised when applying the TDF to a specific healthcare professional group without first attempting to validate the model. Use of a recently published generic questionnaire that can be tailored to suit different targets, actions, contexts, and times of interest with discriminant content validity established is recommended for future research [23].

Implications for practice

In this large survey of newly qualified doctors, “confidence”, “familiarity” and “commitment and duty” were identified as domains that influence use of CPGs in frontline practice. In addition, a significant minority were not confident in critically appraising evidence.

The implications of our research are potentially two-fold, given the possibility that they could inform both undergraduate and postgraduate medical education in the UK. At the level of medical education, our identified behavioural domains could be used as a framework to inform targeting interventions to improve the uptake of CPGs [29].

Despite publication of the GMC’s Tomorrow’s Doctors which mandates training in evidence-based practice and a national NICE education strategy, our findings are in keeping with a 2009 survey of Foundation Doctors' preparedness for clinical practice that reported only 63.4 % being able to critically appraise relevant evidence [11, 37]. There therefore remains a need to provide support and training to those attempting to undertake EBM teaching.

Clinically integrated teaching has been reported to improve knowledge, skills and attitudes towards CPGs and EBM [32]. Training could be nuanced, with a focus on encouraging trainee doctors to have greater confidence in seamless guideline use in routine practice, to be more familiar with the concepts and development of CPGs, and to assert their interest in EBP.


Conclusions

This exploratory survey of a large group of newly qualified doctors offers hypotheses about what influences their implementation of CPGs and EBM. Whilst the TDF has been validated in several healthcare professional groups, its validity should not be assumed for all groups.

In comparison with previous research, generally positive views of CPGs were noted amongst these doctors. Guideline developers, educators and implementers should be encouraged by these findings. However, further scope to improve implementation remains, with a focus on addressing the identified factors as a way forward.



Abbreviations

CPG: clinical practice guidelines

EBM: evidence-based medicine

GMC: General Medical Council

NHS: National Health Service

NICE: National Institute for Health and Care Excellence


References

  1. National Institute for Health and Care Excellence. What We Do. Accessed 6 November 2015.

  2. Tooke J. Report of the High Level Group on Clinical Effectiveness. Department of Health, London, 2007.

  3. General Medical Council. Tomorrow's Doctors: Outcomes and standards for undergraduate medical education. GMC, London, 2009.

  4. Sheldon TA, Cullum N, Dawson D, Lankshear A, Lowson K, Watt I, et al. What's the evidence that NICE guidance has been implemented? Results from a national evaluation using time series analysis, audit of patients' notes, and interviews. BMJ. 2004;329:999.

  5. General Medical Council. Factors that encourage or discourage doctors from acting in accordance with good practice. GMC, London, 2012.

  6. Davies E, Littlejohns P. Views of Directors of Public Health about NICE Appraisal Guidance: results of a postal survey. J Public Health Med. 2002;24(4):319–25.

  7. Farquhar CM, Kofa EW, Slutsky JR. Clinicians' attitudes to clinical practice guidelines: a systematic review. Med J Aust. 2002;177:502–6.

  8. Manikam L, Banerjee J, Blackwell N, Lakhanpaul M. Barriers to Incorporating NICE Clinical Practice Guidelines in Medical Education: The Medical Student's Perspective. Med Sci Educ. 2011;21(4):347–54.

  9. Meats E, Heneghan C, Crilly M, Glasziou P. Evidence-based medicine teaching in UK medical schools. Med Teach. 2009;31(4):332–7.

  10. Sandars J, Siddiqi K, Walsh K, Richardson J, Ibison J, Maxted M. An undergraduate education package on evidence-based medicine: some NICE lessons. Med Educ. 2010;44(5):511–2.

  11. Goldacre MJ, Lambert TW, Svirko E. Foundation doctors' views on whether their medical school prepared them well for work: UK graduates of 2008 and 2009. Postgrad Med J. 2014;90(1060):63–8.

  12. Cane J, O'Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012;7:37.

  13. Huijg JM, Gebhardt WA, Dusseldorp E, Verheijden MW, van der Zouwe N, Middelkoop BJ, et al. Measuring determinants of implementation behavior: psychometric properties of a questionnaire based on the theoretical domains framework. Implement Sci. 2014;9:33.

  14. Islam R, Tinmouth AT, Francis JJ, Brehaut JC, Born J, Stockton C, et al. A cross-country comparison of intensive care physicians' beliefs about their transfusion behaviour: a qualitative study using the Theoretical Domains Framework. Implement Sci. 2012;7:93.

  15. French SD, Green SE, O'Connor DA, McKenzie JE, Francis JJ, Michie S, et al. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the Theoretical Domains Framework. Implement Sci. 2012;7:38.

  16. Beenstock J, Sniehotta FF, White M, Bell R, Milne EM, Araujo-Soares V. What helps and hinders midwives in engaging with pregnant women about stopping smoking? A cross-sectional survey of perceived implementation difficulties among midwives in the North East of England. Implement Sci. 2012;7:36.

  17. Eccles MP, Hrisos S, Francis J, Kaner EF, Dickinson HO, Beyer F, et al. Do self-reported intentions predict clinicians' behaviour: a systematic review. Implement Sci. 2006;1:28.

  18. SurveyMonkey Inc., Palo Alto, California, USA.

  19. The Foundation Programme. Accessed 30 November 2014.

  20. Taylor H. Physicians' use of clinical guidelines – and how to increase it. Harris Interact. 2008;8:4.

  21. Sijtsma K. On the use, the misuse, and the very limited usefulness of Cronbach's Alpha. Psychometrika. 2009;74(1):107–20.

  22. IBM Corp. IBM SPSS Statistics for Windows, Version 21.0. Armonk, NY: IBM Corp; 2012.

  23. Huijg J, Gebhardt W, Crone M, Dusseldorp E, Presseau J. Discriminant content validity of a theoretical domains framework questionnaire for use in implementation research. Implement Sci. 2014;9:11.

  24. UK Foundation Programme Office. UK Foundation Programme Annual Report 2013. UK Foundation Programme Office, Birmingham, 2013.

  25. Dunning J, Prendergast B, Mackway-Jones K. Towards evidence-based medicine in cardiothoracic surgery: best BETS. Interact Cardiovasc Thorac Surg. 2003;2(4):405–9.

  26. Duncan EM, Francis JJ, Johnston M, Davey P, Maxwell S, McKay G, et al. Learning curves, taking instructions, and patient safety: using a theoretical domains framework in an interview study to investigate prescribing errors among trainee doctors. Implement Sci. 2012;7:86.

  27. Mackey TK, Liang BA. The role of practice guidelines in medical malpractice litigation. Virtual Mentor. 2011;13(1):36–41.

  28. Carlsen B, Glenton C, Pope C. Thou shalt versus thou shalt not: a meta-synthesis of GPs' attitudes to clinical practice guidelines. Br J Gen Pract. 2007;57(545):971–8.

  29. Michie S, Johnston M, Abraham C, Lawton R, Parker D, Walker A. Making psychological theory useful for implementing evidence based practice: a consensus approach. Qual Saf Health Care. 2005;14:26–33.

  30. Cabana MD, Rand CS, Powe NR, Wu AW, Wilson MH, Abboud PA, et al. Why don't physicians follow clinical practice guidelines? A framework for improvement. JAMA. 1999;282(15):1458–65.

  31. Amemori M, Michie S, Korhonen T, Murtomaa H, Kinnunen TH. Assessing implementation difficulties in tobacco use prevention and cessation counselling among dental providers. Implement Sci. 2011;6:50.

  32. van Dijk N, Hooft L, Wieringa-de Waard M. What are the barriers to residents' practicing evidence-based medicine? A systematic review. Acad Med. 2010;85(7):1163–70.

  33. James KM, Ziegenfuss JY, Tilburt JC, Harris AM, Beebe TJ. Getting physicians to respond: the impact of incentive type and timing on physician survey response rates. Health Serv Res. 2011;46(1 Pt 1):232–42.

  34. Tunis SR, Hayward RS, Wilson MC, Rubin HR, Bass EB, Johnston M, et al. Internists' attitudes about clinical practice guidelines. Ann Intern Med. 1994;120(11):956–63.

  35. Carlsen B, Bringedal B. Attitudes to clinical guidelines – do GPs differ from other medical doctors? BMJ Qual Saf. 2011;20(2):158–62.

  36. Abdulaziz K, Brehaut J, Taljaard M, Émond M, Sirois MJ, Lee JS, et al. National survey of physicians to determine the effect of unconditional incentives on response rates of physician postal surveys. BMJ Open. 2015;5:e007166.

  37. NICE. An Education Strategy to Support Implementation of NICE Guidance. National Institute for Health and Clinical Excellence, London, 2008.

Acknowledgements


The authors would like to thank Dr. Alison Wright, the King’s College London Department of Primary Care and Public Health Sciences Statistical Group, Dr. Fergus Macbeth, Elaine Collins, Lucy Scarlet, Lucy Connor, Elizabeth White and the rest of the NICE Fellows and Scholars team for their support in undertaking this project. NICE provided administrative support in promoting this survey but was not involved in the study design, data collection, analysis and interpretation, writing of the manuscript and decision to submit the manuscript for publication. No funding was obtained to undertake this study. PL and AK were supported by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care South London at King's College Hospital NHS Foundation Trust. The views expressed are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health.

Author information



Corresponding author

Correspondence to Logan Manikam.

Additional information

Competing interests

Logan Manikam was appointed as a NICE Scholar to conduct this project. He was sponsored to attend the NICE Annual Conferences in 2012 and 2013. He received no financial reimbursement or salary during this honorary appointment.

Andrew Hoy was an analyst employed by NICE during the time of this research.

Peter Littlejohns was formerly the Clinical and Public Health Director of NICE.

Jay Banerjee was formerly a National Collaborating Centre for Women's & Children's Health (NCC-WCH) Co-Director.

Monica Lakhanpaul was formerly a NICE Fellow, NCC-WCH Co-Director and member of the NHS Evidence Advisory Board.

Authors’ contributions

LM, JB, PL and ML conceived the study and participated in its design. HF and MW coordinated the study, and AH and AK performed the statistical analysis and data interpretation. LM and AH co-wrote the manuscript; all authors helped draft, read and approved the final manuscript.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.



Cite this article

Manikam, L., Hoy, A., Fosker, H. et al. What drives junior doctors to use clinical practice guidelines? A national cross-sectional survey of foundation doctors in England & Wales. BMC Med Educ 15, 227 (2015).



Keywords

  • Clinical practice guidelines
  • Junior doctors
  • Evidence-based medicine