Implementing an initiative to promote evidence-informed practice: part 2—healthcare professionals’ perspectives of the evidence rounds programme
BMC Medical Education, volume 19, Article number: 75 (2019)
The translation of research into clinical practice is a key component of evidence-informed decision making. We implemented a multi-component dissemination and implementation strategy for healthcare professionals (HCPs) called Evidence Rounds. We report the findings of focus groups and interviews with HCPs to explore their perceptions of Evidence Rounds and help inform the implementation of future similar initiatives. This is the second paper in a two-part series.
We employed total population, purposive sampling by targeting all of the health care professionals who attended or presented at group sessions exploring the evidence on clinical questions or topics chosen and presented by the HCPs. We conducted and audio-recorded in-person focus groups and one-to-one interviews, which were then transcribed verbatim. Two authors independently coded transcripts. NVivo software was used to collate the primary data and codes. We analysed data guided by the five steps involved in framework analysis: 1) familiarization; 2) identifying a thematic framework; 3) indexing; 4) charting; 5) mapping and interpretation.
Thirteen HCPs participated, of whom 6 were medical doctors and 7 were nursing or midwifery staff. We identified the following key domains: organisational readiness for change, barriers and facilitators to attendance, barriers and facilitators to presenting, communication and dissemination of information, and sustainability. During focus groups and interviews, HCPs reported that Evidence Rounds had a positive impact on their continuing education and clinical practice. They also provided insights into how future initiatives could be optimised to support and enable them to narrow the gap between research evidence and practice.
Individual, departmental and organisational level contextual factors can play a major role in implementation within complex health services. HCPs highlighted how, in combination with clinical guideline development, implementation of evidence could be increased. Further research after a longer period of implementation could investigate how initiatives might be optimised to promote the uptake of evidence, improve implementation and expedite behaviour change.
Background
Evidence-informed decision making is fundamental to the provision of healthcare and central to this is the translation of research evidence into clinical practice. The use of the term “evidence-informed” highlights the need to acknowledge and address contextual influences and consider how the best available evidence can be used in specific circumstances.
There is a need to improve translation of research evidence into practice. The ever-growing volume of research publications [3, 4], the complex nature of research, gaps in skills such as knowledge of how to interpret statistical information, publication bias, and nonlinear, non-rational processes in decision making are just some of the potential barriers to translating evidence into practice. Research is growing in fields that attempt to tackle and narrow the gap between knowledge and action, such as knowledge translation (KT), dissemination and implementation science, knowledge mobilisation and knowledge brokering.
KT strategies can employ single or multiple components such as professional educational meetings (e.g. journal clubs), educational materials, educational outreach visits, knowledge brokers, and audit and feedback. A limitation of the traditional educational model of journal club is that its primary focus is on the critical appraisal of a single source. A Cochrane systematic review of 81 trials involving nearly 11,000 healthcare professionals found that standalone continuing education meetings and those with additional components can lead to small improvements in patient care and clinical practice, with the exception of very complex behaviours. In a systematic review by Giguère and colleagues, printed educational materials appeared to positively affect professional practice outcomes. However, it was not possible to measure the size of the effect in relation to patient outcomes. In another systematic review, there was a lack of evidence to assess the effectiveness of knowledge brokers. A Cochrane review reported that small but important changes to clinical practice can result from audit and feedback. A Cochrane systematic review by Forsetlund and colleagues found moderate quality evidence that, for HCPs working in primary and secondary healthcare settings, higher attendance at educational meetings was effective at increasing compliance with a target practice. Interestingly, they found decreased effectiveness for outcomes with a lower level of severity and no evidence of effectiveness for complex behaviours. They recommended the use of strategies to increase attendance, although they did not specify the necessary components of these strategies.
While outcome evaluations tell us whether an implementation programme does or does not work, they can ignore confounding contextual factors and fail to tell us more about why, how or under what circumstances a programme does or does not work. In a given population, there needs to be an understanding of barriers and facilitators to evidence-based practice. To address these issues, it is necessary to examine the process and context. Translation of knowledge is a context-dependent process, contingent on many factors, and takes place in complex healthcare systems.
While a number of definitions for the term dissemination have been suggested, in this paper we define it as “an active approach of spreading evidence-based interventions to the target audience via determined channels using planned strategies”. McCormack (2013) identified the broad goals of dissemination to clinicians as a) increasing the reach of the evidence, b) increasing the motivation to utilise and apply evidence and c) increasing the ability to utilise and apply evidence.
We utilised a multi-faceted KT strategy to actively disseminate evidence to healthcare professionals and promote evidence-informed practice, including implementation of evidence where appropriate. Our initiative, called Evidence Rounds, took place over nine months from July 2016 to March 2017 and featured three core components: 1) six educational group sessions examining the evidence on clinical topics or questions chosen by our target audience, 2) support from a KT professional and 3) the use of multiple modes of delivery to communicate and disseminate information, including a dedicated website. For each session, three HCPs presented the evidence on a single topic that was agreed upon by the larger group. It was not mandatory for staff to give presentations; rather, they were invited to volunteer to present. In some cases, individuals were recommended by their peers and invited by the implementation team to present. After the presentations, a discussion forum would take place where the applicability of the evidence was explored and, if appropriate, potential resulting actions were identified and discussed by attendees. For example, staff identified a gap in their knowledge relating to evidence around antenatal steroid use for preterm deliveries at less than 37 weeks gestational age, so this topic was chosen as the focus of an Evidence Rounds educational session. As a result of the session, awareness of the evidence increased and it was deemed appropriate to implement the evidence. Further discussion at a multidisciplinary team meeting and other meetings contributed to the process of implementation. The local guideline on preterm premature rupture of the membranes (PPROM) was updated to recommend that antenatal steroids be considered by the consultant dealing with the patient when there is a risk of preterm birth at a gestational age of 23 weeks + 0 days to 23 weeks + 6 days (previously 24 weeks + 0 days).
Therefore, this Evidence Rounds educational session led to a guideline update and a change in practice. Some of the core elements of Evidence Rounds were based on Evidence in Practice Groups established by Jacqui LeMay and run by the Clinical Evidence Based Information Service (CEBIS) at University Hospitals Coventry and Warwickshire NHS Trust. We used collaborative processes to design and develop the initiative and actively sought feedback from key stakeholders (HCPs) throughout these phases. By doing this, we could adjust components to better suit the local context and meet the needs and preferences of our target audience. A more comprehensive description of Evidence Rounds and its process of implementation is available in paper 1 of this two-part series.
The objectives of this study were to use focus groups and interviews: a) to identify HCP-reported barriers and facilitators to attending and presenting at educational group sessions, b) to explore HCPs’ views of Evidence Rounds, particularly as a dissemination strategy, and c) to generate insights to improve the sustainability of future initiatives, because evidence about the sustainability of KT interventions is still lacking [22, 23].
Methods
Study design, setting and participants
We utilised a qualitative study design, which can provide valuable insights into contextual factors and intervention features that influence the success of KT interventions. We used total population, purposive sampling and invited all healthcare professionals working in the maternity unit of one urban hospital in Ireland who attended or presented at at least one Evidence Rounds educational session to participate. We excluded students on placement and other attendees who were not employed as health care staff at the hospital, because the primary target audience of the initiative was HCPs who attended and presented at group sessions and we were specifically interested in learning more about their perceptions. We did not prespecify a target sample size before recruitment because we expected it to depend on attendance levels, availability and willingness to participate in the study, as well as other potential factors. We decided that it was not appropriate to use other studies to provide the required estimates in our population. Nevertheless, focus groups were expected to consist of 4 to 8 participants each. No more than 10 individuals would be interviewed in a one-to-one format. If more than 10 individuals were to volunteer, selection would be prioritised using the following criteria: a) first priority would be given to any attendee type who was under-represented in the focus groups and b) second priority would be given to attendees who volunteered on a first come, first served basis.
Focus groups and interviews
We gave potential participants the choice to take part in either a focus group or an interview according to their individual preferences. We displayed posters in areas frequently accessed by our target population. To enhance recruitment, we entered each participant into a draw to win a voucher for a local restaurant. We developed an interview guide around the aims of the study (Additional file 1). We asked participants about the determinants impacting their choice to attend or present at our group sessions and how our initiative performed in relation to the goals identified by McCormack (2013). We questioned them about the sustainability of Evidence Rounds and the factors that might increase the likelihood of sustainability for other initiatives. We asked participants questions about specific components and modes of delivery to find out what worked and did not work for them. Our study was granted ethical approval by the Galway University Hospitals Clinical Research Ethics Committee (CREC). During recruitment, we distributed informed consent packs incorporating a participant information leaflet and consent form (Additional file 2), which all participants read and signed before taking part. We changed potential identifiers to protect the anonymity of our participants.
Data collection and analysis
We audio-recorded interviews and one author moderated all focus groups and interviews for consistency. Audio files were transcribed verbatim by a professional from a transcription service who had signed a confidentiality agreement. We chose to analyse the data using Ritchie and Spencer’s framework analysis, which can be used in applied qualitative research. Our decision was based on its suitability for dealing with focus group and interview data and its focus on prospective actionable outcomes. We utilised an iterative rather than a linear process to complete the five components of this method of analysis:
Familiarization
AC, who had been present at all recordings, re-listened and, where appropriate, made corrections and, where possible, filled in sections of speech that were inaudible to the transcriber. Two authors (AC and MD) listened to the audio files while reading the corrected transcripts. AC reviewed the observational notes collected by the assistant moderator during the focus groups. Two authors (AC and MD) independently coded the transcripts and noted key points, repeated themes and issues considered important by participants.
Identifying a thematic framework
We began to create a thematic framework drawing from a list of 54 a priori key issues deemed relevant to our study aims and 21 additional emergent issues based on participant responses, and began to connect and look for patterns in participant responses to form analytical themes. The thematic framework went through several iterations.
Indexing
We uploaded the transcripts to NVivo Version 11 and systematically applied the thematic framework by assigning nodes and sub-nodes to text within each transcript. As is common in framework analysis papers, some text was coded into multiple nodes, some nodes were merged and, throughout this stage, we made further refinements to the framework.
Charting
We reviewed the data and made a decision to chart by core themes rather than cases. We created five tables each with a unique domain and used themes, sub-themes and illustrative quotes that demonstrated the range of participant responses. All authors reviewed the tables and made revisions to improve the presentation of data.
Mapping and interpretation
We referred to the main aims of the study and reviewed the tables. We considered the nature and range of participant perspectives. Using this method, it was possible to extract key dimensions of the barriers and facilitators to attending and presenting at Evidence Rounds, participants’ perspectives of our dissemination strategy, and their suggestions to make future initiatives more sustainable.
Results
Thirteen HCPs participated: eight in three focus groups (of between two and four participants each) and five in one-to-one interviews. Six medical doctors and seven nursing or midwifery staff participated, of whom four were male and nine were female. Our data analysis revealed five core domains regarding HCPs’ perspectives of Evidence Rounds: (1) barriers and facilitators to attending; (2) barriers and facilitators to presenting; (3) organisational readiness for change; (4) communication and dissemination of information; and (5) sustainability.
Barriers and facilitators to attendance
This domain included three themes, namely: departmental context and resources, social context, and individual-level factors. Our study demonstrated that attendance levels at Evidence Rounds were determined by the availability and workloads of staff, the organisational climate, the attendance of others (colleagues and senior-level staff), level of interest in the topic and extrinsic factors such as certificates of attendance and continuing education units from a professional body. HCPs who had control over the timing of their daily activities experienced fewer scheduling-related restrictions than those providing front-line care on hospital wards. Lunchtime was identified as the time most likely to suit the majority of people. The provision of food and beverages was a facilitator to attendance, especially for HCPs who would not get another opportunity to eat during their work shift. Keeping sessions within the advertised timeframe was appreciated by busy HCPs. A number of staff came into work on their days off to attend Evidence Rounds. Some line managers agreed to allow time in lieu for these staff. However, this was not offered to all employees and, in general, being off duty was a barrier to attendance. We also identified a previously unknown scheduling conflict with a lunchtime meeting for obstetric staff, which may explain why there were fewer attendees from this department. Busy workloads and inadequate staffing levels were barriers to HCPs attending sessions. Understandably, clinical care took priority, and staff reported that some colleagues had trouble attending even mandatory training sessions (attendance at Evidence Rounds was voluntary).
All staff viewed the interprofessional and multiple disciplinary nature of Evidence Rounds as a facilitator to their attendance. Teamwork and the breaking down of professional silos were among the positive effects they saw from this approach. Consultant attendance and management support for Evidence Rounds were mentioned repeatedly as having a positive effect on non-consultant hospital doctor (NCHD), nursing and midwifery staff attendance. Senior staff acknowledged that their attendance set an example for junior staff. Some HCPs were motivated by a self-perceived benefit to attending, e.g. obtaining professional credits for attendance, certificates of attendance or participation, claiming back time spent or enjoying a free lunch (Table 1).
Barriers and facilitators to presenting
This domain included two themes: individual-level factors, and departmental context and resources. The perceived benefit of taking part and having an interest in the topic or format facilitated presenting at Evidence Rounds. Presenting was considered a more active way to engage with the literature. Some participants had a long-standing interest in their topic and viewed Evidence Rounds as a platform to promote discussion with colleagues. One participant took part because they wanted to experience giving a presentation in an alternative format to a journal club. Another participant shared that recruiting presenters was a continuing problem.
Staffing issues also influenced decisions to present at Evidence Rounds. Even though evidence was presented by three HCPs per session, a lot of preparatory work was required from each individual. Another important finding was that some staff were motivated by a strong interest in the topic, a need to set an example for less experienced HCPs or the desire to experience presenting in this unique initiative. Our study found multi-dimensional factors that can be both barriers and enablers to different individuals, at different times and under different circumstances. For example, the level of self-confidence in presenting in front of others could either encourage or discourage potential presenters from taking part.
Health care professionals who saw themselves or others as being deficient in knowledge, skills or education, or those without a research background, identified this as a barrier to presenting. Some participants took part to motivate others and to learn the process so that they could assist future presenters. Others presented because they were well-versed in the topic and felt confident presenting. One participant mentioned their fear of being asked difficult questions by attendees but chose to present regardless.
The structure of Evidence Rounds whereby three HCPs presented at each session was encouraging for some staff. Some topics can cause information overload if there is a lot of published evidence so sharing the literature and the workload with colleagues helped to minimise any negative impact on work-life balance. Nevertheless, some HCPs viewed their busy schedules and the extra work associated with presenting as barriers.
The transience of junior medical staff was identified as a barrier because they rotated to different hospital departments or hospitals every 6 months. They were perceived as being less willing to take part because they would be moved soon afterwards. Support from line managers, i.e. protected time to prepare for their presentation, was identified as a determinant that would encourage staff to present (Table 2).
Organisational readiness for change
This domain included two themes: acceptability and appropriateness, and pushing and changing slowly. All participants viewed Evidence Rounds as having a positive impact on their practice and education. The initiative highlighted the need to improve communication with colleagues in relation to approaches to care delivery. Evidence Rounds helped to ensure practice was based on research evidence as well as their own clinical experience and that of their colleagues. The initiative was acknowledged as having a wider scope, a decreased risk of bias and more applicability to decision-making for clinical care than traditional journal clubs. Participants welcomed the opportunity for interprofessional collaboration across multiple professions and disciplines and saw this as a means to network and discuss key issues with colleagues they might work with infrequently. There was recognition that getting together for Evidence Rounds sessions helped to foster links between the midwifery, obstetric and neonatal departments.
Most participants acknowledged that key research findings highlighted as actions from Evidence Rounds were slow to be implemented, although some recommendations had been implemented in practice. Bridging the gap between research and practice is often contingent on additional steps. Evidence Rounds was seen as a platform to begin a conversation and start to plan the formal process of updating or creating new guidelines so that there could be a widespread change in practice. One participant noted that having the relevant guideline developer in attendance would increase the likelihood of getting the evidence into practice (Table 3).
Communication and dissemination of information
This domain included two themes: modes of delivery, and communication and dissemination strategy considerations. We asked participants questions to gain insight into their preferred modes of delivery when receiving communications and disseminated information. One important finding is that participant feedback did not identify a single mode of delivery of information that would have engaged all staff. Therefore, our decision to employ a multi-component strategy was appropriate for our target population, despite a lack of evidence that this is the most effective approach [27, 28]. HCPs agreed that posters displayed in appropriate hospital areas were effective at drawing attention to upcoming sessions. The use of email to communicate information about Evidence Rounds elicited diverse responses from participants. For individuals who spent at least part of their working day with access to a computer or mobile phone and had a work email address, this was a convenient way to receive information. However, it was not an effective way to reach others, such as staff midwives who were more clinically based and either were not issued with, or did not regularly access, their professional email accounts. Not all participants used the Evidence Rounds website, but those who did found it accessible and an easy way to retrieve and refer others to past presentations. One participant found the critical appraisal tool links useful to prepare for their presentation. For one healthcare professional who limited their engagement with technology, the website was not a suitable medium. Participants had mixed opinions about the use of text messaging and other mobile messaging technologies such as WhatsApp. On one hand, they acknowledged that these were a means whereby information could be communicated to the intended receiver in an easy and direct manner.
On the other hand, many staff voiced concerns that work-life boundaries might be violated or feared that they might be bombarded with messages, particularly when they were not working. Many of the HCPs were involved in shift work, which compounded their concern regarding this issue. Word of mouth was a popular method among staff to encourage their colleagues to attend sessions. Multiple reminders, and reminders on the day of the sessions, were viewed as having a particularly positive impact on attendance (Table 4).
Sustainability
Finally, we asked HCPs about their perceptions of the sustainability of Evidence Rounds and how they would make future initiatives more sustainable. Sustainability is difficult to measure, so our qualitative approach allowed us to gain an understanding of context to help others select appropriate strategies during implementation to improve sustainability.
This domain included two themes: staff engagement and collaboration, and individual and departmental influences on sustainability. Perhaps the most striking finding is the influence of resources, and particularly the HCPs themselves, on sustainability. Their availability, schedules and workloads, level of interest and motivation, the engagement of senior-level staff and their willingness to lead and become champions for initiatives were hugely important factors. These results corroborate suggestions that behaviour change theory may be useful to positively impact implementation processes. HCPs identified a number of factors key to the sustainability of Evidence Rounds and similar initiatives after the support of the KT professional ended. Staff representatives from both the neonatal and obstetric units would need to take ownership and assume responsibility for administrative tasks such as planning and scheduling the meetings. Some participants viewed champions as essential for sustainability. Two participants believed that there needed to be a dedicated person whose job it was to oversee education, and another thought their role should include developing clinical practice guidelines.
All participants remarked positively on either or both of the interprofessional and multiple disciplinary aspects of the initiative. One individual believed that senior level staff from one discipline were more invested in keeping it going than those from the other discipline and worried about the impact of this. There was a sense that it was not always easy to come up with topics of simultaneous interest to midwifery, neonatal and obstetric departments. Evidence Rounds was just one of many educational opportunities open to staff during their working week. Taking into consideration the already busy workloads of the healthcare professionals, it was not easy to find staff to volunteer to take on the extra responsibility required to keep it going. Buy-in from senior level staff and having a consultant run the sessions were considered factors that might encourage staff to attend. Rotating presenters and dividing tasks between a team of three was a means of keeping the workload associated with presenting at a manageable level.
Assigning a HCP to pre-schedule the sessions for the coming year was suggested by multiple participants. Participants mentioned the need to involve someone with experience in performing systematic literature searches and to provide additional support to upcoming presenters (Table 5).
Discussion
This study sought to identify the barriers and facilitators to attending and presenting at Evidence Rounds. Our findings agreed with evidence from other studies that the provision of refreshments may be associated with increased HCP attendance at educational events [30, 31].
We wanted to improve our understanding of HCPs’ perspectives of Evidence Rounds as a dissemination strategy. We asked multiple questions to gain insight into their preferred modes of delivery when receiving communication and disseminated information.
Overall, our study findings were consistent with a mixed methods study, also conducted in Ireland, to reach consensus on priorities for clinical learning environments for postgraduate medical education. Among the most important domains identified by participants in that study were: support for residents; opportunities to learn with senior doctors; engagement in clinical teams; and organisational and working conditions.
Strengths and limitations
Evidence Rounds was an example of pragmatic, community-engaged dissemination and implementation research [33, 34] in which the community is the target population of HCPs. It came about because key stakeholders within our target audience approached staff at the National University of Ireland, Galway, having identified a need for support in translating research evidence into practice. One of our authors (AC) was recruited as a PhD student to take on this project as part of her PhD research, having had experience of implementing Evidence in Practice Groups with HCPs as part of CEBIS in the UK. The key strength of this study is the rich data from our focus groups and interviews, which provide context and contribute to the development of evidence about HCP perspectives on the implementation of KT strategies. Research has consistently shown that contextual factors in a given setting play a large role in the success or failure of these types of activities. We employed qualitative methods of research as a means to gain understanding of interactions between individuals, organisations and their unique contexts. The key finding of studies that have undertaken process evaluations is not only the significance of contextual factors but the fact that they can often have the most significant impact on the intervention. This information could be used to generate insights that decision-makers can use to plan, develop and implement their own initiatives. Notwithstanding, this study has some limitations. Despite our best efforts, recruitment of participants was low. Several factors could have contributed to this; for example, the department where most staff worked was above capacity during the period when the focus groups were held. Nevertheless, one-to-one interviews were offered as an alternative.
It is not clear whether our participants were a representative sample of the population. More than half had presented or were involved in the co-design or implementation of the initiative. Therefore, this group may be more invested in Evidence Rounds than other potential participants. We did not capture the perspectives of healthcare professionals who did not attend Evidence Rounds. The inclusion criteria for our study specified that participants must have attended at least one group session.
Another limitation of the study is that the main researcher who implemented the initiative also moderated the focus groups and interviews and was involved in analysis and interpretation. Participants may have felt reluctant to share negative perceptions. To address this concern, at the start of each interview or focus group we emphasised that both positive and negative feedback was being sought to continue the initiative and make it more effective or to make recommendations for future initiatives.
In one systematic literature review, the authors reported that a timeframe of two or more years is required to examine the sustainability of evidence-based interventions. Tricco and colleagues (2016) reported that the KT interventions included in their scoping review on sustainability ranged in duration from 61 to 522 weeks. Our initiative was implemented over 9 months, so this timeframe may not have been adequate to create optimal conditions for asking participants about sustainability.
Framework analysis uses an applied rather than a theoretical approach to research. Therefore, another potential limitation of our study is the lack of theory in our investigation of barriers. The use of a validated tool such as the Theoretical Domains Framework [36,37,38] would have allowed us to map our barriers to pre-specified behaviour change domains.
Implications for research and practice
Further research might explore how to leverage social media platforms to effectively communicate and disseminate evidence to a targeted population. Evidence Rounds was an initiative for HCPs in Ireland, which is classified as a high-income country. Questions remain as to how the perspectives of health care professionals working in low- and middle-income countries might differ from those of our participants. Another important issue for future research is to determine how to integrate the values and preferences of patients, carers or the public into initiatives like Evidence Rounds to inform and improve the decision-making process. Further, our findings may have implications for understanding how behaviour change theory might be used to optimise initiatives and strengthen capacity to improve the implementation of evidence.
The findings of this study uncovered a number of important points to inform individuals planning, developing or implementing initiatives aimed at HCPs. We encourage others to consider interprofessional and multidisciplinary platforms for these types of initiatives, as this approach was valued highly by staff. Those planning similar initiatives may also consider multi-component strategies. Our HCPs found group sessions focusing on the best available evidence on a clinical question more beneficial to the provision of patient care than previous events such as journal clubs, which critically appraised single articles. Effective communication and dissemination aimed at HCPs requires careful consideration of a number of factors including the mode of delivery, scheduling, frequency, and organisational, departmental and individual-level preferences. Feedback from the target population during implementation may guide decisions to maintain, remove or modify aspects of the strategy. Others implementing similar initiatives may consider factoring in the provision of support and training for presenters who need help with critical appraisal, data presentation and statistical inference. Developing a training plan for presenters and attendees would help build organisational capacity. Our health service staff did not feel that they had the skills to perform adequate searches on clinical topics or questions. Like other authors, we recommend the involvement of information specialists, librarians or individuals with experience of designing and conducting search strategies. We also recommend involving and collaborating with guideline developers to increase the likelihood that research findings will be implemented.
We set out to identify barriers and facilitators to attending and presenting at group sessions from the perspectives of HCPs, to gain an understanding of their views of Evidence Rounds as a dissemination strategy, and to generate insights to improve the sustainability of future initiatives. The results of this study and our analysis have extended our understanding and may be useful for guiding the development and implementation of future KT strategies for HCPs. Our focus groups and interviews highlighted the variability in preferred modes of delivery within our target audience, suggesting that multi-component approaches can be useful. They also helped us gain insight into the influence of organisational- and individual-level factors (e.g. buy-in and support from senior staff, staffing levels and scheduling, self-confidence) on the willingness and ability of HCPs to attend, present at and sustain these types of initiatives.
Although HCPs invariably work in complex systems with unique contexts, this paper may help others to understand factors that can impact the implementation of initiatives to disseminate key research findings and promote evidence-informed practice.
CEBIS: Clinical Evidence Based Information Service
CREC: Clinical Research Ethics Committee
HRB-TMRN: Health Research Board Trials Methodology Research Network
NCHD: Non-consultant hospital doctor
NMPDU: Nursing Midwifery Planning and Development Unit
PPROM: Preterm premature rupture of the membranes
Bowen S, Zwi AB. Pathways to “evidence-informed” policy and practice: a framework for action. PLoS Med. 2005;2(7):e166. https://doi.org/10.1371/journal.pmed.0020166.
Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7:50. https://doi.org/10.1186/1748-5908-7-50.
Greenhalgh T, Howick J, Maskrey N. Evidence based medicine: a movement in crisis? BMJ. 2014;348:g3725. https://doi.org/10.1136/bmj.g3725.
Waddell C. So much research evidence, so little dissemination and uptake: mixing the useful with the pleasing. Evid Based Nurs. 2002;5:38–40. https://doi.org/10.1136/ebn.5.2.38.
Haynes B, Haines A. Barriers and bridges to evidence based clinical practice. BMJ. 1998;317(7153):273–6.
Grimshaw JM, Eccles MP, Walker AE, Thomas RE. Changing physicians' behavior: what works and thoughts on getting more things to work. J Contin Educ Heal Prof. 2002;22(4):237–43. https://doi.org/10.1002/chp.1340220408.
Vaucher C, Bovet E, Bengough T, Pidoux V, Grossen M, Panese F, et al. Meeting physicians’ needs: a bottom-up approach for improving the implementation of medical knowledge into practice. Health Research Policy and Systems. 2016;14:49. https://doi.org/10.1186/s12961-016-0120-5.
Greenhalgh T. Evidence. In: Greenhalgh T, editor. How to implement evidence based healthcare. Oxford: Wiley-Blackwell; 2017. p. 10–28.
Hatala R, Keitz SA, Wilson MC, Guyatt G. Beyond journal clubs: moving toward an integrated evidence-based medicine curriculum. J Gen Intern Med. 2006;21(5):538–41. https://doi.org/10.1111/j.1525-1497.2006.00445.x.
Forsetlund L, Bjorndal A, Rashidian A, Jamtvedt G, O’Brien MA, Wolf FM, et al. Continuing education meetings and workshops: effects on professional practice and health outcomes. Cochrane Database Syst Rev. 2009;(2):CD003030. https://doi.org/10.1002/14651858.CD003030.pub2.
Giguère A, Légaré F, Grimshaw J, Turcotte S, Fiander M, Grudniewicz A, et al. Printed educational materials: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;(10):CD004398. https://doi.org/10.1002/14651858.CD004398.pub3.
Bornbaum CC, Kornas K, Peirson L, Rosella LC. Exploring the function and effectiveness of knowledge brokers as facilitators of knowledge translation in health-related settings: a systematic review and thematic analysis. Implement Sci. 2015;10:162. https://doi.org/10.1186/s13012-015-0351-9.
Ivers N, Jamtvedt G, Flottorp S, Young JM, Odgaard-Jensen J, French SD, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;(6):CD000259. https://doi.org/10.1002/14651858.CD000259.pub3.
May CR, Johnson M, Finch T. Implementation, context and complexity. Implement Sci. 2016;11:141. https://doi.org/10.1186/s13012-016-0506-3.
Palinkas LA, Soydan H. Research on process and outcomes. In: Palinkas LA, Soydan H, editors. Translation and implementation of evidence-based practice. Oxford: Oxford University Press; 2012. p. 78–104.
Grol R, Wensing M. What drives change? Barriers to and incentives for achieving evidence-based practice. Med J Aust. 2004;180(6 Suppl):S57–60.
Harvey G, Kitson A. Translating evidence into healthcare policy and practice: single versus multi-faceted implementation strategies – is there a simple answer to a complex question. Int J Health Policy Manag. 2015;4(3):123–6. https://doi.org/10.15171/ijhpm.2015.54.
Rabin BA, Brownson RC. Developing the terminology for dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. Oxford: Oxford University Press; 2012. p. 23–51.
McCormack L, Sheridan S, Lewis M, Boudewyns V, Melvin CL, Kistler C, et al. Communication and dissemination strategies to facilitate the use of health-related evidence. Evidence Report/Technology Assessment No. 213. (Prepared by the RTI International–University of North Carolina Evidence-based Practice Center under Contract No. 290–2007-10056-I.) AHRQ Publication No. 13(14)-E003-EF. Rockville, MD: Agency for Healthcare Research and Quality; November 2013. https://effectivehealthcare.ahrq.gov/topics/medical-evidence-communication/research.
Boaz A, Baeza J, Fraser A; European Implementation Score Collaborative Group (EIS). Effective implementation of research into practice: an overview of systematic reviews of the health literature. BMC Res Notes. 2011;4:212. https://doi.org/10.1186/1756-0500-4-212.
Conway A, Dowling M, Devane D. Implementing an initiative to promote evidence-informed practice: part 1—a description of the Evidence Rounds programme. BMC Med Educ. 2019. https://doi.org/10.1186/s12909-019-1489-y.
Wiltsey Stirman S, Kimberly J, Cook N, Calloway A, Castro F, Charns M. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implement Sci. 2012;7:17. https://doi.org/10.1186/1748-5908-7-17.
Tricco AC, Ashoor HM, Cardoso R, et al. Sustainability of knowledge translation interventions in healthcare decision-making: a scoping review. Implement Sci. 2016;11:55. https://doi.org/10.1186/s13012-016-0421-7.
Yost J, Ganann R, Thompson D, Aloweni F, Newman K, Hazzan A, et al. The effectiveness of knowledge translation interventions for promoting evidence-informed decision-making among nurses in tertiary care: a systematic review and meta-analysis. Implement Sci. 2015;10:98. https://doi.org/10.1186/s13012-015-0286-1.
Ritchie J, Spencer L. Qualitative data analysis for applied policy research. In: Bryman A, Burgess RG, editors. Analysing qualitative data. London: Routledge; 1994. p. 173–94.
Parkinson S, Eatough V, Holmes J, Stapley E, Midgley N. Framework analysis: a worked example of a study exploring young people’s experiences of depression. Qual Res Psychol. 2015;13(2):109–29. https://doi.org/10.1080/14780887.2015.1119228.
Squires JE, Sullivan K, Eccles MP, Worswick J, Grimshaw JM. Are multifaceted interventions more effective than single-component interventions in changing health-care professionals’ behaviours? An overview of systematic reviews. Implement Sci. 2014;9:152. https://doi.org/10.1186/s13012-014-0152-6.
Suman A, Dikkers MF, Schaafsma FG, van Tulder MW, Anema JR. Effectiveness of multifaceted implementation strategies for the implementation of back and neck pain guidelines in health care: a systematic review. Implement Sci. 2016;11:126. https://doi.org/10.1186/s13012-016-0482-7.
Rabin BA, Lewis CC, Norton WE, Neta G, Chambers D, Tobin JN, et al. Measurement resources for dissemination and implementation research in health. Implement Sci. 2016;11:42. https://doi.org/10.1186/s13012-016-0401-y.
Cosimini MJ, Mackintosh L, Chang TP. Number needed to eat: pizza and resident conference attendance. Med Educ. 2016;50(12):1204–7.
Segovis CM, Mueller PS, Rethlefsen ML, LaRusso NF, Litin SC, Tefferi A, et al. If you feed them, they will come: a prospective study of the effects of complimentary food on attendance and physician attitudes at medical grand rounds at an academic medical center. Med Educ. 2007;7:22. https://doi.org/10.1186/1472-6920-7-22.
Kilty C, Wiese A, Bergin C, et al. A national stakeholder consensus study of challenges and priorities for clinical learning environments in postgraduate medical education. BMC Medical Education. 2017;17:226. https://doi.org/10.1186/s12909-017-1065-2.
Holt CL, Chambers DA. Opportunities and challenges in conducting community-engaged dissemination/implementation research. TBM. 2017;7:389–92. https://doi.org/10.1007/s13142-017-0520-2.
Blachman-Demner DR, Wiley TRA, Chambers DA. Fostering integrated approaches to dissemination and implementation and community engaged research. TBM. 2017;7:543–6. https://doi.org/10.1007/s13142-017-0527-8.
Novotná G, Dobbins M, Henderson J. Institutionalization of evidence-informed practices in healthcare settings. Implement Sci. 2012;7:112. https://doi.org/10.1186/1748-5908-7-112.
Michie S, Johnston M, Abraham C, Lawton R, Parker D, Walker A, et al. Making psychological theory useful for implementing evidence based practice: a consensus approach. Qual Saf Health Care. 2005;14(1):26–33.
Atkins L, Francis J, Islam R, O'Connor D, Patey A, Ivers N, et al. A guide to using the theoretical domains framework of behaviour change to investigate implementation problems. Implement Sci. 2017;12(1):77. https://doi.org/10.1186/s13012-017-0605-9.
Cane J, O’Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012;7:37. https://doi.org/10.1186/1748-5908-7-37.
World Bank. Ireland. https://data.worldbank.org/country/ireland. Accessed 31 Oct 2017.
Kelly MP, Heath I, Howick J, Greenhalgh T. The importance of values in evidence-based medicine. BMC Med Ethics. 2015;16(1):69. https://doi.org/10.1186/s12910-015-0063-3.
Klerings I, Weinhandl AS, Thaler KJ. Information overload in healthcare: too much of a good thing? Z Evid Fortbild Qual Gesundhwes. 2015;109(4–5):285–90. https://doi.org/10.1016/j.zefq.2015.06.005.
The authors wish to thank the participants who generously gave their time to take part in this study. We would like to express our appreciation to the staff of the women and children’s directorate at University Hospital Galway. We gratefully acknowledge the contribution of Claire Beecher, PhD Fellow, National University of Ireland Galway, who took on the role of Assistant Moderator for our focus groups. We thank Jacqui LeMay, former Head of Knowledge Services at University Hospitals Coventry and Warwickshire NHS Trust, who established the Evidence in Practice Groups and the Clinical Evidence Based Information Service (CEBIS), which provided the inspiration for Evidence Rounds. Finally, we thank our funding bodies: the Health Research Board-Trials Methodology Research Network (HRB-TMRN); the College of Medicine, Nursing and Health Sciences, National University of Ireland Galway; and the Nursing Midwifery Planning and Development Unit (NMPDU), HSE West/Mid-West.
AC’s PhD studentship is funded by the Health Research Board-Trials Methodology Research Network and the College of Medicine, Nursing and Health Sciences, National University of Ireland Galway. Evidence Rounds was supported by funding from the Nursing Midwifery Planning and Development Unit (NMPDU), HSE West/Mid-West.
Availability of data and materials
Focus group and interview qualitative data cannot be made available to share because the Clinical Research Ethics Committee (CREC) approval for the study was granted on the basis that only the research team and transcription service would have access to the raw data.
Ethics approval and consent to participate
This study was granted approval by the Galway University Hospitals Clinical Research Ethics Committee (CREC) on the 2nd of June, 2016, Ref: C.A. 1505. All participants received an informed consent pack and signed a form indicating their consent to participate.
Competing interests
The authors declare that they have no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
This manuscript is Part 2 of a two-part manuscript series. Part 1 is available at https://doi.org/10.1186/s12909-019-1489-y: Implementing an initiative to promote evidence-informed practice: part 1—a description of the Evidence Rounds programme.
Keywords
- Implementation science
- Knowledge translation
- Evidence-informed practice
- Health services research
- Focus groups