
Implementing an initiative to promote evidence-informed practice: part 1 — a description of the Evidence Rounds programme

Abstract

Background

Evidence-informed practice is fundamental to the delivery of high quality health care. Delays and gaps in the translation of research into practice can impact negatively on patient care. Previous studies have reported that information overload, underdeveloped critical appraisal skills, lack of time and other individual, organisational and system-level contextual factors are barriers to the uptake of evidence by health care professionals. Health services research in this area has been restricted largely to the evaluation of program outcomes. This paper describes the implementation process of an educational initiative for health care professionals working in midwifery, neonatology or obstetrics, aimed at disseminating evidence and enhancing evidence-informed clinical care.

Methods

We designed and implemented an educational initiative called Evidence Rounds for health care professionals working in the women and children’s division of an urban hospital in Ireland. It consisted of three core components: (1) group educational sessions examining evidence on topics chosen by staff, (2) a dedicated website and (3) facilitation, enablement and support from a knowledge translation professional. We evaluated user engagement with the educational program by monitoring attendance figures and website analytics. We followed up with staff 3, 16 and 21 months after the last educational session to find out whether evidence had been implemented. We used Lavis’s organising framework for knowledge transfer and the Template for Intervention Description and Replication (TIDieR) checklist to describe the educational program and document the implementation process.

Results

Six educational sessions presented by 18 health care professionals took place over a nine-month period, with 148 attendances by 85 unique attendees (individuals who attended at least one session). During the period spanning from one month before the first group session to one month after the last, 188 unique visitors, 331 visits and 862 page views were recorded on our website.

Conclusions

Audit and feedback processes can provide quantitative data to track practice outcomes. Achieving sustainable educational programs can be challenging without dedicated resources such as staffing and funding.


Background

Evidence-informed practice is central to the delivery of quality care and is associated with improvements in patient outcomes. Emparanza and colleagues [1] demonstrated that mortality and duration of hospital stay were reduced among patients treated in an evidence-based practice unit, compared to either a standard practice unit or previous practice by the same health care professionals (HCPs). Nevertheless, a well-reported gap exists between clinical practice and much of the evidence available to HCPs [2]. When evidence is not translated into practice, or there is a delay in the process, patients may be exposed to unnecessary risks or suboptimal care.

There are multiple barriers to evidence uptake and evidence-informed practice. Information overload [3,4,5] can cause HCPs to become overwhelmed by the volume of available literature when seeking the most relevant and up-to-date research [6]. We have long been in an era of information overload: for example, more than 1 million publications related to biomedical research are captured within the PubMed database each year [7]. Many health care workers have limited time to devote to reading research evidence [2]. Conversely, for some healthcare topics there can be a lack of evidence or, indeed, of high quality evidence [8].

There is a need for evidence-informed, theory-based educational and knowledge translation initiatives aimed at HCPs to promote evidence-informed practice and the implementation of evidence where appropriate. There is also a knowledge gap regarding the implementation process of these types of initiatives. One of the most frequently used theories in research on the adoption of evidence-based practice and in implementation science is Everett Rogers’ diffusion of innovations theory (1983). Rogers identified four key elements instrumental to the adoption of an innovation: the innovation itself, communication channels (hereafter referred to as modes of delivery), time, and the social system. He categorised stakeholders into five groups according to how they adopt innovations over time: innovators, early adopters, early majority, late majority and laggards [9].

Grimshaw and colleagues [10] highlight that although there is a considerable body of evidence relating to knowledge translation (KT) strategies, it remains incomplete. A much-debated question is whether combined or single-component strategies are more effective [11]. Intuitively, a multicomponent strategy might be more effective when seeking to engage as many clinicians as possible, some of whom may have preferences or circumstances that make a particular component work for them. However, Squires et al. [12] found that interventions with multiple components were no more effective than single-component interventions, and concluded that the effectiveness of multifaceted interventions did not increase incrementally as the number of components increased. It might be that the multiple components used in some studies addressed the same rather than diverse issues or barriers; if so, this might explain why they were not judged to be more effective. In a systematic review by McCormack et al. [13], multicomponent dissemination strategies focusing on reach, motivation and ability were more likely to affect clinicians’ behaviours than single-component strategies.

Another systematic review demonstrated that multifaceted interventions centred on educational meetings to increase the implementation of physiotherapy clinical guidelines may improve some practice outcomes, but did not have a positive impact on patient health outcomes or costs [14]. Educational meetings on their own or in combination with other interventions may improve clinical practice or patient outcomes but may not change complex behaviours [15]. A Cochrane systematic review reported that interprofessional education may improve patient outcomes and adherence to clinical guidelines, although the evidence was judged to be low quality [16]. Wallace and colleagues found that targeted messaging, summaries of research evidence and educational visits may improve the uptake of key research findings [17]. The inclusion of local opinion leaders in an intervention may make it more likely to align HCP behaviours with the desired practice [18]. In a prospective study by Segovis and colleagues, the provision of food was identified by HCPs as a motivating factor to attend grand rounds [19].

According to the National Implementation Research Network (NIRN) in the United States, an enabling context is an essential component of evidence-based programs for increasing their usefulness [20]. Implementation outcomes and the use of evidence can be driven to a large extent by contextual factors and methods of delivery [21,22,23]. Contextual influences on implementation can be both barriers and enablers to different people at different times, under varying circumstances. In a recent systematic review, Geerligs et al. found that the barriers and facilitators to implementation processes identified by HCPs operate at the system, staff and intervention levels [24]; the authors recommend taking these three domains into account when designing implementation strategies. Hamilton and Mittman [21] and Proctor [25] have highlighted the need for further research describing the implementation of these types of initiatives in sufficient detail.

Informed by this evidence, Evidence Rounds featured a multifaceted strategy centred on educational meetings and focused on increasing the reach of evidence and the motivation and ability to use and apply it. We also took an interprofessional approach by involving multiple professions (midwifery, neonatology and obstetrics) and working with opinion leaders. We designed the initiative to address individual and organisational level factors and adapted it when necessary throughout the implementation process. We arranged for a local catering service to provide food at each session. Our description of the implementation of Evidence Rounds adds to the literature on educational initiatives in applied health services research, in which there is a general paucity of studies that provide insight into how contextual factors influence dissemination and implementation efforts.

Evidence Rounds was based loosely on an intervention conceived by Jacqui Le May, former Head of Knowledge Services at University Hospitals Coventry and Warwickshire NHS Trust in the United Kingdom (UK). There, members of the Clinical Evidence Based Information Service (CEBIS) team run Evidence in Practice Groups to examine evidence in various departments within the hospital. Topics and questions are linked to a specific patient case, a series of patient cases or other general topics. As well as incorporating the best available evidence into our group sessions, we used evidence from key findings of systematic reviews and other research to inform the design and implementation of the initiative.

The goal of Evidence Rounds was to bridge the gap between evidence and practice through an educational initiative aimed at HCPs. The objectives were: to disseminate the best available evidence to HCPs on topics of their choosing during group sessions; to promote evidence-informed practice by providing an in-person group platform for staff to discuss the implications of the evidence and the barriers and facilitators to its implementation; and to enhance evidence-informed practice by identifying and assigning resulting actions where appropriate.

The aims of this paper are to describe the process of planning, designing and implementing this multi-component educational initiative, to report quantitative performance indicators monitoring engagement during the implementation process, and to provide follow-up information on whether the evidence was implemented. The second paper in this two-part series reports the findings of focus groups and interviews about Evidence Rounds with HCPs who attended or presented at the group educational sessions [26].

Methods

In Fig. 1, we present a logic model, developed iteratively, that demonstrates the underlying logic behind the implementation strategy for Evidence Rounds. We designed it with the understanding that implementation processes and health systems are complex; May and colleagues [27] advised that implementation processes be understood as “non-linear, emergent and dynamic events within systems.” The model focuses on the components of the initiative, our planned activities and what we hoped to achieve through the initiative. We informed the pre-implementation and implementation phases by adapting aspects of the CEBIS Evidence in Practice Groups, Rogers’ diffusion of innovations theory [9], which drove the implementation strategy, the organising framework for knowledge transfer [28] and the Knowledge Translation Planning Template [29].

Fig. 1 Process-oriented logic model of the Evidence Rounds educational initiative

The organising framework for knowledge transfer conceived by Lavis et al. [28] was used to develop the implementation strategy. This framework asks five key questions: (1) What should be transferred to decision makers? (2) To whom should research knowledge be transferred? (3) By whom should research knowledge be transferred? (4) How should research knowledge be transferred? (5) With what effect should research knowledge be transferred?

1) What should be transferred to decision makers? To improve the likelihood of evidence uptake, HCPs were invited to select topics or clinical questions relating to treatment or diagnostic interventions. The topic for the first group session was suggested at a planning meeting by a member of staff, who later confirmed her colleagues’ agreement with the chosen topic. For subsequent sessions, a collective decision was made at each group session about the topic to be covered in the next one; sometimes, several suggestions were considered before a decision was made. At the request of one HCP, a topic suggestion sheet was passed around during sessions to accommodate staff who were reluctant to propose topics in front of their colleagues. HCPs were asked to submit suggestions based on gaps they perceived in their knowledge of the evidence or where there was a perceived gap between the evidence and their own practice. Topics were not limited to those known to have clear and conclusive evidence; suggestions covering controversial treatments, conflicting evidence findings or a lack of evidence were encouraged. Our aim was to transfer the best available, most up-to-date, relevant and applicable evidence. At the start of each session, national and international official guidance was explored to increase awareness of current recommendations. All of the selected topics and clinical questions involved healthcare interventions, so we were particularly interested in accessing and presenting randomised trials and systematic reviews of trials. However, for all topics, we also included non-randomised or observational studies so that qualitative aspects of topics could be taken into consideration. For some sessions, HCPs requested and found it valuable to read reports on what other units were doing so that they could compare and contrast their own practice. The final selected topics and sub-questions for each educational session are presented in Table 3.

2) To whom should research knowledge be transferred? Our target audience consisted of HCPs working in the neonatal and obstetric departments in the women and children’s division of an urban hospital in Ireland. We took a multi-disciplinary and interprofessional approach to maximise the potential for the dissemination and implementation of evidence and to promote collaboration, with the ultimate goal of implementing evidence where appropriate. We also invited staff members outside of the key departments when deemed appropriate to the topic; for example, laboratory staff were invited to attend the fourth session, on antenatal screening for group B streptococcus. When these staff were identified, invitations were extended through the presenting HCPs. The implementation team also invited students who were on placement in the departments during the time of the sessions.

3) By whom should research knowledge be transferred? We took a team approach to the transfer of knowledge. Three HCPs presented at each session, with representatives from both medical and nursing/midwifery staff. Staff from both the neonatal and obstetric departments presented when the topic covered both disciplines. To recruit presenters, staff were asked to volunteer during group sessions, or previous presenters contacted individuals they perceived as suitable candidates. The KT professional, an author on this paper (AC), introduced each session, described the literature search process and the breadth of the literature on the chosen topic, and directed the discussion to decide on the next topic.

4) How should research knowledge be transferred? The KT strategy involved both active and passive methods of promotion, communication and dissemination. In line with Rogers’ diffusion of innovations theory [9], we accepted that our target audience was likely to adopt the evidence presented in the educational initiative at different points in time. Therefore, we deemed it appropriate to use a multifaceted educational strategy.

To increase the reach of the evidence: we identified and arranged meetings with key staff at the hospital to build an implementation team and to identify potential champions or opinion leaders who could help us communicate with HCPs and disseminate evidence. Our group sessions targeted multiple disciplines and professions to increase their impact. We employed a variety of communication and dissemination modes of delivery (see Table 1), e.g. face-to-face meetings, telephone calls, emails and an open access website, based on the assumption that we were likely to encounter stakeholder groups similar to those identified by Rogers [9], who may adopt the initiative at different points in the process and for a variety of reasons.

To increase motivation to use and apply the evidence: HCPs took ownership by choosing topics that had the potential to improve their practice and that were meaningful and timely for them. We focused on the applicability of the evidence to the local context. When requested, we presented information on how other national and international units were providing healthcare services relating to the topic, for benchmarking purposes. In 3 of the 6 sessions, retrospective audit data were presented to capture recent practice and potentially act as a driving force to change future practice.

To increase the ability to use and apply the evidence: we addressed the issue of information overload by designing and performing pragmatic yet comprehensive search strategies, sifting through the frequently large volume of search results and discarding obviously irrelevant records. Searches were run on appropriate databases and websites including the Cochrane Library databases, Medline or PubMed, CINAHL, Embase, Google (to identify guidelines and grey literature), relevant professional bodies’ and healthcare organisations’ websites, DynaMed, Trip Database Pro and the Geneva Foundation for Medical Education and Research (GFMER). Presenting HCPs were provided with a significantly reduced number of records to screen for inclusion. After feedback from the first session, a “Quick Guide for Presenters” (see Additional file 1) was provided to HCPs who had signed up to present. Key data and findings from multiple studies were extracted and summarised during group sessions. We fostered an environment where critical appraisal was key and highlighted the strengths and weaknesses of included evidence. The KT professional provided support and enabling services to presenters to reduce their workload and improve levels of health information literacy, e.g. obtaining the full text of papers, helping with the interpretation of statistical outputs such as forest plots and key statistical concepts such as P values and confidence intervals (a worked example follows below), identifying appropriate critical appraisal tools, sourcing images for presentations (in compliance with licensing and copyright restrictions), providing feedback on presentation slides, populating reference sections, extracting key information and data, and providing guidance on selecting papers for inclusion.

During the discussion forum, obstacles to the implementation of evidence were identified to increase the likelihood that they would be addressed and that plans for change could be tailored [30].
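To make the statistical support concrete, the worked example below shows how a 95% confidence interval for a risk ratio, the effect measure typically displayed in a forest plot, is computed on the log scale. The numbers are illustrative assumptions, not data from the sessions.

\[ 95\%\ \mathrm{CI} = \exp\bigl( \ln \mathrm{RR} \pm 1.96 \times \mathrm{SE}(\ln \mathrm{RR}) \bigr) \]

For an assumed pooled risk ratio of 0.80 with a standard error of ln(RR) of 0.10:

\[ \exp(-0.223 \pm 1.96 \times 0.10) = \bigl( \exp(-0.419),\ \exp(-0.027) \bigr) \approx (0.66,\ 0.97) \]

Because this interval excludes 1, the corresponding two-sided P value falls below 0.05; this link between confidence intervals and P values was among the concepts presenters were supported in interpreting.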

Table 1 Modes of delivery used in Evidence Rounds for promotional purposes, communication and dissemination

At the initial planning meetings, we emphasised that we did not intend to impose the Evidence in Practice Groups model from the UK on staff at our hospital. Baumann recommends taking an adaptive approach to implementation because no single intervention will be a perfect fit in all settings [31]. Proactive adaptation played a key role in our strategy [32] so that we could shape the initiative in response to important individual, organisational and contextual factors. We tailored it to suit the local context using the information available before implementation and adapted it iteratively throughout, in accordance with feedback loops, observations and performance indicator monitoring. See Table 2 for a list of core components and some adaptations.

5) With what effect should research knowledge be transferred? The main aims of Evidence Rounds were to provide an educational program that disseminated evidence to health care professionals and promoted evidence-informed practice. We undertook process evaluation by capturing and monitoring data for key indicators throughout the initiative. Firstly, we distributed sign-in sheets at group sessions to record attendance figures; we wanted to track neonatal and obstetric staff attendances and identify potential patterns. Secondly, we monitored usage analytics on our dedicated website. Both informed us of the penetration of Evidence Rounds into the HCP community within the department. Thirdly, our focus groups and interviews provided self-reported data on how HCPs were receiving the educational initiative and how they viewed it in relation to their own evidence-informed practice. Using these data, we identified individual, organisational and intervention-level barriers and facilitators to attending and presenting at Evidence Rounds, and were better able to understand the complexity of the behaviours and gauge opinions on whether and how Evidence Rounds was promoting evidence-informed practice. These results are published in the second paper of this two-part series [26]. Fourthly, we followed up with the implementation team to check the status of evidence implementation. Dissemination strategies play an essential role but, on their own, do not guarantee the implementation of evidence [13, 33]. For this reason, and when appropriate during the discussion forum, barriers, facilitators and specific actions to aid the implementation of evidence were identified and discussed, and actions were assigned to specific HCPs as appropriate.

Three months after the final group session, we followed up with HCPs on the implementation team to see whether Evidence Rounds had influenced practice. They reported that a small number of recommendations from Evidence Rounds had been implemented. When implementation happens, the process can be slow, particularly for more complex issues. In the interviews and focus groups, several HCPs explained that changes in practice often cannot occur until the desired change is first made part of a clinical guideline [26]; writing and updating guidelines can be a lengthy process. Further follow-up with the same HCPs occurred 16 and 21 months later.

Table 2 TIDieR checklist

We took measures to plan for sustainability (continuation of the initiative after support from the KT professional ended), such as developing tools that could be handed over easily. For example, we chose a web hosting platform that allowed us to build the website and create content using high quality templates without the need for coding or programming skills. This choice was deemed the most likely to promote sustainability because, at the end of the period of support from the KT professional, the website could easily be handed over to a HCP without advanced skills in website design, administration or maintenance. We also liaised with library staff to confirm that they would be willing to design and conduct future searches, had conversations with key people, discussed sustainability during our focus groups and interviews, and offered guidance during a handover period. We planned to assess sustainability by following up with the implementation team to find out whether the educational initiative had continued to be delivered.

We employed the Template for Intervention Description and Replication (TIDieR) checklist to complement the reporting of the initiative [34]. This reporting guideline has been recommended for reporting intervention implementation [35].

We collected and report a number of quantitative measures:

  • website analytics captured by our website hosting platform, reported for the period spanning from one month before the first group session to one month after the last group session (a short computational sketch follows this list):

    1. unique visitors, defined as the number of visitors visiting for the first time

    2. visits, defined as the number of browsing sessions; a single visit can involve multiple page views

    3. page views, defined as the number of times a webpage from our website was fully loaded by a browser

  • the total number of HCPs and other attendees at each Evidence Rounds session (other attendees included academic partners and students from health-related higher education courses on placement at the hospital site)

  • the total number of HCPs who presented at an Evidence Rounds session.
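To make these counting definitions concrete, the following minimal Python sketch derives the three website metrics from a hypothetical visit log. The field names and log structure are assumptions for illustration, not the hosting platform’s actual analytics implementation.

from collections import namedtuple

# One browsing session: a visitor identifier (as assigned by the analytics
# tool, e.g. an IP address or cookie) and the number of pages fully loaded.
Visit = namedtuple("Visit", ["visitor_id", "pages_viewed"])

def summarise(visits):
    """Return (unique visitors, visits, page views) for a list of sessions."""
    unique_visitors = len({v.visitor_id for v in visits})   # deduplicated visitors
    total_visits = len(visits)                              # one per browsing session
    total_page_views = sum(v.pages_viewed for v in visits)  # pages fully loaded
    return unique_visitors, total_visits, total_page_views

# Hypothetical log: visitor "a" returns for a second session.
log = [Visit("a", 3), Visit("a", 2), Visit("b", 4)]
print(summarise(log))  # -> (2, 3, 9)

Note that identifying visitors by IP address or device means the same person browsing from two computers is counted as two unique visitors, a limitation acknowledged in the Discussion.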

We contacted the five HCP members of the implementation team 3, 16 and 21 months after the initiative ended to find out whether Evidence Rounds had led to the implementation of research findings.

Results

Six Evidence Rounds group sessions were run over a 9-month period. There was a total of 148 attendances, of which 85 were unique attendees (individuals who signed the attendance sheet at a minimum of one session). See Table 3 for a breakdown of attendance numbers by educational session. Attendance numbers fluctuated according to factors such as the chosen topic (some of which were common to midwifery, neonatology and obstetrics, and some of which were primarily neonatology-focused), the level of interest in the subject matter and clinical staffing levels.

Table 3 Evidence Rounds session details and follow-up

Seventeen HCPs who worked at the hospital presented during the period of implementation. One external HCP (DD, an author of this paper) was asked to present at one session (session 6) because he had authored two relevant papers that were selected for inclusion in the presentation.

Between 01/06/2016 and 29/04/2017, 188 unique visitors, 331 visits and 862 page views were recorded on the website. See Fig. 2 for a breakdown of these figures.

Fig. 2 Website analytics data showing the number of unique visitors to the website, visits (browsing sessions) and page views (requests on the website which were fully loaded), by month and year

In Table 3, we present the clinical questions and topics explored, the actions identified during the discussion forums and the actions that were actually carried out for each of the six educational sessions. This information was gathered during follow up with the implementation team.

Follow up with the implementation team also confirmed that the educational program was not sustained beyond the period of support from the KT professional.

Discussion

Limitations and lessons learned

We would like to acknowledge that our study has several limitations. Firstly, the six educational sessions were carried out over nine months. It is unlikely that this was a sufficient duration of implementation to allow the initiative to realise its full potential or to become fully integrated and adopted by staff whom Rogers [9] might describe as the late majority and laggards. In this way, the potential of Evidence Rounds to demonstrate sustainability may have been restricted. Secondly, our theoretical approach did not include pedagogical theory in the development of our educational initiative. Thirdly, attendance data collected through sign-in sheets should be viewed as a conservative estimate of actual attendance; we are aware of several attendees who did not sign in during sessions for reasons such as being bleeped or called away to attend to a patient. Fourthly, the number of unique visitors recorded using website analytics may be inaccurate because the same person could access the website using more than one IP address or computer, resulting in them being counted as more than one user. Fifthly, our initiative was implemented at one institution and may be received differently by HCPs in other settings. Sixthly, the follow-up information presented in Table 3 lacks quantitative measures of practice change following the educational sessions compared to prior practice; the study by Emparanza et al. [1] provides a good example of quantitative outcome measurement.

In terms of implications for practice, the issue of sustainability is important to consider. Despite the steps described in the Methods section aimed at increasing the sustainability of the initiative, it was not sustained beyond the period of support from the KT professional. Given the time required for planning and development, we were aware that without a nominated person or team with dedicated professional hours, the potential to sustain the initiative in our busy hospital setting was reduced. Ideally, future initiatives would have a longer period of implementation to allow for appropriate capacity building, giving them a better chance of integration and of becoming accepted and adopted by staff.

A key learning point for us has been that initiatives like Evidence Rounds are only as strong as the people involved. We recommend collaboration and partnership with the target audience from the planning stages onwards. The multi-disciplinary and interprofessional approach worked very successfully for Evidence Rounds and, according to informal feedback and our focus group and interview data [26], was highly valued by our target audience. We engaged with them, listened to their feedback and found ways to address their identified needs where possible. Our key message in this regard is to network and engage with champions, opinion leaders, enthusiastic individuals and early adopters, and not to wait around for laggards. Involving an information specialist, librarian or someone else who has knowledge of appropriate databases and other online resources and is experienced in carrying out systematic and detailed literature searches is essential; they can help to address information overload and reduce the workload of HCPs involved in presenting.

Adaptation, alongside adherence to a small number of core components, was fundamental to the initiative. Baker et al. [37] found that positive outcomes are more likely if an adaptive approach is taken to implementing interventions, compared to no intervention or dissemination alone. Feedback from HCPs who participated in our focus groups and interviews suggested that choosing topics at the time guidelines are being created or updated increases the likelihood of implementation of evidence.

Further studies are required to assess the effectiveness of Evidence Rounds and similar educational initiatives, including those implemented in developing-world settings. Evaluation could include pre- and post-testing of knowledge of the topics addressed, and measurement of the impact on HCP behaviour and patient care outcomes. More studies are needed to identify and better understand the underlying mechanisms and contextual factors that influence educational programs. Additional research is also needed to understand how a social media strategy might be optimised for the delivery of similar initiatives.

Conclusion

Evidence Rounds is a novel educational initiative to support a knowledge translation strategy targeted at HCPs. It moves beyond the journal club model that was familiar to our target audience, and was designed and implemented based on feedback obtained by proactively engaging with staff. We have helped address the need for research that provides a detailed account of the implementation of knowledge translation strategies [21, 22], and have highlighted the contextual factors and modes of delivery that influenced implementation outcomes in our setting. Evidence Rounds was a complex initiative to implement due to individual, contextual and intervention-level factors. We used a multi-faceted strategy to disseminate key research findings to our clinical audience and promote evidence-informed practice, and we collaborated with and involved our target audience from the start of the planning phase and throughout implementation. The level of detail we have provided on the practical aspects of the implementation process will aid reproducibility for those wishing to roll out a similar program or elements of it, and the contextual factors we highlight may inform the planning of other initiatives.

Abbreviations

AB: Áine Binchy

AC: Aislinn Conway

CEBIS: Clinical Evidence Based Information Service

CREC: Clinical Research Ethics Committee

DCC: Delayed cord clamping

DD: Declan Devane

DN: Deirdre Naughton

EOGBS: Early-onset group B Streptococcus

GA: Gestational age

GBS: Group B Streptococcus

HCP: Healthcare professional

HRB-TMRN: Health Research Board Trials Methodology Research Network

IAP: Intrapartum antibiotic prophylaxis

JG: Jane Grosvenor

JJ: Jean James

KT: Knowledge translation

MC: Margaret Coohill

MD: Maura Dowling

NCHD: Non-consultant hospital doctor

NICU: Neonatal intensive care unit

NMPDU: Nursing Midwifery Planning and Development Unit

RCOG: Royal College of Obstetricians and Gynaecologists

SOP: Standard operating procedure

References

  1. Emparanza JI, Cabello JB, Burls AJ. Does evidence-based practice improve patient outcomes? An analysis of a natural experiment in a Spanish hospital. J Eval Clin Pract. 2015;21(6):1059–65. https://doi.org/10.1111/jep.12460.

  2. Grimshaw JM, Eccles MP, Walker AE, Thomas RE. Changing physicians’ behavior: what works and thoughts on getting more things to work. J Contin Educ Heal Prof. 2002;22:237–43. https://doi.org/10.1002/chp.1340220408.

  3. Klerings I, Weinhandl AS, Thaler KJ. Information overload in healthcare: too much of a good thing? Evid Fortbild Qual Gesundhwes. 2015;109(4–5):285–90. https://doi.org/10.1016/j.zefq.2015.06.005.

  4. Greenhalgh T, Howick J, Maskrey N. Evidence based medicine: a movement in crisis? BMJ. 2014;348:g3725. https://doi.org/10.1136/bmj.g3725.

  5. Bastian H, Glasziou P, Chalmers I. Seventy-five trials and eleven systematic reviews a day: how will we ever keep up? PLoS Med. 2010;7(9):e1000326. https://doi.org/10.1371/journal.pmed.1000326.

  6. Grandage KK, Slawson DC, Shaughnessy AF. When less is more: a practical approach to searching for evidence-based answers. J Med Libr Assoc. 2002;90(3):298–304.

  7. Landhuis E. Scientific literature: information overload. Nature. 2016;535(7612):457–8.

  8. Waddell C. So much research evidence, so little dissemination and uptake: mixing the useful with the pleasing. Evid Based Nurs. 2002;5:38–40. https://doi.org/10.1136/ebn.5.2.38.

  9. Rogers EM. Diffusion of innovations. 3rd Edition. New York: The Free Press; 1983.

  10. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7:50. https://doi.org/10.1186/1748-5908-7-50.

  11. Hulscher M, Wensing M, Grol R. Chapter 18 Multifaceted strategies for improvement. In: Grol R, Wensing M, Eccles M, Davis D, editors. Improving patient care: the implementation of change in health care. 2nd edition. West Sussex: Wiley; 2013.

  12. Squires JE, Sullivan K, Eccles MP, Worswick J, Grimshaw JM. Are multifaceted interventions more effective than single-component interventions in changing health-care professionals’ behaviours? An overview of systematic reviews. Implement Sci. 2014;9:152. https://doi.org/10.1186/s13012-014-0152-6.

  13. McCormack L, Sheridan S, Lewis M, Boudewyns V, Melvin CL, Kistler C et al. Communication and Dissemination Strategies To Facilitate the Use of Health-Related Evidence. Evidence Report/Technology Assessment No. 213. (Prepared by the RTI International–University of North Carolina Evidence-based Practice Center under Contract No. 290–2007–10056-I.) AHRQ Publication No. 13(14)-E003-EF. Rockville: Agency for Healthcare Research and Quality; 2013. https://effectivehealthcare.ahrq.gov/topics/medical-evidence-communication/.

  14. van der Wees PJ, Jamtvedt G, Rebbeck T, de Bie RA, Dekker J, Hendriks EJ. Multifaceted strategies may increase implementation of physiotherapy clinical guidelines: a systematic review. Aust J Physiother. 2008;54(4):233–41. https://doi.org/10.1016/S0004-9514(08)70002-3.

  15. Forsetlund L, Bjørndal A, Rashidian A, Jamtvedt G, O’Brien MA, Wolf FM, Davis D, Odgaard-Jensen J, Oxman AD. Continuing education meetings and workshops: effects on professional practice and health care outcomes. Cochrane Database Syst Rev 2009; Issue 2. Art. No.: CD003030. doi:https://doi.org/10.1002/14651858.CD003030.pub2.

  16. Reeves S, Perrier L, Goldman J, Freeth D, Zwarenstein M. Interprofessional education: effects on professional practice and healthcare outcomes (update). Cochrane Database Syst Rev 2013; Issue 3. Art. No.: CD002213. DOI:https://doi.org/10.1002/14651858.CD002213.pub3.

  17. Wallace J, Byrne C, Clarke M. Improving the uptake of systematic reviews: a systematic review of intervention effectiveness and relevance. BMJ Open. 2014;4:e005834. https://doi.org/10.1136/bmjopen-2014-005834.

  18. Flodgren G, Parmelli E, Doumit G, Gattellari M, O’Brien MA, Grimshaw J et al. Local opinion leaders: effects on professional practice and health care outcomes. Cochrane Database Syst Rev 2011; Issue 8. Art. No.: CD000125. doi:https://doi.org/10.1002/14651858.CD000125.pub4.

  19. Segovis CM, Mueller PS, Rethlefsen ML, LaRusso NF, Litin SC, Tefferi A, et al. If you feed them, they will come: a prospective study of the effects of complimentary food on attendance and physician attitudes at medical grand rounds at an academic medical center. Med Educ. 2007;7:22. https://doi.org/10.1186/1472-6920-7-22.

  20. National Implementation Research Network (NIRN): Implementation defined. https://nirn.fpg.unc.edu/learn-implementation/implementation-defined. Accessed May 1 2018.

  21. Hamilton AB, Mittman BS. Chapter 23: Implementation science in healthcare. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. 2nd edition. Oxford: Oxford University Press; 2018.

  22. Rycroft-Malone J. Evidence-informed practice: from individual to context. J Nurs Manag. 2008;16(4):404–8. https://doi.org/10.1111/j.1365-2834.2008.00859.x.

  23. Rycroft-Malone J, Seers K, Chandler J, et al. The role of evidence, context, and facilitation in an implementation trial: implications for the development of the PARIHS framework. Implement Sci. 2013;8:28. https://doi.org/10.1186/1748-5908-8-28.

  24. Geerligs L, Rankin NM, Shepherd HL, Butow P. Hospital-based interventions: a systematic review of staff-reported barriers and facilitators to implementation processes. Implement Sci. 2018;13:36. https://doi.org/10.1186/s13012-018-0726-9.

  25. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139. https://doi.org/10.1186/1748-5908-8-139.

  26. Conway A, Dowling M, Devane D. Implementing an initiative to promote evidence-informed practice: part 2 — healthcare professionals’ perspectives of the Evidence Rounds programme. BMC Med Educ. 2019. https://doi.org/10.1186/s12909-019-1488-z.

  27. May CR, Johnson M, Finch T. Implementation, context and complexity. Implement Sci. 2016;11:141. https://doi.org/10.1186/s13012-016-0506-3.

  28. Lavis JN, Robertson D, Woodside JM, McLeod CB, Abelson J. Knowledge transfer study group. How can research organizations more effectively transfer research knowledge to decision makers? Milbank Q. 2003;81(2):221–48, 171-2. https://doi.org/10.1111/1468-0009.t01-1-00052.

  29. Barwick M. Knowledge translation planning template. Ontario: The Hospital for Sick Children; 2008, 2013.

  30. Grol R. Personal paper. Beliefs and evidence in changing clinical practice. Br Med J. 1997;315(7105):418–21.

  31. Baumann AA, Cabassa LJ, Stirman W. Adaptation in dissemination and implementation science. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. 2nd ed. Oxford: Oxford University Press; 2018.

  32. Moore JE, Bumbarger BK, Cooper BR. Examining adaptations of evidence-based programs in natural contexts. J Prim Prev. 2013 Jun;34(3):147–61. https://doi.org/10.1007/s10935-013-0303-6.

  33. Rabin BA, Brownson RC. Chapter 2: developing the terminology for dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science into practice. Oxford: Oxford University Press; 2012.

  34. Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687. https://doi.org/10.1136/bmj.g1687.

  35. Wilson PM, Sales A, Wensing M, Aarons GA, Flottorp S, Glidewell L, et al. Enhancing the reporting of implementation research. Implement Sci. 2017;12:13. https://doi.org/10.1186/s13012-017-0546-3.

  36. Hughes RG, Brocklehurst P, Steer PJ, Heath P, Stenson BM, on behalf of the Royal College of Obstetricians and Gynaecologists. Prevention of early-onset neonatal group B streptococcal disease. Green-top guideline no. 36. BJOG. 2017;124:e280–305.

  37. Baker R, Camosso-Stefinovic J, Gillies C, Shaw EJ, Cheater F, Flottorp S, Robertson N, Wensing M, Fiander M, Eccles MP, Godycki-Cwirko M, van Lieshout J, Jäger C. Tailored interventions to address determinants of practice. Cochrane Database Syst Rev. 2015; Issue 4. Art. No.: CD005470. doi:https://doi.org/10.1002/14651858.CD005470.pub3.


Acknowledgements

The authors wish to thank the participants who generously gave their time to take part in our research. We would like to express our appreciation to the staff of the women and children’s directorate at University Hospital Galway. We gratefully acknowledge the contribution of Claire Beecher, PhD Fellow, National University of Ireland Galway, who took on the role of Assistant Moderator for our focus groups. We thank Jacqui Le May, former Head of Knowledge Services at University Hospitals Coventry and Warwickshire NHS Trust, who established the Evidence in Practice Groups and the Clinical Evidence Based Information Service (CEBIS), which provided inspiration for Evidence Rounds. Finally, we thank our funding bodies: the Health Research Board-Trials Methodology Research Network (HRB-TMRN); the College of Medicine, Nursing and Health Sciences, National University of Ireland Galway; and the Nursing Midwifery Planning and Development Unit (NMPDU) West/Mid-West, HSE West, Ireland.

Funding

AC’s PhD studentship was funded by the Health Research Board Trials Methodology Research Network and the College of Medicine, Nursing and Health Sciences, National University of Ireland Galway, Ireland. Evidence Rounds was supported by funding from the Nursing Midwifery Planning and Development Unit (NMPDU) West/Midwest, Health Service Executive (HSE) West, Ireland.

Availability of data and materials

Not applicable.

Author information

Authors and Affiliations

Authors

Contributions

All authors actively contributed to this paper. AC and DD conceived the study. MD, DD and AC contributed to the research design. AB, JG, MC, DN and JJ provided follow up data. AC wrote the first draft of the paper. DD and AC worked on subsequent drafts and feedback was given by all authors. All authors approved the final version of the paper.

Corresponding author

Correspondence to Aislinn Conway.

Ethics declarations

Ethics approval and consent to participate

This study was granted approval by the Galway University Hospitals Clinical Research Ethics Committee (CREC) on the 2nd of June, 2016, Ref: C.A. 1505. Focus group and interview participants provided written, informed consent for their participation.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional information

This manuscript is Part 1 of a two part manuscript series. Part 2 is 10.1186/s12909-019-1488-z - Implementing an initiative to promote evidence-informed practice: part 2—healthcare professionals’ perspectives of the Evidence Rounds programme.

Additional files

Additional file 1:

Quick guide for presenters (DOCX 59 kb)

Additional file 2:

Sample poster promoting Evidence Rounds (PDF 8151 kb)

Additional file 3:

Sample certificate of attendance (PDF 103 kb)

Additional file 4:

Sample certificate of participation (PDF 103 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Conway, A., Dowling, M., Binchy, Á. et al. Implementing an initiative to promote evidence-informed practice: part 1 — a description of the Evidence Rounds programme. BMC Med Educ 19, 74 (2019). https://doi.org/10.1186/s12909-019-1489-y

