Interventions to improve the use of systematic reviews in decision-making by health system managers, policy makers and clinicians
Healthcare decision-makers face a large volume of evidence and may lack the capacity to use it in decision-making. Systematic reviews can help by bringing together the relevant evidence using explicit methods that seek to minimise bias. Interventions have been developed to improve the uptake of systematic review evidence. We identified eight studies evaluating the effectiveness of such interventions. There was some evidence that a clear message, based on systematic review evidence, targeted at and disseminated to the relevant healthcare professionals may improve evidence-based practice. Interventions aimed at increasing decision-makers' awareness and knowledge of systematic review evidence and of evidence-based practice have been evaluated, but these have shown little effect on practice.
- "Emerging as a new stream of research, eleven studies evaluated or described knowledge broker roles or related concepts [6,35,37,51-57] with dedicated dissemination strategies evaluated in 7 studies and mentioned as a facilitator in 43. Incentives to use evidence and client demand for research evidence were described as facilitators in one study each [10,58]. "
ABSTRACT: The gap between research and practice or policy is often described as a problem. To identify new barriers to and facilitators of the use of evidence by policymakers, and to assess the state of research in this area, we updated a systematic review.
Systematic review. We searched online databases including Medline, Embase, SocSci Abstracts, CDS, DARE, Psychlit, Cochrane Library, NHSEED, HTA, PAIS, IBSS (Search dates: July 2000 - September 2012). Studies were included if they were primary research or systematic reviews about factors affecting the use of evidence in policy. Studies were coded to extract data on methods, topic, focus, results and population.
145 new studies were identified, of which over half were published after 2010. Thirteen systematic reviews were included. Compared with the original review, a much wider range of policy topics was found. Although still primarily in the health field, studies were also drawn from criminal justice, traffic policy, drug policy, and partnership working. The most frequently reported barriers to evidence uptake were poor access to good quality relevant research, and lack of timely research output. The most frequently reported facilitators were collaboration between researchers and policymakers, and improved relationships and skills. There is an increasing amount of research into new models of knowledge transfer, and evaluations of interventions such as knowledge brokerage.
Timely access to good quality and relevant research evidence, collaborations with policymakers and relationship- and skills-building with policymakers are reported to be the most important factors in influencing the use of evidence. Although investigations into the use of evidence have spread beyond the health field and into more countries, the main barriers and facilitators remained the same as in the earlier review. Few studies provide clear definitions of policy, evidence or policymaker. Nor are empirical data about policy processes or implementation of policy widely available. It is therefore difficult to describe the role of evidence and other factors influencing policy. Future research and policy priorities should aim to illuminate these concepts and processes, target the factors identified in this review, and consider new methods of overcoming the barriers described.
ABSTRACT: Clinical scientists sit at the unique interface between laboratory science and frontline clinical practice, supporting clinical partnerships for evidence-based practice. In an era of molecular diagnostics and personalised medicine, evidence-based laboratory practice (EBLP) is also crucial in helping clinical scientists keep up to date with this expanding knowledge base. However, there are recognised barriers to the implementation of EBLP and its training. The aim of this review is to provide a practical summary of potential strategies for training the next generation of clinician-scientists. Current evidence suggests that clinically integrated evidence-based medicine (EBM) training is effective. Tailored e-learning EBM packages and evidence-based journal clubs have been shown to improve knowledge and skills of EBM. Moreover, e-learning is no longer restricted to computer-assisted learning packages. For example, social media platforms such as Twitter have been used to complement existing journal clubs and provide additional post-publication appraisal information for journals. In addition, how an EBLP curriculum is delivered influences its success. Although e-learning of EBM skills is effective, having EBM-trained teachers available locally promotes the implementation of EBM training. Training courses, such as Training the Trainers, are now available to help trainers identify and make use of EBM training opportunities in clinical practice. Meanwhile, peer-assisted learning and trainee-led support networks can strengthen self-directed learning of EBM and research participation among clinical scientists in training. Finally, we emphasise the need to evaluate any EBLP training programme using validated assessment tools to help identify the most crucial ingredients of effective EBLP training. In summary, we recommend on-the-job training of EBM with additional focus on overcoming barriers to its implementation.
In addition, future studies evaluating the effectiveness of EBM training should use validated outcome tools, endeavour to achieve adequate power and consider the effects of EBM training on learning environment and patient outcomes.