Interventions to improve the use of systematic reviews in decision-making by health system managers, policy makers and clinicians

UK Cochrane Centre, National Institute for Health Research, Oxford, UK.
Cochrane Database of Systematic Reviews, 09/2012; 9(9):CD009401. DOI: 10.1002/14651858.CD009401.pub2
Source: PubMed


BACKGROUND: Systematic reviews provide a transparent and robust summary of existing research. However, health system managers, national and local policy makers and healthcare professionals can face several obstacles when attempting to use this evidence, including constraints operating within the health system, the large volume of research evidence, and difficulties in adapting evidence from systematic reviews so that it is locally relevant. In an attempt to increase the use of systematic review evidence in decision-making, a number of interventions have been developed. These include summaries of systematic review evidence designed to improve the accessibility of review findings (often referred to as information products), and changes to organisational structures, such as employing specialist groups to synthesise the evidence to inform local decision-making.
OBJECTIVES: To identify and assess the effects of information products based on the findings of systematic review evidence, and of organisational supports and processes, designed to support the uptake of systematic review evidence by health system managers, policy makers and healthcare professionals.
SEARCH METHODS: We searched The Cochrane Library, MEDLINE, EMBASE, CINAHL, Web of Science and the Health Economic Evaluations Database. We also handsearched two journals (Implementation Science and Evidence & Policy), Cochrane Colloquium abstracts, the websites of key organisations and the reference lists of studies considered for inclusion. Searches were run from 1992 to March 2011 on all databases; an update search to March 2012 was run on MEDLINE only.
SELECTION CRITERIA: Randomised controlled trials (RCTs), interrupted time series (ITS) studies and controlled before-after (CBA) studies of interventions designed to aid the use of systematic reviews in healthcare decision-making were considered.
DATA COLLECTION AND ANALYSIS: Two review authors independently extracted the data and assessed study quality. For each study, we extracted the median value across similar outcomes and reported the range of values underlying each median. Where an even number of outcomes was reported, we calculated the median as the midpoint of the two middlemost values.
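The median-and-range summary described above can be sketched in a few lines of Python. This is purely illustrative and not from the review itself; the effect sizes used in the example are hypothetical.

```python
# Illustrative sketch: summarising a study's similar outcomes by the
# median effect size and the range of values, per the rule stated above.

def median_effect(outcomes):
    """Return the median of a list of outcome effect sizes.

    With an even number of outcomes, the median is the midpoint of the
    two middlemost values, matching the rule described in the review.
    """
    ordered = sorted(outcomes)
    n = len(ordered)
    mid = n // 2
    if n % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

def effect_range(outcomes):
    """Return the (min, max) range reported alongside each median."""
    return (min(outcomes), max(outcomes))

# Hypothetical per-outcome effect sizes (percentage-point changes):
effects = [-11.2, 1.3, 4.2, 18.2]
print(median_effect(effects))  # midpoint of the two middlemost values: 2.75
print(effect_range(effects))   # (-11.2, 18.2)
```

With an even number of outcomes, as here, the reported median does not correspond to any single observed outcome, which is why the review also reports the range.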
MAIN RESULTS: We included eight studies evaluating the effectiveness of different interventions designed to support the uptake of systematic review evidence. The overall quality of the evidence was very low to moderate.

Two cluster RCTs evaluated multifaceted interventions, which included access to systematic reviews relevant to reproductive health, intended to change obstetric care; high baseline performance on some of the key clinical indicators limited the findings of these studies. In selected obstetric units in Thailand, there was no statistically significant effect on clinical practice for all but one of the clinical indicators (median effect size 4.2%, range -11.2% to 18.2%), and in Mexico there was none (median effect size 3.5%, range 0.1% to 19.0%). In the second cluster RCT there were no statistically significant differences in selected obstetric units in the UK (median effect RR 0.92, range RR 0.57 to RR 1.10).

One RCT evaluated the perceived understanding and ease of use of summary of findings tables in Cochrane Reviews; the median difference in responses on the acceptability of including summary of findings tables versus not including them was 16% (range 1% to 28%). One RCT evaluated the effect of an analgesic league table derived from systematic review evidence and found no statistically significant effect on self-reported pain. Only one RCT evaluated an organisational intervention (which included a knowledge broker, access to a repository of systematic reviews and provision of tailored messages); it reported no statistically significant difference in evidence-informed programme planning.

Three interrupted time series studies evaluated the dissemination of printed bulletins based on evidence from systematic reviews. A statistically significant reduction was reported in the rates of surgery for glue ear in children under 10 years (mean annual decline -10.1%; 95% CI -12.3 to -7.9) and in children under 15 years (quarterly reduction -0.044; 95% CI -0.080 to -0.011). The distribution to general practitioners of a bulletin on the treatment of depression was associated with a statistically significantly lower prescribing rate each quarter than that predicted from the prescribing rates observed before distribution of the bulletin (8.2%; P = 0.005).
AUTHORS' CONCLUSIONS: Mass mailing a printed bulletin that summarises systematic review evidence may improve evidence-based practice when there is a single clear message, the change is relatively simple to accomplish, and there is growing awareness among users of the evidence that a change in practice is required. If the intention is to develop awareness and knowledge of systematic review evidence, and the skills for implementing this evidence, a multifaceted intervention that addresses each of these aims may be required, although there is insufficient evidence to support this approach.

ABSTRACT: The gap between research and practice or policy is often described as a problem. To identify new barriers to and facilitators of the use of evidence by policymakers, and to assess the state of research in this area, we updated a systematic review. We searched online databases including Medline, Embase, SocSci Abstracts, CDS, DARE, Psychlit, Cochrane Library, NHSEED, HTA, PAIS and IBSS (search dates: July 2000 - September 2012). Studies were included if they were primary research or systematic reviews about factors affecting the use of evidence in policy. Studies were coded to extract data on methods, topic, focus, results and population. 145 new studies were identified, of which over half were published after 2010. Thirteen systematic reviews were included. Compared with the original review, a much wider range of policy topics was found. Although still primarily in the health field, studies were also drawn from criminal justice, traffic policy, drug policy, and partnership working. The most frequently reported barriers to evidence uptake were poor access to good quality relevant research and lack of timely research output. The most frequently reported facilitators were collaboration between researchers and policymakers, and improved relationships and skills. There is an increasing amount of research into new models of knowledge transfer, and evaluations of interventions such as knowledge brokerage. Timely access to good quality and relevant research evidence, collaborations with policymakers, and relationship- and skills-building with policymakers are reported to be the most important factors influencing the use of evidence. Although investigations into the use of evidence have spread beyond the health field and into more countries, the main barriers and facilitators remained the same as in the earlier review. Few studies provide clear definitions of policy, evidence or policymaker, nor are empirical data about policy processes or implementation of policy widely available. It is therefore difficult to describe the role of evidence and other factors influencing policy. Future research and policy priorities should aim to illuminate these concepts and processes, target the factors identified in this review, and consider new methods of overcoming the barriers described.
BMC Health Services Research 01/2014; 14(1):2. DOI: 10.1186/1472-6963-14-2
ABSTRACT: Clinical scientists are at the unique interface between laboratory science and frontline clinical practice for supporting clinical partnerships for evidence-based practice. In an era of molecular diagnostics and personalised medicine, evidence-based laboratory practice (EBLP) is also crucial in helping clinical scientists keep up to date with this expanding knowledge base. However, there are recognised barriers to the implementation of EBLP and its training. The aim of this review is to provide a practical summary of potential strategies for training clinician-scientists of the next generation. Current evidence suggests that clinically integrated evidence-based medicine (EBM) training is effective. Tailored e-learning EBM packages and evidence-based journal clubs have been shown to improve knowledge and skills of EBM. Moreover, e-learning is no longer restricted to computer-assisted learning packages; for example, social media platforms such as Twitter have been used to complement existing journal clubs and provide additional post-publication appraisal information for journals. In addition, the delivery of an EBLP curriculum has an influence on its success. Although e-learning of EBM skills is effective, having EBM-trained teachers available locally promotes the implementation of EBM training. Training courses, such as Training the Trainers, are now available to help trainers identify and make use of EBM training opportunities in clinical practice. On the other hand, peer-assisted learning and trainee-led support networks can strengthen self-directed learning of EBM and research participation among clinical scientists in training. Finally, we emphasise the need to evaluate any EBLP training programme using validated assessment tools to help identify the most crucial ingredients of effective EBLP training. In summary, we recommend on-the-job training of EBM with additional focus on overcoming barriers to its implementation.
In addition, future studies evaluating the effectiveness of EBM training should use validated outcome tools, endeavour to achieve adequate power and consider the effects of EBM training on learning environment and patient outcomes.
The Clinical Biochemist Reviews 08/2013; 34(2):93-103.