Maximizing the impact of systematic reviews in health care decision making: a systematic scoping review of knowledge-translation resources.

Centre for Reviews and Dissemination, University of York, Heslington, York YO105DD, UK.
Milbank Quarterly (Impact Factor: 5.06). 03/2011; 89(1):131-56. DOI: 10.1111/j.1468-0009.2011.00622.x
Source: PubMed

ABSTRACT: Barriers to the use of systematic reviews by policymakers may be overcome by resources that adapt and present the findings in formats more directly tailored to their needs. We performed a systematic scoping review to identify such knowledge-translation resources and evaluations of them.
Resources were eligible for inclusion in this study if they were based exclusively or primarily on systematic reviews and were aimed at health care policymakers at the national or local level. Resources were identified by screening the websites of health technology assessment agencies and systematic review producers, supplemented by an email survey. Electronic databases and proceedings of the Cochrane Colloquium and HTA International were searched as well for published and unpublished evaluations of knowledge-translation resources. Resources were classified as summaries, overviews, or policy briefs using a previously published classification.
Twenty knowledge-translation resources were identified, of which eleven were classified as summaries, six as overviews, and three as policy briefs. Resources added value to systematic reviews by, for example, evaluating their methodological quality or assessing the reliability of their conclusions or their generalizability to particular settings. The literature search found four published evaluation studies of knowledge-translation resources, and the screening of abstracts and contact with authors found three more unpublished studies. The majority of studies reported on the perceived usefulness of the service, although there were some examples of review-based resources being used to assist actual decision making.
Systematic review producers provide a variety of resources to help policymakers, of which focused summaries are the most common. More evaluations of these resources are required to ensure users' needs are being met, to demonstrate their impact, and to justify their funding.

    ABSTRACT: Background: Clinical Commissioning Groups (CCGs) are mandated to use research evidence effectively to ensure optimum use of resources by the National Health Service (NHS), both in accelerating innovation and in stopping the use of less effective practices and models of service delivery. We intend to evaluate whether access to a demand-led evidence service improves uptake and use of research evidence by NHS commissioners compared with less intensive and less targeted alternatives. Methods/design: This is a controlled before-and-after study involving CCGs in the North of England. Participating CCGs will receive one of three interventions to support the use of research evidence in their decision-making: 1) consulting plus responsive push of tailored evidence; 2) consulting plus an unsolicited push of non-tailored evidence; or 3) standard service unsolicited push of non-tailored evidence. Our primary outcome will be change at 12 months from baseline in a CCG's ability to acquire, assess, adapt and apply research evidence to support decision-making. Secondary outcomes will measure individual clinical leads' and managers' intentions to use research evidence in decision-making. Documentary evidence of the use of the outputs of the service will be sought. A process evaluation will evaluate the nature and success of the interactions both within the sites and between commissioners and researchers delivering the service. Discussion: The proposed research will generate new knowledge of direct relevance and value to the NHS. The findings will help to clarify which elements of the service are of value in promoting the use of research evidence. Those involved in NHS commissioning will be able to use the results to inform how best to build the infrastructure they need to acquire, assess, adapt and apply research evidence to support decision-making and to fulfil their statutory duties under the Health and Social Care Act.
    Implementation Science 01/2015; 10(1):7. Impact Factor: 3.47
    ABSTRACT: Aim: Literature about research use suggests that certain characteristics or capabilities may make policy agencies more evidence-attuned. This study sought to determine policymakers' perceptions of a suite of organisational capabilities identified from the literature as potentially facilitating research uptake in policy decision-making. Method: A literature scan identified eight key organisational capabilities that support research use in policymaking. To determine whether these capabilities were relevant, practical and applicable in real-world policy settings, nine Australian health policymakers were consulted in September 2011. We used an open-ended questionnaire asking what facilitates the use of research in policy and program decision-making, followed by specific questions rating the proposed capabilities. Interviews were transcribed and the content analysed. Results: There was general agreement that the capabilities identified from the literature were relevant to real-world contexts. However, interviewees varied in whether they could provide examples of experiences with the capabilities, how essential they considered the different capabilities to be, and how difficult they considered the capabilities were to achieve. Conclusion: Efforts to improve the use of research in policy decision-making are likely to benefit from targeting multiple organisational capabilities, including staff skills and competence, tools such as templates and checklists to aid evidence use, and leadership support for the use of research in policy development. However, such efforts should be guided by an understanding of how policy agencies use evidence and how they view their roles, and by external factors such as resource constraints and the availability of appropriate research.
    Public Health Research and Practice. 11/2014; 25(1).
    ABSTRACT: Network meta-analyses (NMAs) are complex methodological approaches that may be challenging for non-technical end-users, such as policymakers and clinicians, to understand. Consideration should be given to identifying optimal approaches to presenting NMAs that help clarify analyses. It is unclear what guidance researchers currently have on how to present and tailor NMAs to different end-users. A systematic review of NMA guidelines was conducted to identify guidance on how to present NMAs. Electronic databases and supplementary sources were searched for NMA guidelines. Presentation format details related to sample formats, target audiences, data sources, analysis methods and results were extracted and frequencies tabulated. Guideline quality was assessed following criteria developed for clinical practice guidelines. Seven guidelines were included. Current guidelines focus on how to conduct NMAs but provide limited guidance to researchers on how to best present analyses to different end-users. None of the guidelines provided reporting templates. Few guidelines provided advice on tailoring presentations to different end-users, such as policymakers. Available guidance on presentation formats focused on evidence networks, characteristics of individual trials, comparisons between direct and indirect estimates and assumptions of heterogeneity and/or inconsistency. Some guidelines also provided examples of figures and tables that could be used to present information. Limited guidance exists for researchers on how best to present NMAs in an accessible format, especially for non-technical end-users such as policymakers and clinicians. NMA guidelines may require further integration with end-users' needs, when NMAs are used to support healthcare policy and practice decisions. Developing presentation formats that enhance understanding and accessibility of NMAs could also enhance the transparency and legitimacy of decisions informed by NMAs.
    PLoS ONE 12/2014; 9(12):e113277. Impact Factor: 3.53
