Payback arising from research funding: Evaluation of the Arthritis Research Campaign

Brunel University London, Uxbridge, England, United Kingdom
Rheumatology (Impact Factor: 4.48). 10/2005; 44(9):1145-56. DOI: 10.1093/rheumatology/keh708
Source: PubMed


Objectives: to use a structured evaluation framework to systematically review and document the outputs and outcomes of research funded by the Arthritis Research Campaign in the early 1990s, and to illustrate the strengths and weaknesses of different modes of research funding.
The payback framework was applied to 16 case studies of research grants funded in the early 1990s. Case study methodology included bibliometric analysis, literature and archival document review and key informant interviews.
A range of research paybacks was identified from the 16 research grants. The payback included 302 peer-reviewed papers; postgraduate training and career development, including 28 PhDs/MDs; research informing recommendations in clinical guidelines; improved quality of life for people with rheumatoid arthritis; and a reduced likelihood of recurrent miscarriage for women with antiphospholipid syndrome. The payback arising from project grants appeared to be similar to that arising from other, better-resourced modes of funding.
There is a wide diversity of research payback. Short focused project grants seem to provide value for money.

    • "The payback framework, developed initially for the health sciences, was one of the initial research evaluation tools that incorporated both academic outputs and societal impact as a criterion for assessment. This framework, which was developed during the 1990s (Buxton and Hanney 1996), has been used to assess a number of health science funding programs, including those in the UK (Wooding et al. 2005), as well as internationally (Bernstein et al. 2006; Kwan et al. 2007; Oortwijn et al. 2008; Scott et al. 2011; Donovan et al. 2014). It uses an outcome-based retrospective, narrative, case study approach to assess a series of five outcome categories of individual paybacks from research. "
    ABSTRACT: The relative newness of ‘impact’ as a criterion for research assessment has meant that there is yet to be an empirical study examining the process of its evaluation. This article is part of a broader study which is exploring the panel-based peer and end-user review process for societal impact evaluation using the UK’s national research assessment exercise, the Research Excellence Framework (REF) 2014, as a case study. In particular, this article explores the different perceptions REF2014 evaluators had regarding societal impact, preceding their evaluation of this measure as part of REF2014. Data are drawn from 62 interviews with evaluators from the health-related Panel A and its subpanels, prior to the REF2014 exercise taking place. We show how going into the REF exercise, evaluators from Panel A had different perceptions about how to characterize impact and how to define impact realization in terms of research outcomes and the research process. We conclude by discussing the implications of our findings for future impact evaluation frameworks, as well as postulating a series of hypotheses about the ways in which evaluators’ different perceptions going into an impact assessment could potentially influence the evaluation of impact submissions. Using REF2014 as a case study, these hypotheses will be tested in interviews with REF2014 evaluators post-assessment.
    Research Evaluation 07/2015; 24(3). DOI:10.1093/reseval/rvv007 · 0.85 Impact Factor
    • "We undertook a sequence of activities drawing initially on existing methods. We adopted successful methodologies used originally to evaluate health services research [10], and further applied to assess the impacts from research funding in diabetes [23], arthritis [11], asthma [16], and health technology assessments [13,14]. But we also developed new methods to expand the analysis. "
    ABSTRACT: Funders of health research increasingly seek to understand how best to allocate resources in order to achieve maximum value from their funding. We built an international consortium and developed a multinational case study approach to assess benefits arising from health research. We used that to facilitate analysis of factors in the production of research that might be associated with translating research findings into wider impacts, and the complexities involved. We built on the Payback Framework and expanded its application through conducting co-ordinated case studies on the payback from cardiovascular and stroke research in Australia, Canada and the United Kingdom. We selected a stratified random sample of projects from leading medical research funders. We devised a series of innovative steps to: minimize the effect of researcher bias; rate the level of impacts identified in the case studies; and interrogate case study narratives to identify factors that correlated with achieving high or low levels of impact. Twenty-nine detailed case studies produced many and diverse impacts. Over the 15 to 20 years examined, basic biomedical research has a greater impact than clinical research in terms of academic impacts such as knowledge production and research capacity building. Clinical research has greater levels of wider impact on health policies, practice, and generating health gains. There was no correlation between knowledge production and wider impacts. We identified various factors associated with high impact. Interaction between researchers and practitioners and the public is associated with achieving high academic impact and translation into wider impacts, as is basic research conducted with a clinical focus. Strategic thinking by clinical researchers, in terms of thinking through pathways by which research could potentially be translated into practice, is associated with high wider impact. 
    Finally, we identified the complexity of factors behind research translation that can arise in a single case. We can systematically assess research impacts and use the findings to promote translation. Research funders can justify funding research of diverse types, but they should not assume academic impacts are proxies for wider impacts. They should encourage researchers to consider pathways towards impact and engage potential research users in research processes.
    Implementation Science 04/2014; 9(1):47. DOI:10.1186/1748-5908-9-47 · 4.12 Impact Factor
    • "There is a growing body of literature that seeks to understand and develop frameworks for assessing, the ways in which research influences policy and practice [10,11]. We drew upon such frameworks in this work, but also approaches to evaluating the impact of capacity building [12]. "
    ABSTRACT: The Fogarty International Center (FIC) has supported research capacity development for over twenty years. While the mission of FIC is supporting and facilitating global health research conducted by U.S. and international investigators, building partnerships between health research institutions in the U.S. and abroad, and training the next generation of scientists to address global health needs, research capacity may impact health policies and programs and therefore have positive impacts on public health. We conducted an exploratory analysis of how FIC research training investments affected public health policy and program development in Kenya and Uganda. We explored the long term impacts of all FIC supported research training programs using case studies, in Kenya and Uganda. Semi-structured in-depth interviews were conducted with 53 respondents and 29 focus group discussion participants across the two countries. Qualitative methods were supplemented by structured surveys of trainees and document review, including a review of evidence cited in policy documents. In the primary focal areas of FIC grants, notably HIV/AIDS, there were numerous examples of work conducted by former FIC trainees that influenced national and global policies. Facilitators for this influence included the strong technical skills and scientific reputations of the trainees, and professional networks spanning research and policy communities. Barriers included the fact that trainees typically had not received training in research communication, relatively few policy makers had received scientific training, and institutional constraints that undermined alignment of research with policy needs. While FIC has not focused its programs on the goal of policy and program influence, its investments have affected global and national public health policies and practice. 
    These influences have occurred primarily through strengthening research skills of scientists and developing strong in-country networks. Further success of FIC and similar initiatives could be stimulated by investing more in the training of policy-makers, seeking to better align research with policy needs through more grants that are awarded directly to developing country institutions, and grants that better incorporate policy maker perspectives in their design and governance. Addressing structural constraints, for example supporting the development of national research agendas that inform university research, would further support such efforts.
    BMC Public Health 08/2013; 13(1):770. DOI:10.1186/1471-2458-13-770 · 2.26 Impact Factor