Assessing the impact of biomedical research in academic institutions of disparate sizes

BMC Medical Research Methodology (Impact Factor: 2.27). 05/2009; 9(1):33. DOI: 10.1186/1471-2288-9-33
Source: PubMed


The evaluation of academic research performance is nowadays a priority issue. Bibliometric indicators such as the number of publications, total citation count and the h-index are indispensable tools in this task, but their inherent association with the size of the research output may result in rewarding high production when evaluating institutions of disparate sizes. The aim of this study is to propose an indicator that facilitates the comparison of institutions of disparate sizes.
The Modified Impact Index (MII) was defined as the ratio of the observed h-index (h) of an institution to the h-index expected on average for an institution with the same number of publications (N), i.e. MII = h / (10^α × N^β), where α and β denote the intercept and the slope, respectively, of the line describing the dependence of the h-index on the number of publications on a log10 scale. MII values higher than 1 indicate that an institution performs better than average in terms of its h-index. Data on scientific papers published during 2002-2006 within 36 medical fields by 219 Academic Medical Institutions from 16 European countries were used to estimate α and β and to calculate the MII of their total and field-specific production.
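The definition above translates directly into code. In this sketch the input values are hypothetical placeholders: α and β must in practice be estimated by regressing log10(h) on log10(N) across institutions (the study reports a slope β of roughly 0.4 for biomedicine).

```python
def modified_impact_index(h, n_pubs, alpha, beta):
    """MII = h / (10**alpha * n_pubs**beta): the observed h-index divided
    by the h-index expected on average for an institution producing
    n_pubs publications."""
    expected_h = 10 ** alpha * n_pubs ** beta
    return h / expected_h

# Hypothetical institution: h-index of 35 from 2500 publications,
# with illustrative regression coefficients alpha = 0.45, beta = 0.4.
mii = modified_impact_index(h=35, n_pubs=2500, alpha=0.45, beta=0.4)
```

An MII above 1 means the institution's h-index exceeds what its output volume alone would predict; below 1 means it falls short of that expectation.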
From our biomedical research data, the slope β governing the dependence of the h-index on the number of publications was found to be similar to that estimated in other disciplines (≈ 0.4). The MII was positively associated with the average number of citations per publication (r = 0.653, p < 0.001), the h-index (r = 0.213, p = 0.002) and the number of publications with ≥ 100 citations (r = 0.211, p = 0.004), but not with the number of publications (r = -0.020, p = 0.765). Among the indicators examined, the MII correlated most strongly with the share of country-specific government budget appropriations or outlays for research and development as a percentage of GDP in 2004 (r = 0.229), followed by the average number of citations per publication (r = 0.153), whereas the corresponding correlation coefficient for the h-index was close to 0 (r = 0.029). The MII was calculated for the top 10 European universities in life sciences and biomedicine, as ranked by the Times Higher Education system, and their total and field-specific performance was compared.
The MII should complement the use of h-index when comparing the research output of institutions of disparate sizes. It has a conceptual interpretation and, with the data provided here, can be computed for the total research output as well as for field-specific publication sets of institutions in biomedicine.

    • "Articles were restricted to those published between January 1, 1998, and December 31, 2012, to help ensure relevancy to the contemporary understanding of translational research. We excluded articles that assessed research impact solely on the basis of bibliographic considerations, for example, the impact factor of published research articles (e.g., Rosas, Kagan, Schouten, Slack, & Trochim, 2011; Sypsa & Hatzakis, 2009). Nearly 100 articles matched one or more of these criteria. "
    ABSTRACT: Increasing emphasis is being placed on measuring return on research investment and determining the true impacts of biomedical research for medical practice and population health. This article describes initial progress on development of a new standardized tool for identifying and measuring impacts across research sites. The Translational Research Impact Scale (TRIS) is intended to provide a systematic approach to assessing impact levels using a set of 72 impact indicators organized into three broad research impact domains and nine subdomains. A validation process was conducted with input from a panel of 31 experts in translational research, who met to define and standardize the measurement of research impacts using the TRIS. Testing was performed to estimate the reliability of the experts' ratings. The reliability was found to be high (ranging from .75 to .94) in all of the domains and most of the subdomains. A weighting process was performed assigning item weights to the individual indicators, so that composite scores can be derived.
    Evaluation & the Health Professions 09/2013; 37(1). DOI:10.1177/0163278713506112 · 1.91 Impact Factor
    • "Calculating the h-index by country can be considered an analog method, since it involves only the most cited publications. This index was designed by Hirsch to evaluate the publications of a single investigator (Hirsch 2005) but its use soon spread to wider areas and many variants were developed, applying to universities and countries (Ponce and Lozano 2010; Schubert 2007; Csajbók et al. 2007; Sypsa 2009; Spiroski 2010). Our study shows that those countries with large numbers of published papers and citations included in the 75 percentile are the countries with higher h-index. "
    ABSTRACT: We analyzed the productivity and visibility of publications on the subject category of Clinical Neurology by countries in the period 2000–2009. We used the Science Citation Index Expanded database of the ISI Web of Knowledge. The analysis was restricted to the citable documents. Bibliometric indicators included the number of publications, the number of citations, the median and interquartile range of the citations, and the h-index. We identified 170,483 publications (84.9 % original articles) with a relative increase of 28.5 % throughout the decade. Fourteen countries published over 2,000 documents in the decade and received more than 50,000 citations. The average of citations received per publication was 8 (interquartile range: 3–20) and the h-index was 261. USA was the country with the highest number of publications, followed by Germany, Japan, the UK and Italy. Moreover, USA publications had the largest number of citations received (44.5 % of total), followed by the UK, Germany, Canada, and Italy. On the other hand, Sweden, the Netherlands and the UK had the highest median citations for their total publications. During the period 2000–2009 there was a significant increase in Clinical Neurology publications. Most of the publications and citations originated from only 14 countries, with the USA in the first position and with European countries of relatively low population, such as Switzerland, Austria, Sweden, Belgium, and the Netherlands, also in this top group.
    Scientometrics 06/2012; 95(3). DOI:10.1007/s11192-012-0880-9 · 2.18 Impact Factor
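The h-index that recurs throughout these studies has a simple operational definition: the largest h such that at least h publications have at least h citations each. A minimal sketch of that computation (an illustration, not code from any of the cited papers):

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    # Sort citation counts in descending order, then find the last rank i
    # at which the i-th most cited paper still has at least i citations.
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # prints 4
```

The same function applies unchanged whether the citation list belongs to an author, an institution, or a country, which is why the index spread so readily beyond Hirsch's original per-investigator use.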
    • "They are worried about how the current system rewards raw production rather than 'scholarship that addresses the questions that matter most to society'. Due to the importance of studying each subject field separately; as most rankings focus on top universities using elitist indicators, and because of the concerns expressed by some authors with regard to the inadequacy of most bibliometric indicators when applied to smaller size institutions (Sypsa and Hatzakis 2009); a new methodology is needed. Said methodology should take into account these factors in order to create rankings that allow the comparison between universities that cannot reach the top 100, 250 or 500 worldwide, and do so in a given subject field. "
    ABSTRACT: The problem of comparing academic institutions in terms of their research production is nowadays a priority issue. This paper proposes a relative bidimensional index that takes into account both the net production and the quality of it, as an attempt to provide a comprehensive and objective way to compare the research output of different institutions in a specific field, using journal contributions and citations. The proposed index is then applied, as a case study, to rank the top Spanish universities in the fields of Chemistry and Computer Science in the period ranging from 2000 until 2009. A comparison with the top 50 universities in the ARWU rankings is also made, showing the proposed ranking is better suited to distinguish among non-elite universities.
    Scientometrics 09/2011; 88(3). DOI:10.1007/s11192-011-0418-6 · 2.18 Impact Factor