Article

Assessing the impact of biomedical research in academic institutions of disparate sizes

BMC Medical Research Methodology (Impact Factor: 2.17). 05/2009; 9:33. DOI: 10.1186/1471-2288-9-33
Source: PubMed

ABSTRACT The evaluation of academic research performance is nowadays a priority issue. Bibliometric indicators such as the number of publications, total citation counts and h-index are an indispensable tool in this task but their inherent association with the size of the research output may result in rewarding high production when evaluating institutions of disparate sizes. The aim of this study is to propose an indicator that may facilitate the comparison of institutions of disparate sizes.
The Modified Impact Index (MII) was defined as the ratio of the observed h-index (h) of an institution over the h-index anticipated for that institution on average, given the number of publications (N) it produces, i.e. MII = h / (10^α × N^β), where α and β denote the intercept and the slope, respectively, of the line describing the dependence of the h-index on the number of publications on a log10 scale. MII values higher than 1 indicate that an institution performs better than average in terms of its h-index. Data on scientific papers published during 2002-2006 within 36 medical fields for 219 Academic Medical Institutions from 16 European countries were used to estimate α and β and to calculate the MII of their total and field-specific production.
From our biomedical research data, the slope β governing the dependence of the h-index on the number of publications in biomedical research was found to be similar to that estimated in other disciplines (approximately 0.4). The MII was positively associated with the average number of citations/publication (r = 0.653, p < 0.001), the h-index (r = 0.213, p = 0.002), and the number of publications with ≥ 100 citations (r = 0.211, p = 0.004), but not with the number of publications (r = -0.020, p = 0.765). It was the indicator most highly associated with the share of country-specific government budget appropriations or outlays for research and development as a percentage of GDP in 2004 (r = 0.229), followed by the average number of citations/publication (r = 0.153), whereas the corresponding correlation coefficient for the h-index was close to 0 (r = 0.029). The MII was calculated for the 10 top-ranked European universities in life sciences and biomedicine, as provided by the Times Higher Education ranking system, and their total and field-specific performance was compared.
The MII should complement the use of h-index when comparing the research output of institutions of disparate sizes. It has a conceptual interpretation and, with the data provided here, can be computed for the total research output as well as for field-specific publication sets of institutions in biomedicine.
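The MII definition above can be sketched numerically. Below is a minimal Python sketch using the slope β ≈ 0.4 reported in the abstract; the intercept α and all input numbers are hypothetical, since the abstract does not report them.

```python
def modified_impact_index(h, n_publications, alpha, beta=0.4):
    """Return MII = h / (10**alpha * N**beta): the observed h-index divided
    by the h-index expected, on average, for an institution producing
    n_publications papers. alpha and beta are the intercept and slope of
    the log10 regression of h-index on publication count."""
    expected_h = (10 ** alpha) * (n_publications ** beta)
    return h / expected_h

# Illustrative only: alpha = 0.3 and the h/N values are hypothetical.
mii = modified_impact_index(h=40, n_publications=2000, alpha=0.3)
# An MII > 1 would indicate above-average performance for this output size.
```

Because the expected h-index grows with N^β, two institutions with the same observed h-index but different publication counts receive different MII values, which is what makes the indicator size-adjusted.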

    • "They are worried about how the current system rewards raw production rather than 'scholarship that addresses the questions that matter most to society'. Due to the importance of studying each subject field separately; as most rankings focus on top universities using elitist indicators, and because of the concerns expressed by some authors regarding the inadequacy of most bibliometric indicators when applied to smaller-size institutions (Sypsa and Hatzakis 2009); a new methodology is needed. Said methodology should take into account these factors in order to create rankings that allow the comparison between universities that cannot reach the top 100, 250 or 500 worldwide, and do so in a given subject field. "
    ABSTRACT: The problem of comparing academic institutions in terms of their research production is nowadays a priority issue. This paper proposes a relative bidimensional index that takes into account both the net production and the quality of it, as an attempt to provide a comprehensive and objective way to compare the research output of different institutions in a specific field, using journal contributions and citations. The proposed index is then applied, as a case study, to rank the top Spanish universities in the fields of Chemistry and Computer Science in the period ranging from 2000 until 2009. A comparison with the top 50 universities in the ARWU rankings is also made, showing the proposed ranking is better suited to distinguish among non-elite universities.
    Scientometrics 09/2011; DOI:10.1007/s11192-011-0418-6 · 2.27 Impact Factor
    • "The search for publications identified 199 studies published between 2005 and 2010 as article (n = 172), grey literature (n = 15), conference proceeding (n = 11), or book section (n = 1). Thirty-five out of the 199 publications reported all information required for the meta-analysis presented here: (1) at least one correlation coefficient between the h index and an h index variant, and (2) the sample size of the study (Antonakis & Lalive, 2008; Arencibia-Jorge & Rousseau, 2009; Bornmann, Marx, & Schier, 2009; Bornmann, Mutz, & Daniel, 2008; Bornmann, Mutz, Daniel, Wallon, & Ledin, 2009; Cabrerizo, Alonso, Herrera- Viedma, & Herrera, 2010; Costas & Bordons, 2008; de Visscher, 2010; Franceschet, 2009; García-Pérez, 2009; Harzing & van der Wal, 2008; Haslam & Laham, 2010; Hu et al., 2010; Hua, Wan, & Wu, 2010; Jin, Liang, Rousseau, & Egghe, 2007; Kosmulski, 2006; Lee, Kraus, & Couldwell, 2009; Liu & Rousseau, 2007, 2009; Lovegrove & Johnson, 2008; Mingers, 2009; Moussa & Touzani, 2010; Opthof & Wilde, 2009; Rousseau et al., 2010; Ruane & Tol, 2007; Sanderson, 2008; Schreiber, 2008a, 2009a, 2009b; Schubert, Korn, & Telcs, 2009; Sypsa & Hatzakis, 2009; Tol, 2009; Vinkler, 2009; Wohlin, 2009; Wu, 2010). Since three papers by Schreiber (2008a, 2009a, 2009b) and two papers by Liu and Rousseau (2007, 2009) refer to one and the same dataset, the number of studies that could be included in the meta-analysis decreased from 35 to 32. "
    ABSTRACT: This paper presents the first meta-analysis of studies that computed correlations between the h index and variants of the h index (such as the g index; in total 37 different variants) that have been proposed and discussed in the literature. A high correlation between the h index and its variants would indicate that the h index variants hardly provide added information to the h index. This meta-analysis included 135 correlation coefficients from 32 studies. The studies were based on a total sample size of N=9005; on average, each study had a sample size of n=257. The results of a three-level cross-classified mixed-effects meta-analysis show a high correlation between the h index and its variants: Depending on the model, the mean correlation coefficient varies between .8 and .9. This means that there is redundancy between most of the h index variants and the h index. There is a statistically significant study-to-study variation of the correlation coefficients in the information they yield. The lowest correlation coefficients with the h index are found for the h index variants MII and m index. Hence, these h index variants make a non-redundant contribution to the h index.
    Journal of Informetrics 07/2011; 5(3):346-359. DOI:10.1016/j.joi.2011.01.006 · 4.23 Impact Factor
    ABSTRACT: We analyzed the productivity and visibility of publications in the subject category of Clinical Neurology by country in the period 2000–2009. We used the Science Citation Index Expanded database of the ISI Web of Knowledge. The analysis was restricted to citable documents. Bibliometric indicators included the number of publications, the number of citations, the median and interquartile range of the citations, and the h-index. We identified 170,483 publications (84.9 % original articles) with a relative increase of 28.5 % throughout the decade. Fourteen countries published over 2,000 documents in the decade and received more than 50,000 citations. The average number of citations received per publication was 8 (interquartile range: 3–20) and the h-index was 261. The USA was the country with the highest number of publications, followed by Germany, Japan, the UK and Italy. USA publications also received the largest number of citations (44.5 % of the total), followed by the UK, Germany, Canada, and Italy. Sweden, the Netherlands and the UK had the highest median citations for their publications. During 2000–2009 there was a significant increase in Clinical Neurology publications; most publications and citations originated from only 14 countries, with the USA in the first position and European countries with relatively small populations, such as Switzerland, Austria, Sweden, Belgium, and the Netherlands, also in this top group.
    Scientometrics 06/2012; 95(3). DOI:10.1007/s11192-012-0880-9 · 2.27 Impact Factor