Article

SAVVY SEARCHING: Five-year impact factor data in the Journal Citation Reports

Authors:
  • University of Hawaiʻi

Abstract

Purpose - The purpose of this paper is to examine the five-year journal impact factor (JIF) score of the Journal Citation Reports (JCR).

Design/methodology/approach - The paper looks at one of the important enhancements to the JCR: the new five-year JIF score. This element complements the traditional JIF scores and data, and addresses the criticism that the short citation window is inadequate for evaluating the medium-term performance of nearly 8,000 scholarly and professional journals.

Findings - Some of the other proposals made by leading scientometricians for improving the JIF and its alternatives may well be implemented in various specialty editions of the JCR. Particularly interesting would be adding scores computed through diachronous instead of, or in addition to, synchronous measurement; creating new indicators based on the level of uncitedness of articles in journals; and calculating percentile JIFs, JIF point averages and/or JIFs based on article counts, with or without self-citations.

Originality/value - The five-year, medium-term JIF complements the short-term two-year JIF very well for indicating the prestige, reputation and influence of journals through the prism of their average productivity and the citedness of the articles they publish over a longer time span. Breaking down the various indicators by disciplinary and subdisciplinary categories, or even by the language and country of publication of the journals (not the country affiliation of the authors), can provide further insight into the landscape of scholarly publishing.
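The relationship between the two-year and five-year synchronous JIF can be sketched as follows. The publication and citation counts are illustrative, made-up figures for a hypothetical journal, not data from the JCR:

```python
def impact_factor(citations_by_year, items_by_year, jcr_year, window):
    """Synchronous JIF: citations received in `jcr_year` to items
    published in the preceding `window` years, divided by the number
    of citable items published in those years."""
    years = range(jcr_year - window, jcr_year)
    cites = sum(citations_by_year[jcr_year].get(y, 0) for y in years)
    items = sum(items_by_year.get(y, 0) for y in years)
    return cites / items if items else 0.0

# Illustrative (made-up) data for a hypothetical journal.
items = {2004: 100, 2005: 110, 2006: 105, 2007: 120, 2008: 115}
# cites[citing_year][cited_year] -> citation count
cites = {2009: {2004: 90, 2005: 130, 2006: 150, 2007: 180, 2008: 140}}

jif2 = impact_factor(cites, items, 2009, 2)  # two-year JIF
jif5 = impact_factor(cites, items, 2009, 5)  # five-year JIF
```

In this toy example the two-year JIF is 320/235 while the five-year JIF is 690/550: widening the window smooths out year-to-year fluctuations and credits citations that arrive later than two years after publication.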


Article
The journal indexes currently used to rank journals adopt a short citation window, which has long been criticized. In particular, journals with a prolonged impact are underestimated by this method. By combining the features of synchronic and diachronic journal impact factors, Fang (Scientometrics 125:2265–2282, 2020) proposed the integral synchronic journal impact factor (IIF). The citation window of the IIF can be enlarged without reducing the index value. Therefore, the IIF can effectively and fairly reflect the impact of journals with different citation patterns on the scientific community. To overcome the defect that the IIF might be unstable if a journal publishes few papers in individual years, this work modifies the IIF by borrowing the idea of the new CiteScore: using the citation counts of the adjacent previous years to assist in evaluating the impact level of the journal in the focal year. The modified index can be regarded as a combination of the JIFs and CiteScore. The experimental results show that the modified journal index is more stable over time than the IIF while retaining the associated advantages.
Article
Full-text available
Drug repurposing has become an effective approach to drug discovery, as it offers a new way to explore drugs. Based on the Science Citation Index Expanded (SCI-E) and Social Sciences Citation Index (SSCI) databases of the Web of Science core collection, this study presents a bibliometric analysis of drug repurposing publications from 2010 to 2020. Data were cleaned, mined, and visualized using Derwent Data Analyzer (DDA) software. An overview of the history and development trend of the number of publications, major journals, major countries, major institutions, author keywords, major contributors, and major research fields is provided. There were 2,978 publications included in the study. The findings show that the United States leads in this area of research, followed by China, the United Kingdom, and India. The Chinese Academy of Science published the most research studies, and NIH ranked first on the h-index. The Icahn School of Medicine at Mt Sinai leads in the average number of citations per study. Sci Rep, Drug Discov. Today, and Brief. Bioinform. are the three most productive journals evaluated from three separate perspectives, and pharmacology and pharmacy are unquestionably the most commonly used subject categories. Cheng, FX; Mucke, HAM; and Butte, AJ are among the top 20 most prolific and influential authors. Keyword analysis shows that in recent years, most research has focused on drug discovery/drug development, COVID-19/SARS-CoV-2/coronavirus, molecular docking, virtual screening, cancer, and other research areas. The hotspots have changed in recent years, with COVID-19/SARS-CoV-2/coronavirus being the most popular topic for current drug repurposing research.
Article
In this study, we introduce a new literature-aging conceptual model to study the citation curve and discuss its implications. First, we improve the conceptual model by adding a period to describe the "death" of citations. Second, we offer a feasible operationalization of this conceptual model and apply it to a set of cross-discipline publications in the Web of Science to test its performance. Furthermore, we propose two measurements derived from the new model, the "Sleeping Period" and the "Recognition Period", to capture the citation curve patterns of publications. For instance, we find that half of the papers in Arts & Humanities published in 1985 received no or extremely few citations in the first 5 years after their publication; after that, on average, those papers had a 5-year-long period in which their citations grew rapidly. In addition, we observe a special phenomenon named "literature revival", in which some publications have multiple citation life-cycles; this has received little attention in current research. Finally, we discuss the implications of our study, especially the application of the Sleeping Period and Recognition Period to improving scientific evaluation and collection development in libraries, and the insights offered by "literature revival".
Article
Full-text available
Using the Social Sciences Citation Index database of WoS, Brazilian scientific articles addressing topics and methods related to digital transformation in Information and Documentation are identified for the period 2010-2019. Pivot tables are built from the titles and keywords of the 1,039 retrieved articles. The titles are then categorized by experts, while the keywords are processed with an n-gram model and the Fruchterman-Reingold filter. Their co-occurrences are visualized in clusters produced with the Louvain community-detection method. The results are presented in annotated tables and charts covering the internationalization, co-authorship and interdisciplinarity of the articles, along with the research techniques they employ. The thematic progression of the articles dealing with the digital environment and digital transformation is also studied. The findings show a numerous and significant contribution by Brazilian authors to the study of the effects that digital scenarios have on the information sector.
Article
Full-text available
The main purpose of the paper is to provide a short review of the problem of implementing scientometric indicators for research evaluation in the context of Ukraine. Peculiarities of the usage of key scientometric terms in normative documents are examined. A number of case studies are given to illustrate the ambiguity of applying particular indicators to rate authors, research groups, institutions or scientific journals. The importance of balance between expert evaluation and quantitative analysis in the national system of research evaluation is highlighted, and the inadmissibility of any manipulation of scientometric terms and notions is underscored.
Article
Full-text available
We propose the I3* indicator as a non-parametric alternative to the journal impact factor (JIF) and h-index. We apply I3* to more than 10,000 journals. The results can be compared with other journal metrics. I3* is a promising variant within the general scheme of non-parametric I3 indicators introduced previously: I3* provides a single metric which correlates with both impact in terms of citations (c) and output in terms of publications (p). We argue for weighting using four percentile classes: the top-1% and top-10% as excellence indicators; the top-50% and bottom-50% as output indicators. Like the h-index, which also incorporates both c and p, I3*-values are size-dependent; however, division of I3* by the number of publications (I3*/N) provides a size-independent indicator which correlates strongly with the 2- and 5-year journal impact factors (JIF2 and JIF5). Unlike the h-index, I3* correlates significantly with both the total number of citations and publications. The values of I3* and I3*/N can be statistically tested against the expectation or against one another using chi-squared tests or effect sizes. A template (in Excel) is provided online for relevant tests.
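The percentile-class weighting idea behind I3-type indicators can be sketched as follows. The class weights and the journal data here are illustrative assumptions for exposition, not the authors' exact I3* scheme:

```python
def i3_style_score(citation_percentiles, weights=None):
    """Percentile-class-weighted I3-type score: each paper contributes
    the weight of the highest percentile class it falls into.
    `citation_percentiles` holds each paper's citation percentile
    (100 = most highly cited) within its field and year."""
    # Illustrative weights for the four classes named in the abstract:
    # top-1%, top-10%, top-50%, bottom-50%. The real I3* weighting
    # scheme may differ.
    if weights is None:
        weights = {"top1": 4, "top10": 3, "top50": 2, "bottom50": 1}
    score = 0
    for p in citation_percentiles:
        if p >= 99:
            score += weights["top1"]
        elif p >= 90:
            score += weights["top10"]
        elif p >= 50:
            score += weights["top50"]
        else:
            score += weights["bottom50"]
    return score

# Made-up citation percentiles for six papers of a hypothetical journal.
percentiles = [99.5, 95.0, 80.0, 60.0, 30.0, 10.0]
total = i3_style_score(percentiles)   # size-dependent, like I3*
per_paper = total / len(percentiles)  # size-independent, like I3*/N
```

The size-dependent `total` grows with output (as the h-index does), while dividing by the number of publications gives the size-independent variant the abstract reports as correlating strongly with JIF2 and JIF5.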
Article
Full-text available
Background: An important and controversial attribute of the traditional impact factor is its 2-year citation window. Several scholars have proposed using different citation time windows for evaluating journals, but there is no confirmation that a longer citation window is necessarily better. How do the journal evaluation effects of 3IF, 4IF, and 6IF compare with those of 2IF and 5IF? To answer these questions, we compared impact factors with different citation time windows against the peer-review scores of ophthalmologic journals indexed in the Science Citation Index Expanded (SCIE) database. Methods: The peer-review scores of 28 ophthalmologic journals were obtained through a self-designed survey questionnaire. Impact factors with different citation time windows (2IF, 3IF, 4IF, 5IF, and 6IF) of the 28 journals were computed and compared according to each impact factor's definition and formula, using the citation analysis function of the Web of Science (WoS) database. The correlation between impact factors with different citation time windows and peer-review scores was then analysed. Results: Although the impact factor values with different citation time windows differed, they were highly correlated when used to evaluate journals. For the 2013 impact factors of ophthalmologic journals, 3IF and 4IF appeared to be the ideal windows when assessed against peer-review scores. In addition, the 3-year and 4-year windows were quite consistent with the cited peak age of documents published by ophthalmologic journals.
Research limitations: Our study is based on ophthalmology journals and analyses only the impact factors with different citation time windows in 2013, so it remains to be ascertained whether other disciplines (especially those with a later citation peak) or other years would follow the same or similar patterns. Originality/value: We designed the survey questionnaire ourselves, specifically to assess the real influence of journals, and used peer-review scores to judge the journal evaluation effect of impact factors with different citation time windows. The main purpose of this study was to help researchers better understand the role of impact factors with different citation time windows in journal evaluation.
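The kind of comparison described above amounts to a rank correlation between window-based impact factors and peer-review scores. A minimal sketch, using a hand-rolled Spearman correlation and made-up figures for six hypothetical journals rather than the study's data:

```python
def rank(values):
    """Average 1-based ranks, with ties sharing the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1  # extend over a run of tied values
        avg = (i + j) / 2 + 1  # mean of the 1-based positions i+1..j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Made-up 3IF values and peer-review scores for six hypothetical journals.
if3  = [4.2, 3.1, 2.8, 2.0, 1.5, 0.9]
peer = [8.5, 8.9, 7.2, 6.0, 5.1, 4.8]

rho = spearman(if3, peer)  # close to 1 when the two rankings agree
```

A high rho for one window (here the rankings disagree only on the top two journals, so rho is about 0.94) is the sense in which the study judges a citation window "ideal" relative to peer assessment.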
Article
Scientometric indicators influence the standing of journals among peers, thus affecting decisions regarding manuscript submissions, scholars’ careers, and funding. Here we hypothesize that impact-factor boosting (unethical behavior documented previously in several underperforming journals) should not be considered exceptional, but that it affects even the top-tier journals. We performed a citation analysis of documents recently published in 11 prominent general science and biomedical journals. In these journals, only 12 to 79% of what they publish was considered original research, whereas editorial materials alone constituted 11 to 44% of the total document types published. Citations to commissioned opinion articles comprised 3 to 15% of the total citations to the journals within 3 postpublication years, with an even higher share occurring during the first postpublication year. An additional 4 to 15% of the citations were received by the journals from commissioned opinion articles published in other journals. Combined, the parallel world of uncitable documents was responsible for up to 30% of the total citations to the top-tier journals, with the highest values found for medical science journals (New England Journal of Medicine, JAMA, and the Lancet) and lower values found for the Science, Nature, and Cell series journals. Self-citations to some of the top-tier journals reach values higher than the total citation counts accumulated by papers in most of the Web of Science-indexed journals. Most of the self-citations were generated by commissioned opinion articles. The parallel world of supposedly uncitable documents flourishes and severely distorts the commonly used scientometric indicators.
Article
Full-text available
This paper presents a quantitative analysis of arts management/marketing articles in leading general management/marketing journals, including an examination of the extent to which those top-tier journal articles on arts/culture-related topics cite authors of leading arts management journal articles. Using bibliometric techniques, this study examines the content of 20 top-tier management and marketing journals over 22 years to identify articles published on arts management/marketing, which authors were cited, and from which arts management/marketing journals. Analysis indicates that: relatively few citations in the top management/marketing journals reference arts management/marketing journals; assessment of interaction between the parent management/marketing disciplines and the arts management/marketing sub-discipline indicates that authors draw upon a large reserve of diverse literatures; and arts-related management/marketing articles in the top journals tend to cite journal articles grounded in the social sciences and the aesthetics of management, with an increasing trend of citations to arts management/marketing journals.
Article
Purpose – The traditional, annually issued Journal Citation Reports (JCR) have been enhanced since the 2007 edition by the Eigenfactor Scores (EFS), the Article Influence Scores (AIS) and the five‐year Journal Impact Factor (JIF‐5). This paper focuses on these new indicators. Design/methodology/approach – These scientometric indicators are also available from the Eigenfactor Project web site, which uses data from the yearly updates of the JCR. Findings – Although supposedly identical data sources are used for computing the metrics, there are differences in the absolute scores reported, which in turn resulted in significant changes (more than ten rank positions) for several journals in the sample. Originality/value – The differences in the scores and rank positions by the three new scientometric indicators for 52 journals in the Information and Library Science category were analysed to determine the range of differences and the extent of changes in rank positions.
Article
Full-text available
A review of Garfield's journal impact factor and its specific implementation as the Thomson Reuters Impact Factor reveals several weaknesses in this commonly-used indicator of journal standing. Key limitations include the mismatch between citing and cited documents, the deceptive display of three decimals that belies the real precision, and the absence of confidence intervals. These are minor issues that are easily amended and should be corrected, but more substantive improvements are needed. There are indications that the scientific community seeks and needs better certification of journal procedures to improve the quality of published science. Comprehensive certification of editorial and review procedures could help ensure adequate procedures to detect duplicate and fraudulent submissions.
Article
The evaluation of scientific journals with bibliometric indicators has been dominated by the impact factor since the 1970s. However, Thomson has recently included the Eigenfactor and the Article Influence Score in the Journal Citation Reports, while Elsevier has included the Source Normalized Impact per Paper (SNIP) and the SCImago Journal Rank (SJR) in Scopus. In this paper we introduce and describe these indicators. To study their similarities, we then analyse correlations between the traditional indicators and the new ones, detailing the results across 27 scientific fields. Some pairs of indicators, such as Eigenfactor–citations, impact factor–Article Influence Score, impact factor–SJR and SJR–Article Influence Score, correlate in many areas. The correlations showed different trends in the sciences and the social sciences; therefore, in the last section we discuss the need to take the scientific area into account when selecting an indicator.