Questions related to Scientometrics
I developed a study on the scientific character of the field of Sociology in Brazil and discovered thematic variations in research that apparently depend on the geographical location of the graduate program. My current effort is to correlate my scientometric results with the context in which sociological science is practiced. I am looking for partnerships to deepen this and other discussions.
There are different tools (e.g., R, BibExcel) for science mapping. These tools often help in extracting the most highly cited references (with frequency counts). I recently used one of them for practice and noticed several duplicates among the cited references, which made it challenging to determine the unique frequency count of a particular reference. I found the following issues:
1. Missing authors.
2. Abbreviated journal names.
3. Inconsistent text formatting (capitalization).
4. DOIs included in some records but not others.
These were the common reasons for duplicate references. I used regular expressions to fix these problems, but I am looking for a more robust method or tool. Kindly suggest one.
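For anyone facing the same issue, here is a minimal sketch of the regular-expression approach described above. The example reference strings and the normalization rules are illustrative only; a real export will need rules tuned to its own format:

```python
import re

def normalize_ref(ref: str) -> str:
    """Normalize a cited-reference string so near-duplicate variants
    collapse to a single key (a sketch; adapt rules to your export)."""
    ref = ref.lower().strip()                # unify capitalization
    ref = re.sub(r"doi:\s*\S+", "", ref)     # drop appended DOIs
    ref = re.sub(r"[^\w\s]", " ", ref)       # strip punctuation
    ref = re.sub(r"\s+", " ", ref).strip()   # collapse whitespace
    return ref

# Two variants of the same cited reference (made-up example strings):
refs = [
    "SMALL H, 1973, J AM SOC INFORM SCI, V24, P265, DOI: 10.1002/asi.4630240406",
    "Small H, 1973, J Am Soc Inform Sci, V24, P265",
]
keys = {normalize_ref(r) for r in refs}
print(len(keys))  # → 1  (both variants collapse to one key)
```

Counting the normalized keys instead of the raw strings then gives the unique frequency of each reference.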
I am performing a bibliometric analysis of 2200 papers ranging from the year 1741 to 2020, and I wish to divide the network analyses (of collaboration between authors and between institutions) into a few stages of productivity. So far, I am only familiar with the industrial life-cycle approach, but it does not seem well suited to assessing productivity from the publication of scientific articles. In this sense, I would like to ask: what is the best way to divide scientific productivity in a field of knowledge into a few stages (ranges of years for subsequent analyses)?
Thank you very much
There are many fast journals that publish within a minimum of 15 and a maximum of 45 days.
Many scientists need fast publication.
- Bibliometrics, Scientometrics and Informetrics were introduced in the twentieth century (and some others later). Those years had their own values, behaviors and characteristics in the world of higher education. Fraud, misconduct, data fabrication and plagiarism were not as widespread as they are now. In recent years we have been introduced to terms such as gift and ghost authorship.
- But we are still using old methods (in new forms) in metric studies, e.g. scientometrics, to say who is who in science, or where is where in science... In the age of virtual worlds available to everyone, everywhere, even the meaning of "publish or perish" has changed. Many universities around the globe not only put pressure on their faculty members but also offer incentives in different forms, such as cash, to publish more and more... Too many authors have become addicted to seeing their works published at any cost. They show off their published works, citations, … and "ordinary scientometrics", whether descriptive or analytic in nature, has so far helped them. All of this appears to become "soft power" and a "Matthew effect" for those who might not deserve it. We have to move to something "interpretive", "inferential" or the like to pull back the curtain (please see also my other two discussions related to this one).
I wanted to know the similarities and differences between the following terms.
What are the different ways to measure all the metrics above, and is there any good research study (or studies) covering all aspects of these metrics?
Apart from Google Search or Google Scholar, any other way to find a bibliometric analysis paper?
For example, a Systematic Review (SR) would use the International Prospective Register of Systematic Reviews (PROSPERO).
What about Bibliometric Analysis?
Any search database...
I am looking for a list of recommended journals for bibliometric / scientometric studies.
I am aware of Scientometrics (Springer) and the Journal of Informetrics (Elsevier); however, I am looking for more options.
I am looking for software/tools that can be used for scientometric analysis. I would appreciate a comparative analysis of a few of these tools based on some parameters.
The altmetrics score takes into account interactions, via social networks, between the general public and the research community. It gives another kind of value to scientific research: it allows the research community to measure, even with uncertainties, the level of societal appreciation. In this context, I would like to know how effective it is.
I want to start an online discussion on bibliometric and scientometric tools, to learn from you and share my own experience. We could create a WhatsApp group and hold weekly or monthly meetings on Microsoft Teams.
If you agree with this idea, please share your number and your willingness to join the group.
The impact of research is analyzed and measured using scientometric indicators. One such indicator used to assess scientific research is citation analysis, where the number of citations helps gauge the quality of scientific information.
I need a research colleague for a mini research project, in the form of a letter to the editor or short communication. This person must be proficient in searching and know which database to use to collect the data. He/she should extract the CVs of about 60 authors. His/her name will be written as second author on the paper; if he/she helps more, he/she may be listed as first author (not promised). Let me know if anyone has the desire and time.
Is there any software that can export a bibliographic coupling matrix for analysis in SPSS? I know BibExcel can do it, but it is very complex.
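As a workaround while looking for a dedicated tool, the matrix itself is simple to build and export as a CSV that SPSS can read. A sketch, assuming you can get each document's set of cited references into a dictionary (the document names and references below are made up):

```python
import csv

# Hypothetical input: each document mapped to its set of cited references.
docs = {
    "Paper A": {"r1", "r2", "r3"},
    "Paper B": {"r2", "r3", "r4"},
    "Paper C": {"r5"},
}

names = sorted(docs)
# Bibliographic coupling strength = number of references two documents share.
matrix = [[len(docs[a] & docs[b]) if a != b else 0 for b in names] for a in names]

with open("coupling_matrix.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow([""] + names)            # header row for the SPSS import
    for name, row in zip(names, matrix):
        w.writerow([name] + row)
```

The resulting square matrix can be read into SPSS as a plain CSV for clustering or multidimensional scaling.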
Hi everyone, I'm looking for software that helps me clean data from bibliographic databases. I don't know how to program, so I really need a tool that is easy to use (maybe with a little bit of programming). I used VantagePoint before, but I no longer have access to it :( Help me please.
When looking at the sociology of science we aim to understand the social aspects of science. We look at the relationship to institutions as well as the influence on and construction of scientific knowledge.
Are there any points in the history of science where, due to its sociological aspects, it was not neutral? And if so, how did that come to be?
For my thesis, my supervisor and I have decided to look at how questions are posed in research across different cultures. In order to do this, we will use framing analysis. The issue is, however, that I need to operationalise the sociology of science before I can start the analysis.
I need to figure out how to operationalise this preoccupation in social research: what could possibly not be neutral, and where does it come from?
I really hope some of you can give me a helping hand or a direction to look in because I am really struggling.
Since she is an expert in scientometrics, I am trying to locate her in order to invite her to review an article on the subject.
Lecture on New Developments in Scientometrics. Today I gave my lecture entitled "New Developments in Scientometrics". The challenge that arises is the gap between the ideals and the current situation of scientometrics. I categorized the evolution of scientometrics into three periods: the beginning, the growth, and the maturity. Scientometrics has seen remarkable progress in its relatively short history. I outlined the progress made in five areas: approaches, frameworks, tools, measures, and practices. I believe that all five areas need improvement. Today, it is possible to apply #Block_Chain technologies in scientific evaluations. Moving to a new position will yield outstanding achievements. What is your opinion?
I am fully aware that the question is quite broad, and yet, do you know of any estimates of the number of annual publications in, let's say, psychology? Or any other discipline belonging to the social sciences? I am not interested in individual achievements, but rather in global numbers.
In advance, thank you for your inputs.
All the best,
Publishing and citing behavior varies across fields. Different fields prefer different dissemination channels for research activity: in the social sciences, books are preferred over journal articles, while in computer science, results are mostly published in conference papers. The number of references per article also varies across disciplines. Similarly, some journals are multidisciplinary, some are open access and some are closed access. The impact factor does not solve the problem of comparing journals across domains. Which metrics, measures or factors could be important in comparing journals within and across disciplines? References to any related articles will be highly appreciated.
In the literature, there is controversy over whether there is a need for the treatment of uncertainties in scientometric studies. What is your opinion?
Which types of research collaboration are associated with increased citation impact: individual, institutional, or international?
Over the last few decades in Latin America, I have seen countries such as Brazil, Mexico, Venezuela, Peru and Colombia move to organize national research systems, in which research groups and researchers are officially classified on a periodic basis. E.g., in Colombia, after criteria-based measurement (based mainly on the historic and recent performance of articles and other products), research groups are classified into four levels: A1 (highest recognition), A, B and C; researchers are classified as junior, associate and senior (highest recognition). My question is: in which other countries does a national science agency officially classify and certify research groups and researchers?
Recent results (2017) of classification of research groups and researchers in Colombia by the national agency Colciencias (in Spanish):
Alfonso J. Rodriguez-Morales
Research Group Public Health and Infection A1,
Universidad Tecnologica de Pereira,
Dear colleagues, I really would like to know your opinion. I am asking specifically for electrical engineering field and related fields.
What value of the h-index (Web of Science citations only) is considered "good enough" for an Assistant Professor or Associate Professor position in your country (or university)?
Did anybody ask you for your h-index value when you applied for a job?
- The h-index reflects both the number of publications and the number of citations per publication. For example, a scientist with an h-index of 20 has 20 papers cited at least 20 times each.
- The g-index looks at the overall citation record, allowing highly cited articles to bolster lower-cited ones. For instance, a scientist with 20 papers, 15 of which have no citations and the remaining five of which have 350, 35, 10, 4 and 1 citations respectively, would have a g-index of 20 but an h-index of 4 (four papers with at least 4 citations each).
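Both definitions above are straightforward to compute from a citation list. A small sketch in plain Python that reproduces the example record:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    # Once a paper's citations drop below its rank, all later papers do too,
    # so counting the qualifying prefix gives h directly.
    return sum(1 for rank, c in enumerate(cites, start=1) if c >= rank)

def g_index(citations):
    """Largest g such that the top g papers together have >= g^2 citations."""
    cites = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(cites, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

# The example record above: 15 uncited papers plus five papers
# with 350, 35, 10, 4 and 1 citations.
record = [350, 35, 10, 4, 1] + [0] * 15
print(h_index(record), g_index(record))  # → 4 20
```

The run confirms the worked example: h-index 4, g-index 20.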
I’m looking for options to publish in mycology (mycology-specific or broader journals). In 2011, Hyde and KoKo published a very interesting and applicable paper entitled “Where to publish in mycology?”, in which they provided information on the major journals that publish manuscripts entirely devoted to mycology. However, new journals have been created in recent years, and (unfortunately) open access has forced authors to choose journals with very expensive publication fees, while other journals still charge readers for access to our articles. Because of this, I would like to know: where are you publishing your papers? Is it worth publishing without open access? Which journals do you most recommend for fungal biology/ecology publications? I’m asking because I'm very worried about the elitism of scientific knowledge and how it can delay scientific progress and the dissemination of “what we do in our labs/universities”.
Thanks for your response!
Hyde & KoKo (2011): https://bit.ly/2IuUQrQ.
I have submitted a manuscript for publication, and the editor has repeatedly instructed me to correct the citation format, which must have page numbers at the end of each citation. I am doing it with the Zotero Standalone citation manager.
The following are the examples provided by the editor of the journal. I want to know the exact name of this citation format.
1. Changqing Cao and James D. Seymour (eds.), Through Dissident Chinese Eyes: Essays on Self-Determination (New York: M.E Sharpe Inc, 1998), pp.59-67.
2. Farzana Shakoor, ‘The Kargil Crisis: An Analysis’, Pakistan Horizon, Vol. 52, No. 3, July 1999, p.50.
Sir/Madam, apart from the bibliometrix package, what other R packages are required and available, particularly for scientometric studies? Also, are they compatible with RStudio?
To do some research, I need a list of job titles that scientometrics and bibliometrics experts can hold; data analyst and research librarian are two examples.
Taxonomic bias in research papers is well established, but the underlying drivers are poorly understood. As professional scientists, we are under enormous pressure to publish, and the type of sophisticated research that appeals to the top journals often requires a well researched study system. This potentially limits research on understudied species. Moreover, limited resources mean that scientists study what is practically convenient rather than the species in most need of research. We are also motivated by personal biases, with many of us drawn to work on charismatic/iconic species.
We are currently constructing a conceptual model to better understand the drivers of taxonomic bias in conservation research, and I would love to hear about people's experiences of why they ended up working on a particular species.
Currently, there are several software packages for conducting bibliometric studies. Some of them are:
- PoP (Publish or Perish)
Regardless of what kind of analysis you want to conduct: which software is interesting for you based on criteria such as user-friendliness, computations, graphs, etc.?
Currently I am working on a plugin for the Gephi platform that imports citation networks from Crossref resources.
Below you can watch it in action:
These networks can then be further analyzed with Gephi's functions.
As we know, Crossref is a valuable store of metadata on scientific works.
What other valuable open-access bibliographic data sources (in terms of completeness, quality, and well-defined web access) for scientific paper citations (that could be used for bibliometric, webometric and scientometric analysis) would you recommend?
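To illustrate the kind of access Crossref offers: its public REST API returns a work's metadata, including a `reference` list whose entries often carry a `DOI` field, which is enough to build directed citation edges. A sketch (the sample message and the demo DOI `10.1234/demo` are made up):

```python
import json
from urllib.request import urlopen

CROSSREF_WORKS = "https://api.crossref.org/works/"

def extract_cited_dois(message):
    """Pull the DOIs of cited references out of a Crossref 'works' message."""
    return [ref["DOI"] for ref in message.get("reference", []) if "DOI" in ref]

def cited_dois(doi):
    """Fetch one work from the Crossref REST API and return its cited DOIs."""
    with urlopen(CROSSREF_WORKS + doi) as resp:
        return extract_cited_dois(json.load(resp)["message"])

# Offline demonstration with a made-up message: each (citing, cited) pair
# becomes one directed edge of the citation network.
sample = {"reference": [{"key": "r1", "DOI": "10.1000/x"}, {"key": "r2"}]}
edges = [("10.1234/demo", cited) for cited in extract_cited_dois(sample)]
```

Note that not every reference entry in Crossref carries a resolved DOI, so coverage varies by publisher.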
Hi, how can I convert or import Scopus data into HistCite to analyze it the same way as Web of Science data imported into HistCite? Thanks, I'm waiting for your answer.
This is a question about the new conditions that researchers are facing; I will appreciate your contribution.
The following is a translation from the Persian of this site: http://isid.research.ac.ir/AboutUs.php
Perhaps it is not a perfect translation. We are evaluated via Google Scholar and Scopus, not for articles published in PubMed and PMC.
How can we solve this problem? Is there any site or database for the Iranian Scientometric Information Database (ISID) that uses the PubMed and PMC databases?
Please consider the following.
Scientometrics is the science of science measurement and analysis that measures the scientific output of researchers, universities, and countries in the form of quantitative variables. Scientometric indices include indicators for assessing the quantity and quality of scientific output of researchers that can be the basis for evaluation, ranking and promotion of faculty members.
The Iranian Scientometric Information Database (ISID) was designed and implemented in 1394 (Iranian calendar) by the Center for the Development and Coordination of Information and Scientific Publications of the Deputy for Research and Technology of the Ministry of Health, Treatment and Medical Education, with the aim of extracting and displaying the scientific indicators of faculty members of Iran's medical sciences universities. In the ISID system, the general information of faculty members of the medical sciences universities, including name, university, faculty and research center of employment, academic rank, field and highest level of education, is entered by the scientometric experts of the research and technology departments of the medical universities. The scientometric indices of faculty members in this system include the number of published articles, total citations received, average citations per article, the h-index, the h-index without self-citations, the h-index without self-citations by any of the authors, and the h-index excluding book citations.
Method of collecting and providing information
By default, results in the ISID system are ordered by individuals' h-index. Note that the ordering can be changed by clicking the arrow next to each column title on the home page.
In the scientometric system, faculty members' information can be filtered by university name, research center, discipline, and academic rank. In addition, searching for people by name is also possible.
The Hirsch index is one of the important scientometric indicators. It was proposed in 2005 by Jorge E. Hirsch, a professor of physics at the University of California, San Diego, to capture both the quantity and the quality of a researcher's scientific output. A researcher's h-index is h if h of his or her papers have each been cited at least h times.
The scientometric indices in the ISID system are calculated from the latest data extracted from the Scopus database. Other information on each faculty member in this system, such as a photo, a Google Scholar profile address, or a CV, is displayed by clicking on the faculty member's name, provided it has been entered by the relevant university's scientometric expert.
Dear faculty members: if you see any errors in the information in the system, you can request a correction by sending the feedback form visible on each person's individual page, or by contacting your university's scientometric expert to request and follow up on the correction.
I'm not interested in scientometric indicators of the impact of published research, but in identifying research tendencies on the basis of the objects of study, the methodology used, and the problems addressed.
Dear Professor Grushka, do you intend to write something about the theory of changeable sets for non-expert readers? I think that your works on the subject are very interesting in many respects, but usually too technical for non-mathematicians or for mathematicians working in a different field. Since this theory has applications in both physics and mathematics, I believe such an introduction would be very interesting! With my best regards, Ricardo Vieira.
Bruno Latour, among other ANT scholars, has been cited in Science and Technology Studies around the thesis of "connaissance applicable non appliquée" (CANA: applicable knowledge that is not applied). Are there, in the Latin American context, any examples that run against the CANA thesis?
Which measures might researchers use to present a more intelligently calculated picture, if they are asked for their h-index?
I have an Excel export file from my literature management software (Citavi) and want to import the data into Gephi for a co-citation analysis. As far as I know, I have to prepare an 'affiliation matrix' for a bipartite network (each row one node/article and each column one node/article from the reference lists). I have an Excel file in which the analyzed articles are correctly listed in the rows. However, there is one column (the reference list) in which each cell contains all the references of the corresponding article. I hope I have made clear which issue I am facing; to clarify it, I made a screenshot of part of the table.
I would highly appreciate any suggestions on how to transfer this table into a Gephi-compatible file.
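One way to handle the packed reference cells without special software is to split each cell into one (article, reference) pair per cited work and write a Source/Target edge-list CSV, which Gephi's spreadsheet importer accepts. A sketch with made-up rows; the column names and the semicolon separator are assumptions about the export format:

```python
import csv

# Hypothetical export: one row per article, with the whole reference list
# packed into a single cell, separated here by semicolons.
rows = [
    {"article": "Doe 2020", "references": "Smith 2001; Lee 2010"},
    {"article": "Roe 2021", "references": "Lee 2010"},
]

edges = []
for row in rows:
    for ref in row["references"].split(";"):
        ref = ref.strip()
        if ref:
            edges.append((row["article"], ref))  # article -> cited work

with open("edges.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["Source", "Target"])  # column names Gephi's importer expects
    w.writerows(edges)
```

Gephi builds the bipartite network from this edge list directly, so the intermediate affiliation matrix is not strictly needed.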
Thank you very much!
This is a general question about the relation between philosophy of science and social studies of science. Having worked for a while on the boundary between both fields, my impression is that hardly any connection exists between them. The usual explanation I get from scholars in both fields is that social studies of science are concerned with describing what science is (descriptive) and philosophy of science is concerned with why science works (normative).
Now my question is, hasn't this organic divide between both fields become obsolete as a result of recent trends in the evaluation of scientific research? Philosophers of science might claim to be conducting the "normative" study of science, but paradoxically it is the work in "descriptive" disciplines such as scientometrics that is today driving research policies worldwide. It seems that the normative discussions that matter today are being carried out no longer by philosophers in philosophy of science journals, but by statisticians in the methodological sections of their papers.
My impression is that philosophers tend to react to this in a patronizing way, uttering their general scepticism about the quality and relevance of scientometric data. But isn't that too easy, given that this data is de facto informing research policies worldwide? Shouldn't philosophers of science at least try to claim some of that ground? Because maybe, just maybe, policymakers are right in ignoring philosophers. I mean, would you trust doctors' normative advice about your health if you know those doctors are ignoring the largest available body of data about their field of expertise?
"Oh, but the statistical data about science is not relevant," I often hear. I'm afraid this kind of statement says more about the relevance of current philosophy of science than it says about scientometric data. I mean, why couldn't philosophers of science find a way to make that data relevant now that it's there anyway?
A reason why science is successful might lie not just with the truth of its statements or the methods used by its agents, but also with the evolution of its structure. If institutional economists like Douglass North can use the evolution of the structure of the economy to explain economic growth, why couldn't philosophers use the evolution of the structure of science to explain scientific progress? With the digitization of scientific research a generation of philosophers of science -for the first time ever!- can study the evolution of the structure of science and philosophize about its connection to scientific progress. It might be that there's no such connection, but how would we ever know this for sure without philosophers of science trying to operationalize existing theories of scientific progress? (for an example of this approach, see my paper "A comparison of two models of scientific progress")
So again my question: is it the case that this divide between the normative and descriptive study of science exists to this day, and isn't it time it came to an end? All opinions as well as pointers to relevant communities working on the edge of both disciplines would be much appreciated.
There has been a recent buzz about altmetrics for scientific articles. Some people describe them as the societal impact of articles, while others describe them as the influence an article creates within its community. But do they really measure impact, given that altmetric data can be easily manipulated? How can we really say they measure societal impact? What do altmetrics really measure?
Cited references can be seen as the main motivation an author draws on in preparing the current article. These cited references carry important information for establishing a connection between the article and its subject domain. Many people want to claim that this interconnection exists. What I want to know is: are there any research papers that establish that cited references can be a useful tool for connecting an article to a subject area?
Does anyone have literature comparing the publication characteristics/patterns of scientists at R&D institutions with those of university teachers?
Is there a way to search the Acknowledgements field of “all” published scientific papers? I can’t find such a field in Web of Science (but maybe I just don’t know where to look). Do you know of any other database / search engine that would let me do this?
I'm looking to quantify the use of various words and phrases.
Recently, I read in a Spanish newspaper about the use of the Fh index for assessing the scientific excellence of researchers, defined as an individual's h-index weighted by the average h-index of the 3,784 researchers indexed in the same area of knowledge.
Is the Fh index a widely used criterion? In which scientific database (ISI Web of Science, Scopus or others) does the weighted average h-index of researchers for a specific area of knowledge appear?
It is obvious that bibliometrics / scientometrics studies use quantitative approaches to measure publications, citations, collaborations etc. But what about qualitative methodologies in these fields? Is there any popular approach or well-known work on implementing qualitative methods on scientometrics studies?
Scientometric evaluation is important for all of us. However, the methodologies for measuring our scientific publications are varied, controversial and subjective. How, then, should we address the issue of the impact factor, citation index or h-index?
Of course this must vary widely among disciplines, institutions, and regions. Does anybody have an example, preferably mentioning these three variables to provide context?
There are some metrics available for evaluating research and researchers, for example publication-based metrics: citations, impact factors, the h-index, altmetrics, etc. But how can one evaluate mission-based research, which does not result in papers in high-impact journals or in citations?
We want to collect (and then code) a number of the key 'authoritative' articles containing certain keywords. The EBSCO ranking algorithm is based purely on the text content of an article (and, by the way, it is excellent for that purpose), but it does not allow any ranking based on how often an article is cited. Is there a way of doing this in a systematic and automated way?
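If a citation count can be obtained for each article in the result set (for instance, Crossref exposes an `is-referenced-by-count` field per DOI, and Web of Science and Scopus exports include citation counts), the ranking step itself is easy to automate. A sketch with made-up records:

```python
# Hypothetical records: articles found by the keyword search, each enriched
# with a citation count (e.g. Crossref's 'is-referenced-by-count' field,
# or a count exported from Web of Science / Scopus).
records = [
    {"doi": "10.1/a", "cited_by": 12},
    {"doi": "10.1/b", "cited_by": 98},
    {"doi": "10.1/c", "cited_by": 5},
]

def rank_by_citations(records):
    """Sort article records by citation count, most-cited first."""
    return sorted(records, key=lambda r: r["cited_by"], reverse=True)

top = rank_by_citations(records)
print([r["doi"] for r in top])  # → ['10.1/b', '10.1/a', '10.1/c']
```

The hard part remains matching the EBSCO result list to a source of citation counts (e.g. by DOI or title), not the ranking itself.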
I'm working with VantagePoint, but I would like to know about other options. My aim is to analyze interactions between institutions (e.g. university-industry).
Commercial companies that compute citation indexes force publishers to enter into contracts.
Publications from publishing houses that have not signed contracts are not considered in the calculation.
"While it is incorrect to say that the impact factor gives no information about individual papers in a journal, the information is surprisingly vague and can be dramatically misleading." [Joint Committee on Quantitative Assessment of Research. Citation Statistics. A report from the International Mathematical Union (IMU) in cooperation with the International Council of Industrial and Applied Mathematics (ICIAM) and the Institute of Mathematical Statistics (IMS). 2008].
I mean a formal assessment of research output, as in the UK and other countries. If you know of papers that talk about this, that would be useful too. Thanks in advance.
The recent paper by Omar Hernando Avila-Poveda touches a sore spot: it shows that the rather complex naming conventions of Latin cultures may add an extra burden for those trying to launch an academic career in the publish-or-perish world. Details such as compound surnames or accents within Latin names may impair the chances that internet robot programs will find and acknowledge your contributions. Do you agree with Omar's suggestions regarding a standard Latin-origin author name? Any tips for younger, less experienced, wannabe authors?
I am not sure that citations really mean scientific excellence, but they do mean impact in the scientific world. I am seeking ways to maximize citation numbers while keeping an ethical approach (no citation-cartel suggestions, please).
Thanks for this article. In citation networks, it is often difficult to account for negative citations. This article may help us come up with a way to deal with this problem. Any thoughts on this challenge (from anyone in scientometrics) are welcome!
In the age of the Internet and the open-access movement, it appears possible to publish articles in any journal, and not only in the elite ones. Such articles will be located using the Google Scholar engine, downloaded if their full-text versions are available, and cited if found interesting. So by submitting articles to such journals, scientists save time on long communications with the editors and reviewers of elite journals, while articles published in low-impact-factor journals will eventually reach the same audience faster and may get the same number of citations. Whether or not an article is cited is becoming increasingly less dependent on the journal in which it is published. In the open access era, articles become available right away, and you do not need to look through volumes and issues of journals as before. Have you experienced any of these trends when you began to publish your articles in open access journals or to make your articles from low-impact journals available through open access facilities? Has anyone tried to trace a connection between the total downloads of articles (for example, on RG) and the number of their citations?
See also Larivière, V., Lozano, G. A., & Gingras, Y. (2013). Are elite journals declining?
If there is one, I would like to know the relationship between Universal Impact Factor (www.uifactor.org) values and Thomson Reuters impact factors. If somebody has compared journals evaluated by both companies, I would like to know the result. The main question is: what is the usability of the UIF? Can the UIF characterize the value (impact) of a journal well?