Commentary: Team Science
Dr. O'Brien is associate dean for research strategy, School of Medicine, University of California, San Francisco, San Francisco, California. Dr. Yamamoto is executive vice dean, School of Medicine, and vice chancellor for research, University of California, San Francisco, San Francisco, California. Dr. Hawgood is dean, School of Medicine, and vice chancellor for medical affairs, University of California, San Francisco, San Francisco, California.
Academic Medicine: Journal of the Association of American Medical Colleges (impact factor 2.93), February 2013; 88(2):156-157. DOI: 10.1097/ACM.0b013e31827c0e34
A revolution in biomedical science is under way, and participation demands the successful integration of new technologies and concepts drawn from many fields, including but not limited to the biological sciences, the physical sciences, and engineering. This integration, often called team science or interdisciplinary science, is easy to conceive but surprisingly hard to achieve. The authors reflect on the emerging ways teams assemble, confront institutional and cultural barriers, and integrate trainees. They focus in particular on the article by Ravid and colleagues in this issue of Academic Medicine, which describes three years of their institution's successful experiment to foster interdisciplinary science. The authors acknowledge the impressive outcomes of this experiment but argue that the research community should be thinking ahead about how to evaluate whether the output of team-based science actually has more impact in changing paradigms and opening new avenues of research; whether more risk-taking science is performed when science is team based; whether there are fundamental implications for the organization of academic health systems, schools, and departments; what the implications are for training students; and what the short- and long-term implications are for investigator reward and development.
Available from: fasebj.org
ABSTRACT: There has been a dramatic increase in the number and percentage of publications in biomedical and clinical journals in which two or more coauthors claim first authorship, with a change in some journals from no joint first authorship in 1990 to co-first authorship of >30% of all research publications in 2012. As biomedical and clinical research become increasingly complex and team-driven, and given the importance attributed to first authorship by grant reviewers and promotion and tenure committees, the time is ripe for journals, bibliographic databases, and authors to highlight equal first-author contributions of published original research. Conte, M. L., Maat, S. L., Omary, M. B. Increased co-first authorships in biomedical and clinical publications: a call for recognition.
Available from: iai.asm.org
ABSTRACT: As the body of scientific knowledge in a discipline increases, there is pressure for specialization. Fields spawn subfields that then become entities in themselves and promote further specialization. The process by which scientists join specialized groups has remarkable similarities to the guild system of the Middle Ages. The advantages of specialization in science include efficiency, the establishment of normative standards, and the potential for greater rigor in experimental research. However, specialization also carries risks of monopoly, monotony, and isolation. The current tendency to judge scientific work by the impact factor of the journal in which it is published may have roots in over-specialization, as scientists are less able than before to critically evaluate work outside their field. Scientists in particular define themselves through group identity and adopt practices that conform to the expectations and dynamics of such groups. As part of our continuing analysis of issues confronting contemporary science, we analyze the emergence and consequences of specialization in science, with a particular emphasis on microbiology, a field highly vulnerable to balkanization along microbial phylogenetic boundaries, and suggest that specialization carries significant costs. We propose measures to mitigate the detrimental effects of scientific specialism.
Available from: Jonathan Daniel Wren
As the amount of scientific data grows, peer-reviewed Scientific Data Analysis Resources (SDARs) such as published software programs, databases, and web servers have had a strong impact on the productivity of scientific research. SDARs are typically linked to via an Internet URL, and such URLs have been shown to decay in a time-dependent fashion. What is less clear is whether SDAR-producing group size or prior experience in SDAR production correlates with SDAR persistence, or whether certain institutions or regions account for a disproportionate number of peer-reviewed resources.
We first quantified the current availability of over 26,000 unique URLs published in MEDLINE abstracts/titles over the past 20 years, then extracted authorship, institutional and ZIP code data. We estimated which URLs were SDARs by using keyword proximity analysis.
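The abstract does not spell out how the keyword proximity analysis worked, so the following Python sketch is only a rough illustration of the general technique: flag a URL as a likely SDAR when resource-like keywords occur near it in the abstract text. The regex, keyword list, and window size are all assumptions, not the authors' actual pipeline.

```python
import re

# Simple URL matcher; real MEDLINE text needs more careful handling.
URL_RE = re.compile(r'https?://\S+|www\.\S+', re.IGNORECASE)

# Hypothetical keyword set suggesting a data-analysis resource.
SDAR_KEYWORDS = {"database", "server", "software", "tool", "web", "available"}

def find_sdar_urls(abstract, window=60):
    """Return URLs whose surrounding text (within `window` characters)
    contains an SDAR-like keyword -- a crude keyword-proximity test."""
    hits = []
    for m in URL_RE.finditer(abstract):
        context = abstract[max(0, m.start() - window): m.end() + window].lower()
        if any(k in context for k in SDAR_KEYWORDS):
            # Strip trailing punctuation the regex may have captured.
            hits.append(m.group(0).rstrip('.,;)'))
    return hits
```

Under this sketch, a sentence like "Our software tool is freely available at http://example.org/tool." would be classified as an SDAR link, while a URL with no resource-like vocabulary nearby would not.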
We identified 23,820 non-archival URLs produced between 1996 and 2013, of which 11,977 were classified as SDARs. Production of SDARs, as measured with the Gini coefficient, is more widely distributed among institutions (0.62) and ZIP codes (0.65) than scientific research in general, which tends to be disproportionately clustered within elite institutions (0.91) and ZIPs (0.96). The top 1% of institutions produced an estimated 68% of published research overall, whereas the top 1% accounted for only 16% of SDARs. Some labs produced many SDARs (maximum detected = 64), but 74% of SDAR-producing authors have published only one SDAR. Interestingly, decayed SDARs have significantly fewer authors on average (4.33 ± 3.06) than available SDARs (4.88 ± 3.59) (p < 8.32 × 10^-4). Approximately 3.4% of URLs, as published, contain errors in their entry/format, including DOIs and links to clinical trials registry numbers.
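The Gini coefficient used above measures how concentrated output is across producers: 0 means perfectly even distribution, values near 1 mean output is concentrated in very few institutions or ZIP codes. A minimal sketch of the standard computation (this is the textbook formula, not necessarily the exact estimator the authors used):

```python
def gini(counts):
    """Gini coefficient of a list of non-negative counts.

    0 = output spread evenly across producers;
    values approaching 1 = output concentrated in a few producers.
    """
    xs = sorted(counts)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Standard formula on sorted values x_(1) <= ... <= x_(n):
    #   G = (2 * sum(i * x_(i)) / (n * total)) - (n + 1) / n
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cum) / (n * total) - (n + 1) / n
```

For example, `gini([5, 5, 5, 5])` is 0 (four institutions producing equally), while `gini([0, 0, 0, 100])` is 0.75, the maximum possible with four producers.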
SDAR production is less dependent upon institutional location and resources than research output in general, and SDAR online persistence does not seem to be a function of infrastructure or expertise. Yet SDAR team size correlates positively with SDAR accessibility, suggesting a possible sociological factor involved. While a detectable URL entry error rate of 3.4% is relatively low, it raises the question of whether this is a general error rate that extends to additional published entities.