PERSPECTIVE
Imagining the “open” university: Sharing
scholarship to improve research and
education
Erin C. McKiernan*
Departamento de Física, Facultad de Ciencias, Universidad Nacional Autónoma de México, Mexico City, Mexico
*emckiernan@ciencias.unam.mx
Abstract
Open scholarship, such as the sharing of articles, code, data, and educational resources,
has the potential to improve university research and education as well as increase the
impact universities can have beyond their own walls. To support this perspective, I present
evidence from case studies, published literature, and personal experiences as a practicing
open scholar. I describe some of the challenges inherent to practicing open scholarship and
some of the tensions created by incompatibilities between institutional policies and personal
practice. To address this, I propose several concrete actions universities could take to sup-
port open scholarship and outline ways in which such initiatives could benefit the public as
well as institutions. Importantly, I do not think most of these actions would require new fund-
ing but rather a redistribution of existing funds and a rewriting of internal policies to better
align with university missions of knowledge dissemination and societal impact.
Introduction
Over the last few years, we have seen growth of grassroots movements to increase access to
scholarly products, such as articles, code, data, and educational resources (e.g., [1–5]). We
have also seen a rise in the number of government and private funders mandating open access
and open data [6,7] and the emergence of the Open Research Funders Group (http://www.
orfg.org). These initiatives have been key in raising awareness and acceptance of open scholar-
ship. However, despite these advances, I believe we have hit a wall that is impeding widespread
adoption. While increasing numbers of academics may ideologically support sharing their
work, many are concerned with how these practices will affect their career prospects and
advancement [8–13].
Academic institutions are one of the primary influencers affecting how faculty perceive
open scholarship and how willing they are to engage in certain practices [8,13,14]. Faculty
often cite a lack of institutional support for open access, especially in evaluations, as one reason
they are reluctant to publish in these journals [11]. Moreover, faculty express fear that open
scholarship practices, especially those that fall outside the traditionally rewarded research
PLOS Biology | https://doi.org/10.1371/journal.pbio.1002614 October 24, 2017 1 / 25
OPEN ACCESS
Citation: McKiernan EC (2017) Imagining the
“open” university: Sharing scholarship to improve
research and education. PLoS Biol 15(10):
e1002614. https://doi.org/10.1371/journal.pbio.1002614
Published: October 24, 2017
Copyright: ©2017 Erin C. McKiernan. This is an
open access article distributed under the terms of
the Creative Commons Attribution License, which
permits unrestricted use, distribution, and
reproduction in any medium, provided the original
author and source are credited.
Funding: This article was originally a white paper
submitted as part of a conference jointly supported
by the U.S. National Science Foundation (NSF) and
the National Institutes of Health (NIH) entitled,
"Imagining Tomorrow’s University: Rethinking
scholarship, education, and institutions for an
open, networked era" (http://www.ncsa.illinois.edu/
Conferences/ImagineU/inputs.html), held March
8th and 9th in Rosemont, IL. Funding for this event
was provided in part by NSF grant ACI-1645571
(PI: Daniel S. Katz) and NIH grants 5 U24
ES026465 02 and 3 U24 ES026465 02S1 (PI: John
Darrell Van Horn).
Competing interests: The opinions expressed
herein are those of the author and not necessarily
those of her institution or affiliated organizations.
The author is the founder of the "Why Open
Research?" project, an open scholarship advocacy
products, will not only not be rewarded but may even hurt their evaluations. For example, one
respondent of a 2011 survey of medical faculty [15] wrote,
To my knowledge, community-engaged scholarship is perhaps a liability in the promotion pro-
cess, because it slows work down and may result in fewer publications. Publications, by the
number, still reign supreme here.
Faculty understandably pay attention to what institutions value and where evaluation com-
mittees place the most weight to decide where to invest the most personal effort. As a Univer-
sity of Idaho faculty member wrote in response to a 2013 survey [11],
What will we value at tenure and promotion? That will be the predominant driver of what we
as a university community do. If public outreach and measure of its effectiveness can be cap-
tured and it becomes highly valued—then maybe that’s what we’ll be doing instead.
A 2015 survey in the United Kingdom found that academics are increasingly tailoring their
scholarly production and publication decisions to fit institutional evaluation criteria [16]. Thus,
I believe universities are in a unique position to support open scholarship and break through
some of the barriers to widespread adoption. This support could come in many forms, includ-
ing recognition of open access and open data in promotion and tenure evaluations, small grants
to support the development of open educational resources, and redirecting existing funds from
proprietary software to support creation and training in open source solutions. Simple actions
could demonstrate that universities value sharing, thereby changing faculty behavior. Such sup-
port could, in turn, have benefits for institutions, such as increased funding, visibility, and
recruiting power. Most importantly, the sharing of scholarly outputs could help universities
meet their stated missions to create and disseminate knowledge for broader public good.
What should universities consider “open scholarship”?
There is no one unanimously accepted definition of open scholarship; the debate continues as
to what the minimum requirements and best practices are for different types of open content
[17]. Some of the earliest and perhaps most well-accepted international open standards are the
Budapest Open Access Initiative (2002) [18], the Bethesda Statement (2003) [19], and the Ber-
lin Declaration (2003) [20]—all of which deal with open access to articles.
At the time these declarations were written, they were revolutionary, and their original lan-
guage still guides open scholarship efforts today. However, research has rapidly changed over
the last 10–15 years, and projects are now producing much more than just articles, including
large amounts of data, different types of digital media, electronic notebooks, and complex soft-
ware. In recent years, open science has emerged as an umbrella term to refer to open access,
open data, open notebooks, open source, or any other aspect of our work as researchers that
can be shared [21,22]. International standards for these products have emerged, including the
Open Source Definition (2007) [23] for openly licensed software and the Panton Principles for
open data (2010) [24].
More recently, there has been recognition that “open science” may not be as inclusive a
term as we might like [25], and some have opted instead to refer to “open research” to include
disciplines like the humanities [26,27]. I will use the even broader term “open scholarship” to
encompass sharing of research and nonresearch products, such as those arising from educa-
tional and outreach activities [28,29]. I see inclusivity as crucial to the success of open scholar-
ship as a social movement. While open scholarship can encompass all of the aforementioned
practices, academics do not have to engage in all of these to contribute. Openness can be
and educational site funded in part by the
Shuttleworth Foundation. She is also an advisor for
several open scholarship projects and services,
including the BOAI 15th Anniversary Working
Group, Center for Open Science, ContentMine,
DORA, Figshare, OpenCon, Overleaf, and PeerJ
Preprints, all in a volunteer capacity.
Abbreviations: APC, article processing charge;
ARCS, Advancing Research Communication &
Scholarship; ASAPbio, Accelerating Science and
Publication in biology; ECR, early-career
researcher; HGP, Human Genome Project; IF,
impact factor; MNI, Montreal Neurological Institute;
MOOC, massive online open course; OCW, open
courseware; OER, open educational resource;
SeeSD, Science Education Exchange for
Sustainable Development; SPARC, Scholarly
Publishing and Academic Resources Coalition;
STEM, science, technology, engineering, and
mathematics; UNAM, Universidad Nacional Autónoma de México; UNESCO, United Nations
Educational, Scientific and Cultural Organization.
considered a continuum of practices [6]. Researchers can start with simple actions, like self-
archiving free copies of their articles, and work their way up to sharing code, data, or note-
books. Educators can begin by sharing electronic copies of their class notes and work their way
up to the creation of open textbooks or interactive online materials. It is important we wel-
come people at whatever level of sharing with which they are comfortable.
For this to work, it is in turn important that universities have ways of recognizing diverse
scholarly products and different types of sharing. But with all the different standards, how are
universities to determine what counts as open scholarship? I propose that universities take
guidance from perhaps the simplest and all-encompassing international standard, the Open
Definition from Open Knowledge, which states, "Open means anyone can freely access, use,
modify, and share for any purpose" [30]. This definition can be applied to any educational or
research product, allowing universities to set a clear baseline. Colleges, schools, and depart-
ments could then set more specific standards to fit disciplinary needs.
Open scholarship can transform research and education
A comprehensive discussion of the benefits of open scholarship is beyond the scope of this
paper (see instead [6,31,32]). Here, I focus on just a few ways sharing can transform research
and education, falling largely into the democratic (“equal access for all”) and pragmatic (“shar-
ing improves research and education”) schools of thought [22]. In each section, I begin by out-
lining some of the democratic and pragmatic benefits of open scholarship, then describe how I
see such practices also benefiting universities and fitting in well with institutional missions.
While the societal benefits of open scholarship are sometimes seen as at odds with the
interests of institutions, I argue there are several points of intersection at
which what is good for the public may also be good for the university. In my opinion, many
universities have drifted away from their stated missions of knowledge dissemination, commu-
nity engagement, and public good. Open scholarship provides an opportunity for universities
to return to these core values.
Creating inclusive knowledge societies. In 2010, the United Nations Educational, Scien-
tific and Cultural Organization (UNESCO) committed to the creation of Inclusive Knowledge
Societies [33]:
In the past, information and knowledge have too often been the preserve of powerful social or
economic groups. Inclusive Knowledge Societies are those in which everyone has access to the
information that s/he needs and to the skills required to turn that information into knowledge
that is of practical use in her/his life.
Currently, our societies are far from inclusive. All over the world, people lack access to sci-
entific information (Fig 1). A study by Laakso and Björk reported that only 17% of 1.6 million
articles published in 2011 were available without a subscription [34]. Studies up to 2012 [35]
and 2015 [10] put the estimate around 22%–24%, although this number is likely to vary with
discipline. A new study by Piwowar et al. estimates that, overall, 28% of the academic literature
is free to access online; although that share is growing, it reached only 45% for articles published in 2015 [36].
A study by the World Health Organization demonstrates the scope of the problem [37]:
In the lowest-income countries, 56 percent of the institutions had no current subscriptions to
international journals and 21 percent had an average of only two journal subscriptions. In the
tier with the next-lowest incomes, 34 percent of institutions had no current subscriptions, and
34 percent had two to five journal subscriptions.
Just recently, it was announced that scientists in Germany, Peru [38], and Taiwan are losing
access to Elsevier journals, in part because of increasing subscription fees [39]. Rising costs
have also made textbooks unaffordable, negatively impacting education [40,41]. As Nicole
Allen, Director of Open Education for the Scholarly Publishing and Academic Resources Coa-
lition (SPARC), has said, "Students can’t learn from materials they can’t afford" [42]. A lack of
access can impede learning and slow discoveries. Science itself could suffer, too, losing valuable
perspectives when many researchers can’t participate in their rapidly evolving fields.
Open scholarship democratizes access to information by making research available to all,
regardless of financial resources—a necessary, though not sufficient, step in creating a true
"knowledge democracy" [43]. Removing financial barriers helps those in low- and middle-
income countries keep up to speed with their fields, potentially increasing their participation
and the diversity of perspectives in research. (Improved access is a necessary condition but
should not be seen as the magic bullet that will resolve all inequalities [44]. Much more than
access to information is required to increase participation in research, including improved
infrastructure and better funding for research in these countries [45]. These are not easy
Fig 1. Scientific information is locked behind paywalls. People all over the world are locked out, unable to access information due to high subscription costs. Image: John R. McKiernan and the “Why Open Research?” project (http://whyopenresearch.org). https://doi.org/10.1371/journal.pbio.1002614.g001
problems to solve, but they should not be ignored.) In addition, when research is open, partici-
pation is not limited to academics. The fast-growing area of citizen science is a testament to
what can be achieved when we encourage contributions from outside the academy [46]. In
sum, open scholarship allows us to create Inclusive Knowledge Societies [33], which I would
argue should be one goal, if not the goal, of universities.
Open scholarship can make universities more inclusionary. Universities are by nature
exclusionary—there are limited spots and often only those with the highest grades and test scores
are accepted. In the 1940s, people began referring to academic institutions as ivory towers, where
an elite few engaged in intellectual pursuits, largely "disengaged" from the concerns or needs of
the public [47]. If anything, the perception of universities as ivory towers has only grown over
the last decades, as competition for student and faculty positions increases, leaving many more
on the outside. As Shapin writes, "Today, almost no one has anything good to say about the
Ivory Tower and specifically about the university in its supposed Ivory Tower mode" [47].
How can institutions move away from this negative image and become more inclusionary?
Increasing acceptance rates is not feasible for economic and infrastructure reasons. However,
universities can allow everyone access to the knowledge created inside their walls. Open educa-
tional resources (OERs) are a prime example of openness increasing inclusion [48,49] and are
especially important for increasing access to education in developing countries [50,51]. When
universities make lecture notes, exams, and textbooks openly available online, even those who
cannot attend in person can benefit from what the institution has to offer. In fact, 20%–50% of
surveyed visitors to open courseware (OCW) websites identify as "self learners" [52]. Educators
also benefit from OCW sites, making up around a quarter of visitors from regions like Latin
America, Eastern Europe, and the Middle East and North Africa [53]. As an educator in
Mexico, I use open textbooks available through projects like OpenStax (https://openstax.org),
run by Rice University, because I know my students cannot afford expensive textbooks but
still need access to quality information to learn.
The recent growth of massive online open courses (MOOCs) [54], particularly large-scale,
free course initiatives by prestigious United States universities (e.g., edX, https://www.edx.org,
run by Harvard and the Massachusetts Institute of Technology), is one indication that institu-
tions are recognizing their exclusionary nature as a problem and trying to improve access to
education by lowering financial barriers and the need for physical attendance. While this can be seen as positive, it is
also important to not lose sight of the goal to increase inclusion. The issue is not just access but
also participation [45]: who is creating knowledge, and how do their experiences influence and
inherently bias educational content? If the majority of OERs are produced by prestigious US
universities, it represents another form of exclusion and reinforces the problem of Western
perspectives (and the English language) dominating educational content [44,50,52].
Resource-rich universities in Canada, the US, and Europe should look for ways to support,
raise the visibility of, and increase the use of OERs from other countries with diverse global perspec-
tives to facilitate a "true knowledge exchange" [44]. An example of an OER project from Africa
is the Science Education Exchange for Sustainable Development (SeeSD; https://www.seesd.
org), based in Senegal, which is designing open resources to improve access to education and
participation in science, technology, engineering, and mathematics (STEM). SeeSD is also
developing a MOOC-style online learning platform called Afreecademy (http://afreecademy.
org). Examples from South Asia and Southeast Asia, respectively, include Sakshat from India
(http://www.sakshat.ac.in) and the Vietnam Open Educational Resources program (http://
www.voer.edu.vn). More on OER projects in Asia can be found in [55]. An example from
Latin America comes from the Universidad Nacional Autónoma de México (UNAM), where I
work. UNAM does not have a financial barrier to entry, because tuition is not charged, but
there is a huge demand for a small number of places. UNAM annually accepts only
approximately 10% of bachelor’s degree applicants through open admissions testing [56]. In
2011, the university launched “Toda la UNAM en Línea” (“All of UNAM online”, http://www.
unamenlinea.unam.mx) to provide open access to the knowledge generated by the institution
for the benefit of society.
Beyond the societal benefits, universities have reasons to adopt OERs to benefit their own
student population. Surveys show that many students do not buy textbooks due to high costs,
and that this may be associated with failure to pass classes and high dropout rates [41,57].
OERs can help address financial disparities among students and may improve performance. In
2013, Tidewater Community College became the first US institution to offer a degree program
using exclusively OERs. Not only have they shown it is feasible to run such a program, but data
up to 2015 also indicate that switching to OERs is associated with better student learning
outcomes and retention rates, which may ultimately lead to quicker graduation times [58].
Such statistics on student performance, retention, and degree completion contribute to univer-
sity rankings and, consequently, to funding and recruitment power.
While there are benefits for students and the university, it should not be overlooked that
the development of OERs implies investment of time and effort by faculty. In addition to con-
tent creation, there exist higher standards when materials are shared via public platforms. For
example, the University of California, Berkeley, was recently told by the US Department of Jus-
tice that their online open educational materials did not meet accessibility standards required
by the Americans with Disabilities Act [59]. There are additional concerns with OERs, such as
ensuring that images pulled from primary sources are licensed for reuse. This added effort, in
turn, requires institutional recognition and support if OER creation is to be undertaken by
more than just a few altruistic individuals. Some evaluation systems for hiring, promotion, and
tenure put less weight on the publication of books and book chapters than journal articles.
Worse yet, electronic resources may not be recognized at all if not published by “prestigious”
publishing houses [60]. OER creation must be recognized in its multiple forms if faculty are
going to participate. A few steps universities could take to support OERs are listed in Box 1.
Sharing can increase the societal impact of university research. As part of their mission
statements, many universities emphasize the importance of contributing to society through
the “dissemination of knowledge.” For example, Cornell University’s mission [62] is as
follows:
Cornell's mission is to discover, preserve, and disseminate knowledge; produce creative work;
and promote a culture of broad inquiry throughout and beyond the Cornell community. Cor-
nell also aims, through public service, to enhance the lives and livelihoods of our students, the
people of New York, and others around the world.
These are excellent goals for a university. But how effectively is knowledge transmitted, and
how can it benefit the community, if a large percentage of our society can’t access it? Open
scholarship can help universities fulfill their missions by sharing research outputs so they have
the quickest and broadest societal impact.
Members of society want and need access to research. The “Who Needs Access?” project
(https://whoneedsaccess.org) has documented stories from nurses, patients, teachers, and
small business owners who tried to access scholarly articles for personal or professional uses
but were unable. The Open Access Button project (https://openaccessbutton.org) has logged
thousands of requests for articles from nonacademics all over the world who do not have access.
When articles are available, the public is eager to access them. A recent survey of users of Latin
American open access platforms found that up to a quarter of respondents were from outside
universities, including nonprofit, private, and public sector employees [63]. Around 50% of
users were students, including many at the elementary and high school levels. The author
points out, as follows, that these results have implications for how we measure impact in uni-
versity evaluations:
The alternative impact of research uncovered here [is] again evidence of the shortcomings of
considering. . .a limited notion of the term “impact.” It makes little sense to use citations as the
sole measure of evaluating research and researchers when over three quarter [sic] of the use of
research is from non-citing publics.
Likewise, open data can have impact far beyond university walls. Two projects—Open
Data’s Impact (http://odimpact.org) [64] and the Open Data Impact Map (http://
opendataimpactmap.org)—are collecting case studies from all over the world to show how phil-
anthropic, public health, social justice, and other similar organizations are using and sometimes
also creating open data to improve society. For example, a quick search of Open Data Impact
Map reveals nonprofit organizations in Mexico using open data to promote environmental
protection and defense of indigenous lands (CartoCrítica, http://www.cartocritica.org.mx),
Box 1. Supporting open educational resources (OERs) and practices.
1. Redirect textbook purchasing funds to support faculty. Purchasing textbooks
involves buying a limited number of copies and requires buying new editions every
few years. Money would be better invested in openly licensed, electronic textbooks,
for which there is no limit on copy number, and these e-books can be updated in real
time as new discoveries are made. Faculty could be awarded small grants to write,
maintain, or even peer review open e-books. Support could also include providing
formal guidance on accessibility standards and licensing issues to lower the burden of
OER creation for faculty.
2. Develop 2–5-year plans to convert existing degree programs to OERs. Plans of
study typically undergo periodic evaluations. This would be a natural time to review
class syllabi, search for open alternatives to current textbooks, and identify areas in
which OERs are missing and could be developed by faculty.
3. Require all new degree programs to use primarily OERs. If new degree programs
are proposed, faculty can design core courses to rely primarily on OERs from the
start. Academic boards reviewing these proposals can be advised to evaluate OER use
as part of the approval criteria.
4. Devise incentives for OER creation and open educational practices. One incentive
would be positive mention of OERs in guidelines for promotion and tenure. An
example of such a policy comes from the University of British Columbia, which lists
creation of OERs as one way faculty can demonstrate "evidence of educational leader-
ship" [61]. Another incentive could be teaching prizes based on open educational
practices. This would be one way for institutions to establish prestige around open
education and signal their support.
improve Mexican economic competitiveness (El Instituto Mexicano para la Competitividad,
http://imco.org.mx), and better the lives of Mexicans living with HIV (Derechohabientes
Viviendo con VIH del Instituto Mexicano del Seguro Social, http://www.dvvimss.org.mx).
The potential for shared code to benefit society is only limited by what people can think to
program. For example, the open source application REFUGE Restrooms (http://www.
refugerestrooms.org) helps transgender, intersex, and gender nonconforming people find safe
restrooms to use to avoid harassment and possible violence. HospitalRun (http://hospitalrun.
io) is open source software that helps hospitals in low- and middle-income countries manage
patient records. High Tech Humanitarians (http://www3.hthumanitarians.org), supported by
the Institute of International Humanitarian Affairs at Fordham University, is a collaborative
platform for people to share and improve open software and hardware tools for addressing
societal issues like clean and renewable energy, distribution of medical resources, disaster
management, and protection of human rights. Several of the projects on High Tech Humani-
tarians involve participation from universities like MIT and Harvard.
Academic institutions that share research products can be part of social change and
improvement. The Earlham Institute in the UK is an example of a research institute that has
committed to open scholarship, writing, "A determined commitment to open science, open
access and open data allows us to have a significant impact" [65]. Earlham has published sev-
eral "impact stories" (http://www.earlham.ac.uk/impact-stories) describing how open scholar-
ship is aiding in their research efforts to improve the global food supply, protect animals and
ecosystems, and create new technology. Having impact outside the academic environment
reflects positively on a university and can increase its funding and recruitment power. Funders
often ask for broader impact statements and may be more likely to award funding to research-
ers and institutions with a history of translating research into action. In addition, young stu-
dents want to go where they see potential to effect change.
A university’s societal impact depends on the commitment of faculty to transforming their
research into reusable information, sharing, and participating in community outreach. As noted
above, if we want such commitment, universities must develop ways of recognizing and
rewarding these activities. Traditional scholarly metrics, like the number of articles published
and journal impact factor, give an incomplete picture of true impact. In my opinion, we need a
broader perspective (see Box 2).
It is important to emphasize here that it will not be enough for universities to simply pro-
vide space for faculty to describe their outreach activities or public impact. If the university
does not signal to the academic community that it values these things, they will likely continue
to be largely ignored by evaluation committees in favor of more traditional scholarly products.
If there are more university press releases about Nature or Science papers than school mentor-
ship programs, for example, then prestige will continue to be defined by high-profile papers
and not public engagement. The university can help redefine prestige; it can influence what
becomes high profile in academic circles. As suggested in Box 2, celebrate outreach events with
press releases, award faculty prizes for community engagement, and highlight public impact
stories on the university website. Such actions signal to academics and the public that the uni-
versity is truly committed to the ideals outlined in their mission statements.
Accelerating the pace of discovery. Sharing research allows for increased communication
within and across disciplines and can encourage diverse approaches [66]. Sharing code and
experimental protocols allows others to test and improve solutions. Sharing data allows others
to perform new analyses, which could lead to new discoveries. To my knowledge, there have
been no controlled studies comparing the pace of private versus public projects, but there are
powerful anecdotal examples to support the idea that sharing can accelerate the pace of
discovery.
The Human Genome Project (HGP) was one of the first high-profile projects to commit to
open scholarship. In 1996, HGP researchers agreed to rapid data sharing [67]. This sharing
accord, known as the Bermuda Principles, has been hailed as “revolutionary,” accelerating the
huge task of sequencing billions of base pairs and leading to new gene discoveries [68].
In 2008, chemist Matthew Todd and colleagues began openly sharing their electronic labo-
ratory notebooks as part of a research project to synthesize a drug to treat a parasitic disease
[69]. The project attracted outside collaborators, and the suggestions made helped the
researchers find a solution to their drug synthesis problem. Todd and coauthors write [69],
…the research was accelerated by being open. Experts identified themselves, and spontaneously
contributed based on what was being posted online. The research therefore inevitably
proceeded faster than if we had attempted to contact people in our limited professional circle
individually, in series.
Todd now works as the lead researcher on the Open Source Malaria project, which openly
shares all their electronic notebooks in real time to accelerate the search for malaria drugs [70].
Box 2. Recognizing nontraditional scholarly impact.
1. Recognize code and data in promotion and tenure evaluations. Shared code and
data should be recognized in academic evaluations as at least equal in value to pub-
lished articles. Code and data citations can be measured but will likely underrepresent
the use of these products, especially outside the academic sector. Additional metrics,
such as repository follows, forks, pull requests, and other measures of community
engagement should also be considered.
2. Recognize, celebrate, and support outreach activities. Many universities describe
outreach as a core part of their missions but sometimes do little to support it in prac-
tice. Recognition could start with simple actions, like providing space on academic
evaluation forms for faculty to describe how they are helping the university meet its
commitments to the community through their outreach efforts. Celebrating these
efforts could include circulating press releases or awarding faculty prizes for public
engagement. If possible, cover expenses for faculty to take a day and visit local schools
or clinics.
3. Consider altmetrics as one measure of broader impact. Nonprofit organizations,
patient groups, and grassroots communities often use social media to share and com-
municate research of interest to them. Altmetrics provide measures of how widely
scholarly products are being shared and discussed by groups who may be unlikely to
formally cite work.
4. Allow faculty to include narrative summaries of their impact. Numbers alone will
not capture the impact scholarly products have outside university walls. Faculty
should be allowed to include descriptions of use cases in their annual reports or ten-
ure packets, e.g., how their data was used by a local hospital or their software used by
a local school. Universities could highlight interesting impact stories by publishing
them on their website.
In 2009, mathematician Tim Gowers launched the Polymath Project to experiment with
open collaboration as a way to solve difficult math problems. Using a blog and a wiki to share
ideas, "progress came far faster than anyone expected" [71]. Collaboration began on February
1, and by March 10, a solution was found. The project also shed light on the discovery process:
For the first time one can see on full display a complete account of how a serious mathematical
result was discovered. It shows vividly how ideas grow, change, improve and are discarded,
and how advances in understanding may come not in a single giant leap, but through the
aggregation and refinement of many smaller insights.
In 2015 and 2016, in light of recent Ebola and Zika outbreaks, the World Health Organization
[72] as well as funders and publishers [73] came out in support of data sharing and preprints to
quickly disseminate information and accelerate responses to public health emergencies.
Accelerated discovery can give universities an edge. In 2016, acknowledging the poten-
tial for open approaches to accelerate discovery, the Montreal Neurological Institute (MNI),
part of McGill University in Canada, announced its intention to become an open science insti-
tute [74]. Faculty at the institute have committed to sharing articles, code, data, and even phys-
ical samples and to not patent their research. Regarding the loss of patent income, the
director of the institute, Guy Rouleau, says [75],
Of course there is a risk that we might lose the economic returns of a blockbuster drug or a new
intervention, but we are ethically committed to taking that risk, as the bigger risk is for our
patients who are waiting for answers and new treatments.
Rouleau says their support of open scholarship is already bringing in "highly talented
researchers and trainees" [75]. Other universities that support open approaches may see similar
recruitment power, especially if these approaches lead to accelerated discoveries. When
researchers are the first to make a discovery, it brings visibility and prestige both for the indi-
viduals and their institution, whose name is usually featured prominently in press releases and
journal publications. This prestige, in turn, can benefit the university by attracting students
and faculty as well as funding from public and private sources.
Participation in MNI’s open scholarship initiative will be voluntary, and faculty can decide
to independently patent their discoveries. However, MNI will not financially or administra-
tively support faculty in doing so [74]. I think this sets an important precedent. The institu-
tion’s approach is, “We will not force you to share your work, but we will not help you to lock
it up.” This approach could be implemented by other universities, allowing faculty to retain
academic freedom but making it clear where the institution stands on sharing. This and other
ideas for supporting open collaboration and faster discovery are listed in Box 3.
Addressing the reproducibility “crisis”. In recent years, large-scale projects in the fields
of psychology [78] and cancer biology [79,80] have attempted to reproduce key findings and
found a low rate of reproducibility. These problems have become so prevalent that many now
say science is facing a reproducibility crisis [81]. Last year, an article in Nature
described work by researchers to reproduce 50 studies in cancer biology and the difficulties
they faced obtaining original data [82]. In several cases, authors did not respond to requests
for data; in another case, data were obtained only after a year of trying. Many authors, while willing
to participate, had trouble finding the original data, indicating poor data management.
We can only expect to reproduce a study if we know exactly what was done and how. Cur-
rently, too many crucial details remain hidden. Researchers struggle to recreate experimental
methods using only details provided in original papers [83]. A 2015 study by Womack found
that just 13% of articles in the top-tier journals he examined shared their underlying data [84].
I believe the best way to improve reproducibility is to ensure that full experimental protocols,
raw data, and analysis code are openly available and licensed for reuse.
Several researchers are leading the way in reproducibility [85–87]. In 2012, Lorena Barba, a
professor at George Washington University, published the "Reproducibility PI Manifesto"
describing her efforts to make the research in her lab more reproducible [85]. For Barba, this
means (1) all code is under version control and shared publicly, (2) code undergoes "verifica-
tion and validation" and reports are also shared, (3) data and scripts to recreate figures are
openly licensed, (4) manuscripts are posted as open preprints, and (5) her lab’s articles include
a reproducibility statement. Barba also considers it her responsibility to teach her students
about reproducibility. With respect to the learning involved, she writes [86],
My students don’t resent investing their time in this. They know that practices like ours are
crucial for the integrity of the scientific endeavor. They also appreciate that our approach will
help them show potential future employers that they are careful, conscientious researchers.
Box 3. Supporting open collaboration and accelerated discovery.
1. Remove financial and administrative support for patents. As at the Montreal Neu-
rological Institute (MNI), faculty could be allowed to patent but would not receive
funds or help filing. Most patent offices operate at a deficit [76,77], so this should not
present significant income loss for many universities, and funds could be redirected.
2. Redirect funds to hire grant and scholarly communication personnel. Funders are
increasingly awarding grants for open scholarship projects [6]. Having personnel
dedicated to finding these opportunities and helping faculty submit applications
could be profitable for the university. Hiring scholarly communication personnel to
write research summaries or organize outreach could help universities raise visibility
and find new partners.
3. Organize academic “cross-pollination” events. Many university events are targeted
at single departments, with few opportunities for students and faculty from different
disciplines to interact. Schedule events with broad interest and invite multiple depart-
ments. Scholarly communication personnel could be in charge of organization and
diffusion.
4. Establish shared, interdisciplinary laboratory spaces. Laboratory space is at a pre-
mium and often, there are not enough resources for everyone. By pooling resources
and establishing shared spaces co-run by researchers from different departments, one
space can serve multiple uses, as well as foster interdisciplinary communication and
projects. I co-run such a collaborative space at UNAM with professors from biology
and mathematics.
5. Develop ways to recognize collaborative efforts. Collaboration is hard to measure
and is discipline dependent. However, a place to start could be to ask faculty to sub-
mit short narratives of their collaborations, both inside and outside the university
and within and across disciplines.
Reproducibility can affect university reputation. For universities, having "careful, con-
scientious researchers" [86] is to their benefit. When research is reproducible, it can reflect
positively on the institution and their standards. For example, just recently, the Memorial
Sloan Kettering Cancer Center received positive press in Science magazine when one of their
researcher’s leukemia studies was successfully reproduced by an independent group [88]. In
contrast, when research is not reproducible or, even worse, is suspected to be fraudulent, this
can reflect negatively on an institution. No institution wants the effort, expense, or publicity
involved in investigating one of their researchers for fraud. Therefore, it is in the interest of
universities to encourage researchers to be transparent and make their research more repro-
ducible. How can universities accomplish this? See Box 4.
Personal practice of open scholarship
As described previously, the success of institutional open scholarship initiatives depends in
large part on the commitment of individual academics. The best way researchers can support
open scholarship is to share their own work. In 2014, at the SPARC open access meeting in
Kansas City, I publicly pledged to only edit for, review for, and publish in open access journals
[92]. During the years since, I have committed to sharing more products of my research and
teaching (Box 5). Other researchers have made similar individual commitments [93–95] or
signed on to organized pledges, both as authors (e.g., http://www.openaccesspledge.com and
https://moreopenaccess.net) and as reviewers (e.g., https://opennessinitiative.org and [96]). A
collection of links to open scholarship pledges can be found via [97].
Personal commitments to open scholarship are not made lightly and are often made know-
ing that many academic environments do not, at present, adequately support such stances.
Practicing open scholarship comes with a variety of challenges. The following is not an
Box 4. Increasing transparency and reproducibility.
1. Provide incentives for researchers to preregister their studies. Registering hypothe-
ses, data collection, and analysis plans before conducting research can diminish bias
and decrease selective reporting [87]. The Center for Open Science offers a US$1,000
prize to researchers who preregister their studies [89]. Universities could provide
small financial incentives to faculty. Evaluation committees could place more weight
on preregistered projects.
2. Encourage code and data sharing under version control. Universities could let code
and data sharing be voluntary but state that these products will only be counted in
hiring, promotion, and tenure evaluations if they are shared in an open repository
with version control, like GitHub or BitBucket.
3. Recognize preprints as valuable research products. Sharing preprints allows
researchers to get more eyes on their work and potentially spot weaknesses or errors
before formal publication. Versioning can show changes made due to peer feedback.
Funders like Wellcome Trust [90] and the National Institutes of Health [91] now
allow researchers to list preprints in grant applications and progress reports. Univer-
sities should allow researchers to list preprints in evaluation materials and count
these as evidence of productivity.
exhaustive list of these challenges but includes some I have faced personally, along with suggestions
as to how they could be addressed. I do not believe any of these challenges are insurmountable,
but they should be considered if universities want to increase adoption.
Economic challenges. While free and low-cost open publishing options do exist [6], arti-
cle processing charges (APCs) for many open access journals are high (Fig 2), with average
Box 5. My open pledge.
As an open scholar, I pledge to:
1. edit and review only for open access journals,
2. publish only in open access journals,
3. openly share my working manuscripts as preprints,
4. openly share my code and data under version control,
5. openly share my electronic laboratory notebooks,
6. sign my manuscript reviews,
7. preferentially assign openly licensed materials in my classes,
8. create openly licensed teaching materials,
9. ask my professional societies to support open scholarship,
10. speak out in support of open scholarship.
Fig 2. The high cost of publishing. Image: John R. McKiernan and the “Why Open Research?” project
(http://whyopenresearch.org).
https://doi.org/10.1371/journal.pbio.1002614.g002
estimates ranging from about US$900 [98,99] to about US$1,800 [100], depending on the set
of journals studied. Most open access journals provide waivers, but these are typically only
automatic for researchers in low-income countries. Mexico, where I work, is classified as an
upper middle-income country [101], but we have limited funds for research and little to no
institutional funds for publishing. When we are offered waivers, they are usually partial—up to
50% off the APC—and the cost is still beyond what we can afford. Because I pledged to publish
only in open access journals, publishing in subscription journals and self-archiving is not an
option for me. Even if it were, many subscription journals have significant submission, page,
and color charges [102]. Thus, for researchers in Mexico and other similar countries, cost is an
ever-present consideration and a strong determinant of where researchers choose to publish.
Some of the high-profile and more expensive venues are out of our reach, which affects our vis-
ibility as researchers. Open access funding models besides “author pays” have to be explored.
In Latin America, many journals are free for readers and free for authors, which is possible
because of funding from governments, institutions, or cooperative efforts [103]. Universities
in other parts of the world should study Latin American journal funding models for guidance
and consider how they could support new publishing models for sustainable and affordable
open access. The means to finance these new models could come from redirecting journal sub-
scription funds in strategic ways and/or redirecting funds spent on proprietary software licens-
ing, as discussed more below.
Technical challenges. Sharing code and data is more complicated than sharing articles, in
part because these research products are much more varied, especially across disciplines. In addi-
tion, there seems to be less guidance available as to the preferred file formats and organization,
the level of documentation needed, different license types, and the best places to archive code
and data than there is for articles. Even the most motivated researchers can find navigating these
issues frustrating [104]. One standard that most agree on is that code should be shared under
version control [105,106], in which every change is tracked and users can return to previous ver-
sions at any time [107], but this is not trivial. Version control tools, like Git, are not always
intuitive, and most researchers receive no formal training in them. The barrier to entry is high, and
researchers may be reluctant to invest the time needed to become proficient [108]. Or, research-
ers may be willing to learn but simply be unsure where to start and what resources to use.
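To make the underlying idea concrete, the essential guarantee of version control—that every change is tracked and any previous version can be recovered—can be sketched with a toy model. This is a purely illustrative Python sketch for teaching purposes, not how Git is actually implemented:

```python
import hashlib

class ToyRepo:
    """A toy version-control store: every saved change is kept,
    and any previous version can be recovered by its identifier."""

    def __init__(self):
        self.history = []  # ordered list of (commit_id, message, content)

    def commit(self, content, message):
        # Identify each version by a hash of its content (as Git does)
        commit_id = hashlib.sha1(content.encode()).hexdigest()[:7]
        self.history.append((commit_id, message, content))
        return commit_id

    def checkout(self, commit_id):
        # Return to any previous version at any time
        for cid, _msg, content in self.history:
            if cid == commit_id:
                return content
        raise KeyError(f"no such commit: {commit_id}")

repo = ToyRepo()
v1 = repo.commit("results = mean(data)", "first analysis")
v2 = repo.commit("results = median(data)", "switch to robust statistic")
print(repo.checkout(v1))  # the original analysis script is still recoverable
```

A course on essential research skills could start from a model like this before introducing the real Git commands, so that students understand what the tool is doing for them rather than memorizing opaque incantations.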
Similar challenges arise with open electronic notebooks. Currently, my lab uses Jupyter
notebooks [109] to document our research, but this tool requires that students are familiar
with both Python and Markdown and also presents a somewhat high barrier to entry, although
arguably lower than with raw code alone. Such barriers are particularly relevant when working
with undergraduate students, who often receive little to no training in programming or other
computer languages. The time involved to learn such tools can be a limiting factor, because
these students typically spend only 6 months to a year in my lab and need to hit the ground
running. Educational initiatives could address these challenges. Universities could offer
courses on essential research skills, including version control and basic programming. These
should not just be weekend workshops but courses integrated into all plans of study, beginning
at the undergraduate level and continuing through graduate education.
Redirect funds to address challenges and support academics. I see economic and tech-
nical challenges as going hand in hand, with solutions for the latter potentially also providing
the means to address the former. Many institutions spend hundreds of thousands to millions
of dollars per year on site licenses for proprietary software [110,111] and continue to invest
time and effort in training academics in these closed tools. For example, in 2017, the University
of Washington set aside over US$3.6 million for purchasing software licenses [111]. Imagine
what amazing things could be done if we redirected even half of that money into supporting
open solutions, like open source software and open access publishing.
However, the problems with supporting proprietary software extend beyond just financial
costs; there are academic freedom and educational costs as well. As the free software definition
outlines, we are less interested in “free as in beer” than we are in “free as in speech” [112]. We
want the freedom to run, explore, modify, and redistribute the underlying source code. The
use of closed software can leave students and faculty less well equipped, because many analysis
functions exist as “black boxes,” in which we can’t see, and are rarely forced to understand,
what is being done with the data. As Red Hat founder, Bob Young, writes [113],
Would you buy a car with the hood welded shut?…We demand the ability to open the hood of
our cars because it gives us, the consumer, control over the product we've bought and takes it
away from the vendor…Having control over the technology they are using is the benefit that is
enabling users of open-source tools to build more-reliable, more-customized and lower-cost
systems than ever before.
In the spirit of being smart consumers who retain control over our academic tools as well as
the freedom to innovate, I believe universities should shift to open source solutions and pro-
vide training in open source alternatives to proprietary software. Data management courses
could use LibreOffice Calc instead of Microsoft Excel. Design classes could use GIMP and
Inkscape instead of Adobe Photoshop and Illustrator. Programming classes could use primar-
ily Python, rather than Matlab. This latter suggestion would especially help students learn how
to design algorithms, write their own functions, and hit the ground running when they get
their hands on computational models or data in their final year(s) of study. Training should
also include showing students how to give back by contributing to open source projects. In the
process of sharing their bug fixes or new functions with the online software community, they
would learn good coding practices, version control, and the use of tools like Git. Thus, switch-
ing to open source solutions could improve education, thereby addressing some of the techni-
cal challenges outlined above.
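As a concrete illustration of "opening the hood": rather than calling a black-box routine, a student can write out a standard calculation by hand and check it against the library version. The sketch below uses only the Python standard library; the measurement values are hypothetical, chosen for illustration:

```python
import math
import statistics

def sample_std(data):
    """Sample standard deviation, written out step by step so the
    whole calculation is visible rather than hidden in a black box."""
    n = len(data)
    mean = sum(data) / n
    # Sum of squared deviations, divided by n - 1 (Bessel's correction)
    variance = sum((x - mean) ** 2 for x in data) / (n - 1)
    return math.sqrt(variance)

measurements = [4.8, 5.1, 5.0, 4.9, 5.2]  # hypothetical lab data

# The hand-written version agrees with the standard library's:
assert math.isclose(sample_std(measurements), statistics.stdev(measurements))
```

An exercise like this teaches both the algorithm and the habit of verifying one's own code against a trusted reference, a habit that transfers directly to reproducible research practice.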
As an added bonus, many open source programs are also “free as in beer,” or cost much less
than proprietary software, typically charging only for things like formal software support. The
money saved in student and faculty licenses if universities switched to open solutions could
then be redirected to support open innovation or address economic challenges of open pub-
lishing. Listed in Box 6 are just a few ideas, which could be scaled depending on institutional
resources and needs.
Personal practice meets institutional policy
In my view, one of the biggest challenges open scholars face at the institutional level is how they
are evaluated for promotion and tenure decisions. There are tensions created by inconsistencies
between stated institutional values and evaluations in practice. For example, institutions often
emphasize the importance of community engagement and public outreach in their mission and
vision statements (e.g., [62,114,115]). However, surveys show that faculty feel this support
rarely translates into recognition in promotion and tenure. Pretenure faculty report being
actively "discouraged" from spending time on community engagement or public outreach activ-
ities that take time away from producing “real scholarship,” like peer-reviewed articles [60,
116–118]. Harley et al. conclude that academics who spend significant time on activities like
writing for the general public may be "stigmatized for being 'public intellectuals'" [60].
Similarly, institutions often tout the importance of collaborative and interdisciplinary
research (e.g., [119,120]). Yet, many evaluation systems continue to focus primarily on indi-
vidual accomplishments, insisting that researchers demonstrate “independence,” and may
even include criteria that disadvantage those working in collaborative efforts [60,121]. For
example, some evaluation systems give priority to first or corresponding authorships and
devalue middle authorships on publications, especially with larger numbers of authors [122,
123]. The dominance of the journal article over other products as the "basic unit of scholar-
ship" [124] is also a problem lamented by faculty [60,125]. Surveys report that data, software,
online resources, and other digital products are often relegated to “tool development,” given
“secondary status,” and may not count at all unless worked somehow into article format [60,
116]. This can be true even when there is interest in and use of the product by academic peers,
creating a mismatch between community and institutional recognition [60].
The use of proxy measures, like journal impact factor (IF), to judge the quality and impor-
tance of articles is still pervasive in academic evaluations [60,126] (e.g., [127,128]), despite
studies showing that IF correlates poorly with the scientific quality of individual works [129].
Faculty report feeling intense pressure to publish in specific high IF venues [60,126,130].
Institutional requirements may also lead researchers to break apart research projects into
smaller, less in-depth units to increase publication numbers [60,130] or communicate their
Box 6. Supporting open source and innovation.
1. Develop a 2–5-year plan to move to open source software. A formal assessment
should be conducted to determine which proprietary software products are widely
used and which are underutilized by the university. The former could continue to be
supported for some time, while the latter would be phased out more quickly. Software
for which open source alternatives already exist would be canceled first to liberate
funds that could be immediately redirected. Faculty could continue to purchase
licenses independently but would not receive institutional support past prearranged
cutoff dates.
2. Offer financial incentives to faculty to develop or improve open source alterna-
tives to proprietary software. Grants to develop new open source software could be
for 1–2 years and offer US$5,000–US$10,000. A few bigger projects might be funded
depending on demand and complexity of the software needed. Larger awards would
be possible as more software licenses are phased out and more funds liberated. All
software development should be done in the open via platforms like GitHub or Bit-
Bucket, which could have the advantage of bringing in outside collaborators at no
added cost to the university. Smaller grants or faculty prizes could also be awarded
for demonstrated contributions to existing open source projects.
3. Redirect site license funds into supporting open access publishing. Redirecting
funds could also help address economic challenges of open publishing. For example,
if a university’s site license budget is similar to University of Washington’s [111], US
$1 million–US$1.5 million (less than half) could be used to set up an institutional
open access publishing fund. If universities do not wish to support article processing
charges (APCs), they could instead use the funds to support open publishing consor-
tia (e.g., Open Library of Humanities https://www.openlibhums.org) or explore new
models.
research in venues that may not reach their ideal audience, just for the sake of prestige [60]. It
is understandable that people align their practices with institutional policies related to hiring,
promotion, and tenure and with the academic culture in which they find themselves embed-
ded. We, as researchers, want to get, keep, and be successful at our jobs so we can continue
doing the work we enjoy. We want recognition from our peers and institution. However, it is
not hard to imagine that making decisions that are contrary to what we believe is right or good
for our research could create stress, job dissatisfaction, and, in some cases, weaker scholarship.
None of these outcomes is good for either faculty or institution.
Those in senior leadership roles at universities can support faculty and promote open schol-
arship by ensuring that incentives exist to encourage and reward sharing. In the action items
listed throughout, I propose several ways that shared code, data, educational resources, out-
reach activities, preprints, and more could be recognized by committees. These and other sug-
gestions to reform promotion and tenure evaluations are summarized in Box 7. Several of
these recommendations arose from discussions among the Advancing Research Communica-
tion & Scholarship (ARCS), OpenCon, and SPARC communities (http://bit.ly/PTreform),
which include students, postdocs, and pretenure faculty who are understandably concerned
about how evaluation criteria will affect their career prospects and advancement. Unfortu-
nately, while early-career researchers (ECRs) may be the best equipped to say how evaluation
criteria affect career development or to propose ways of evaluating new forms of digital
Box 7. Recommendations to reform promotion and tenure
evaluations.
1. Stop using journal-level metrics, like impact factor, to evaluate the quality and
impact of research articles. Institutions can sign the San Francisco Declaration on
Research Assessment (http://www.ascb.org/dora).
2. Use article-level metrics, such as citation counts, as one quantitative measure of arti-
cle use and impact. While citation counts are not perfect, they are more representative
than journal-level metrics of the impact of individual articles.
3. Use alternative metrics, such as tweet activity and media coverage, as one way of
evaluating the broader societal impact of research works.
4. Consider shared code and data deposited in public repositories as research products
that count in evaluations. Quantitative measures of impact could include citations,
repository forks, and pull requests.
5. Consider preprints as evidence of academic productivity. Preprints do not necessar-
ily have to count as highly as peer-reviewed articles but should still count in evalua-
tions. Support for this perspective comes from the recent Accelerating Science and
Publication in biology (ASAPbio) meeting and movement [131].
6. Value scientific outreach, such as blogging and articles in popular media, as aca-
demic outputs that count in evaluations.
7. Make forms flexible by adding space for researchers to describe nontraditional
research outputs and their open scholarship activities.
scholarship, they are rarely given formal opportunities to do so. Senior leadership could sup-
port ECRs by giving them more of an institutional voice and including ECR representatives on
faculty senates, hiring committees, and tenure review boards.
Institutions may take even stronger stances in favor of open scholarship. A policy similar to
that at the University of Liège, which requires that faculty upload their work to the institution’s
open access repository to be considered in promotion and tenure evaluations [132], could be
put in place. Of course, for institutions in which the governance structure does not support
such a top-down approach, open scholarship initiatives will have to be discussed and agreed
upon at the level of colleges, schools, or even individual departments. Universities can also
take guidance from the Leiden Manifesto on research metrics, which includes recommenda-
tions for better aligning evaluation criteria with institutional missions, considering disciplinary
differences, and taking into account qualitative indicators [133].
The importance of institutional culture and signals
Reforming evaluations will be a huge step towards more widespread adoption of open scholar-
ship. However, changing policies alone will likely not be enough to transform universities and
make sharing the norm rather than the exception. Problems with evaluation systems can be
viewed as a symptom of a much bigger problem, namely, an academic culture that has come to
favor quantity over quality, labels over content, individual over group accomplishments, and
prestige over public good. Universities play a crucial role in determining this cultural environ-
ment. Through career advancement decisions, funding and space allocations, faculty prizes,
press releases, and even website content, the university signals to academics what it values and
what is required to be an accepted member of the community. As in any culture, there is a
sense of belonging fostered by what is seen to be a set of shared interests and values. Mission
statements are intended to explicitly outline those shared interests and values for the university
community, but these words can end up being empty when the institution signals through its
actions that its values are different or conflicting. Faculty pay acute attention to these signals
and can feel strong pressure to align their practices accordingly. This may be especially true for
faculty just starting out, who are working to integrate themselves into their new environment
and become valued community members. Thus, "the culture of an institution…is a strong
force affecting faculty values and activities" [134].
Importantly, I see the actions I have proposed throughout not so much as a dramatic shift
towards new academic cultural values, but more as a return to old ones. Broadening our defi-
nition of scholarship, valuing public engagement, wanting the university to be a force for posi-
tive social change—these are not new ideas [134–136]. These are old ideas that have taken a
back seat to increasingly distorted priorities. I think what universities need is a “realignment”
such that what they say they value is better reflected in how they act. University mission state-
ments have to be more than just words.
Conclusions
I have outlined my vision of a university that endorses the principles of open scholarship, not
just in words but in practice, and actively supports faculty in sharing their work. This support
can span a continuum from simple steps, like providing space on evaluation forms for faculty
to describe their open scholarship or outreach efforts, to more complicated actions, like the
redistribution of institutional funds to finance open initiatives. I realize universities may not
be able to enact all the reforms I have proposed; some may not be possible due to certain uni-
versity governance structures, and others may meet with significant resistance. However, if
universities work towards just a few of these reforms over the next 2 to 5 years, I think they
could significantly increase the adoption of open scholarship practices. The most impactful
reforms, as suggested by faculty surveys, are likely to be changes made to evaluation criteria to
better recognize and reward diverse types of open scholarship, accompanied by outward sig-
naling from universities that these activities are valued. Such changes may be challenging to
enact, but I argue it is worth the effort. As universities embrace sharing, they will likely find it
has broad benefits, increasing their visibility, funding, and recruitment power and, most
importantly, helping institutions meet core missions like dissemination of knowledge and pos-
itive contributions to society.
Acknowledgments
The author thanks Lorraine Chuen for suggestions that improved this manuscript.
References
1. D.E. Atkins, J.S. Brown, and A.L. Hammond. A review of the open educational resources (OER)
movement: Achievements, challenges, and new opportunities. Report to the William and Flora Hewlett
Foundation, 2007. https://pdfs.semanticscholar.org/8d16/858268c5c15496aac6c880f9f50afd9640b2.
pdf.
2. Neuro Cloud Consortium. To the cloud! A grassroots proposal to accelerate brain science discovery.
Neuron, 92(3): 622–627, 2016. https://doi.org/10.1016/j.neuron.2016.10.033 PMID: 27810005
3. SPARC and the Right to Research Coalition. OpenCon Community Report, 2017. http://www.
opencon2017.org/community_report.
4. Choudhury S., Fishman J.R., McGowan M.L., and Juengst E.T.. Big data, open science and the brain:
Lessons learned from genomics. Frontiers in Human Neuroscience, 8: 239, 2014. https://doi.org/10.
3389/fnhum.2014.00239 PMID: 24904347
5. LeBel E.P., Borsboom D., Giner-Sorolla R., Hasselman F., Peters K.R., Ratliff K.A., and Smith C.T..
PsychDisclosure.org: Grassroots support for reforming reporting standards in psychology. Perspectives
on Psychological Science, 8(4): 424–432, 2013. https://doi.org/10.1177/1745691613491437
PMID: 26173121
6. McKiernan E.C., Bourne P.E., Brown C.T., Buck S., Kenall A., Lin J., McDougall D., Nosek B.A., Ram
K., Soderberg C.K., Spies J.R., Thaney K., Updegrove A., Woo K.H., and Yarkoni T.. How open science
helps researchers succeed. eLife, 5: e16800, 2016. https://doi.org/10.7554/eLife.16800 PMID:
27387362
7. A. Swan, Y. Gargouri, M. Hunt, and S. Harnad. Open access policy: Numbers, analysis, effective-
ness. 2015. Preprint. https://arxiv.org/abs/1504.02261. Cited 9 September 2017.
8. Harley D., Earl-Novell S., Arter J., Lawrence S., and King C.J.. The influence of academic values on
scholarly publication and communication practices. Journal of Electronic Publishing, 10(2): 1–10,
2007. https://doi.org/10.3998/3336451.0010.204
9. Xia J.. A longitudinal study of scholars' attitudes and behaviors toward open-access journal publishing.
Journal of the Association for Information Science and Technology, 61(3): 615–624, 2010. https://
doi.org/10.1002/asi.21283
10. Zhang L. and Watson E.M.. Measuring the impact of gold and green open access. The Journal of Aca-
demic Librarianship, Forthcoming, 2017. https://doi.org/10.1016/j.acalib.2017.06.004
11. Gaines A.. From concerned to cautiously optimistic: Assessing faculty perceptions and knowledge of
open access in a campus-wide study. Journal of Librarianship and Scholarly Communication, 3(1):
eP1212, 2015. https://doi.org/10.7710/2162-3309.1212
12. Hurrell C. and Meijer-Kline K.. Open access up for review: academic attitudes towards open access
publishing in relation to tenure and promotion. Open Excess, 1(2), 2011. http://tsc.library.ubc.ca/
index.php/journal4/article/view/104.
13. The University of California Office of Scholarly Communication and the California Digital Library
eScholarship Program and Greenhouse Associates, Inc. Faculty attitudes and behaviors regarding
scholarly communication: Survey findings from the University of California. 2007. http://www.lib.
berkeley.edu/userresearch/surveys/2007_CDL_OSC_Survey.pdf.
14. I. Kuchma. Results of the SOAP Survey: A preliminary overview of the situation in EIFL partner coun-
tries. Electronic Information for Libraries, 2011. http://www.eifl.net/resources/results-soap-survey-
preliminary-overview-situation-eifl-partner-countries.
15. Nokes K.M., Nelson D.A., McDonald M.A., Hacker K., Gosse J., Sanford B., and Opel S.. Faculty per-
ceptions of how community-engaged research is valued in tenure, promotion, and retention decisions.
Clinical and Translational Science, 6(4): 259–266, 2013. https://doi.org/10.1111/cts.12077 PMID:
23919360
16. C. Wolff, A.B. Rod, and R.C. Schonfeld. Ithaka S+R Jisc RLUK UK Survey of Academics 2015. http://
digitalcommons.unl.edu/scholcom/17/.
17. Neylon, C. Openness in scholarship: A return to core values? In L. Chan and F. Loizides, editors,
Expanding Perspectives on Open Science: Communities, Cultures and Diversity in Concepts and
Practices, pages 6–17. Proceedings of the 21st International Conference on Electronic Publishing,
IOS Press, 2017. http://ebooks.iospress.nl/publication/46638.
18. L. Chan, D. Cuplinskas, M. Eisen, F. Friend, Y. Genova, J-C. Guédon, M. Hagemann, S. Harnad, R.
Johnson, R. Kupryte, M. La Manna, I. Rév, M. Segbert, S. de Souza, P. Suber, and J. Velterop. Budapest
Open Access Initiative, 2002. http://www.budapestopenaccessinitiative.org/.
19. P.O. Brown, D. Cabell, A. Chakravarti, B. Cohen, T. Delamothe, M. Eisen, L. Grivell, J-C. Guédon, R.
S. Hawley, R.K. Johnson, M.W. Kirschner, D. Lipman, A.P. Lutzker, E. Marincola, R.J. Roberts, G.M.
Rubin, R. Schloegl, V. Siegel, A.D. So, P. Suber, H.E. Varmus, J. Velterop, M.J. Walport, and L. Watson.
Bethesda Statement on Open Access Publishing, 2003. http://legacy.earlham.edu/~peters/fos/
bethesda.htm.
20. Max-Planck-Gesellschaft. Berlin Declaration on Open Access to Knowledge in the Sciences and
Humanities, 2003. https://openaccess.mpg.de/Berlin-Declaration.
21. G. Bueno de la Fuente, Foster Group. What is open science? Introduction. https://www.
fosteropenscience.eu/content/what-open-science-introduction.
22. Fecher B. and Friesike S.. Open science: One term, five schools of thought. In Bartling S. and Friesike
S., editors, Opening Science: The Evolving Guide on How the Internet is Changing Research, Collaboration
and Scholarly Publishing, pages 17–47. SpringerOpen, 2014. https://doi.org/10.1007/978-3-
319-00026-8_2
23. Open Source Initiative. The Open Source Definition, 2007. https://opensource.org/osd.
24. Murray-Rust, P. and Neylon, C. and Pollock, R. and Wilbanks, J. Panton Principles, Principles for
open data in science, 2010. http://pantonprinciples.org/.
25. Kansa E.C.. The need to humanize open science. In Moore S.A., editor, Issues in Open Research
Data, pages 31–58. Ubiquity Press, 2014. https://doi.org/10.5334/ban.c
26. P. Kraker. Open science and the disciplinary culture clash—why is it so hard to reach a consensus?
The London School of Economics and Political Science, LSE Impact Blog, 2014. http://blogs.lse.ac.
uk/impactofsocialsciences/2014/10/29/open-science-disciplinary-culture-clash/.
27. K. Mayer. From Science 2.0 to Open Science—Turning rhetoric into action? Social Technology Soci-
ety Social Networking (STCSN) E-Letter, 3(1), 2015. http://stcsn.ieee.net/e-letter/stcsn-e-letter-vol-
3-no-1/from-science-2-0-to-open-science.
28. Greenhow C. and Gleason B.. Social scholarship: Reconsidering scholarly practices in the age of
social media. British Journal of Educational Technology, 45(3): 392–402, 2014. https://doi.org/10.
1111/bjet.12150
29. Veletsianos G. and Kimmons R.. Assumptions and challenges of open scholarship. The International
Review of Research in Open and Distributed Learning, 13(4): 166–189, 2012. https://doi.org/10.
19173/irrodl.v13i4.1313
30. Open Knowledge International. The Open Definition. http://opendefinition.org/.
31. Tennant J.P., Waldner F., Jacques D.C., Masuzzo P., Collister L.B., and Hartgerink C.H.J.. The aca-
demic, economic and societal impacts of open access: an evidence-based review [version 3; referees:
4 approved, 1 approved with reservations]. F1000Research, 5: 632, 2016. https://doi.org/10.12688/
f1000research.8460.3 PMID: 27158456
32. Willinsky J.. The Access Principle: The Case for Open Access to Research and Scholarship. MIT
Press, 2006. http://hdl.handle.net/10150/106529.
33. Souter D.. Towards inclusive knowledge societies: A review of UNESCO’s action in implementing the
WSIS outcomes. UNESCO, 2010. http://bit.ly/UNESCO_inclusive.
34. Laakso M. and Björk B-C.. Anatomy of open access publishing: A study of longitudinal development
and internal structure. BMC Medicine, 10(1): 124, 2012. https://doi.org/10.1186/1741-7015-10-
124 PMID: 23088823
35. Y. Gargouri, V. Larivière, Y. Gingras, L. Carr, and S. Harnad. Green and gold open access percent-
ages and growth, by discipline, 2012. Preprint. https://arxiv.org/abs/1206.3664. Cited 9 September
2017.
36. Piwowar H., Priem J., Larivière V., Alperin J.P., Matthias L., Norlander B., Farley A., West J., and
Haustein S.. The State of OA: A large-scale analysis of the prevalence and impact of Open Access
articles, 2017. Preprint. PeerJ Preprints: e3119v1. Cited 9 September 2017.
37. Aronson B.. Improving online access to medical information for low-income countries. New England
Journal of Medicine, 350(10): 966–968, 2004. https://doi.org/10.1056/NEJMp048009 PMID:
14999107
38. RPP Noticias. Científicos ya no pueden acceder a crucial banco de datos por falta de dinero, 2016.
http://rpp.pe/ciencia/mas-ciencia/cientificos-ya-no-pueden-acceder-a-crucial-banco-de-datos-por-
falta-de-dinero-noticia-1016735.
39. Schiermeier Q. and Mega E.R.. Scientists in Germany, Peru and Taiwan to lose access to Elsevier
journals. Nature, 541: 13, 2016. https://doi.org/10.1038/nature.2016.21223 PMID: 28054621
40. Kingkade T. College textbook prices increasing faster than tuition and inflation. Huffington Post,
2013. http://www.huffingtonpost.com.mx/entry/college-textbook-prices-increase_n_240915.
41. R. L. Donaldson, Shen E., and Florida Virtual Campus. 2016 Student Textbook and Course Materials
Survey. The Orange Grove, 2016. https://florida.theorangegrove.org/og/items/3a65c507-2510-42d7-
814c-ffdefd394b6c/1/.
42. Jaschik S.. Can a professor be forced to assign a $180 textbook? Inside Higher Ed, 2015. https://
www.insidehighered.com/news/2015/10/26/dispute-required-math-textbook-escalates-broader-
debate-about-costs-and-academic.
43. Hall B.L., Jackson E.T., Tandon R., Fontan J-M., and Lall N., editors. Knowledge, democracy and
action: Community-university research partnerships in global perspectives. Manchester University
Press, 2013.
44. S. Crissinger. A critical take on OER practices: Interrogating commercialization, colonialism, and con-
tent. In The Library With The Lead Pipe, 2015. http://www.inthelibrarywiththeleadpipe.org/2015/a-
critical-take-on-oer-practices-interrogating-commercialization-colonialism-and-content/.
45. L. Czerniewicz. Inequitable power dynamics of global knowledge production and exchange must be
confronted head on, 2013. http://blogs.lse.ac.uk/impactofsocialsciences/2013/04/29/redrawing-the-
map-from-access-to-participation/.
46. Silvertown J.. A new dawn for citizen science. Trends in Ecology & Evolution, 24(9): 467–471,
2009. https://doi.org/10.1016/j.tree.2009.03.017 PMID: 19586682
47. Shapin S.. The ivory tower: the history of a figure of speech and its cultural uses. The British Journal
for the History of Science, 45(1): 1–27, 2012. https://doi.org/10.1017/S0007087412000118
48. Bossu C., Bull D., and Brown M.. Opening up Down Under: the role of open educational resources in
promoting social inclusion in Australia. Distance Education, 33(2): 151–164, 2012. https://doi.org/
10.1080/01587919.2012.692050
49. Conole G.. Fostering social inclusion through open educational resources (OER). Distance Education,
33(2): 131–134, 2012. https://doi.org/10.1080/01587919.2012.700563
50. Kanwar A., Kodhandaraman B., and Umar A.. Toward sustainable open education resources: A per-
spective from the global south. The American Journal of Distance Education, 24(2): 65–80, 2010.
https://doi.org/10.1080/08923641003696588
51. Kumar M.S.V.. Open educational resources in India’s national development. Open Learning, 24(1):
77–84, 2009. https://doi.org/10.1080/02680510802627860
52. Centre for Educational Research and Innovation, Organisation for Economic Co-operation and Devel-
opment. Giving Knowledge for Free: The Emergence of Open Educational Resources. OECD Pub-
lishing, 2007. https://www.oecd.org/edu/ceri/38654317.pdf.
53. S. Carson. MIT opencourseware 2005 program evaluation findings report. 2006. https://ocw.mit.edu/
ans7870/global/05_Prog_Eval_Report_Final.pdf.
54. Yuan L. and Powell S.. MOOCs and open education: Implications for higher education. JISC and
CETIS, 2013. http://publications.cetis.org.uk/wp-content/uploads/2013/03/MOOCs-and-Open-
Education.pdf.
55. Dhanarajan G. and Porter D., editors. Open Educational Resources:An Asian Perspective. Common-
wealth of Learning and OER Asia, 2013. http://hdl.handle.net/11599/23.
56. Universidad Nacional Autónoma de México. Portal de Estadística Universitaria. http://www.
estadistica.unam.mx/series_inst/.
57. Senack, E. and U.S. Public Interest Research Group (PIRG) Education Fund and The Student PIRGs.
Fixing the broken textbook market: How students respond to high textbook costs and demand alterna-
tives, 2014. http://www.uspirg.org/reports/usp/fixing-broken-textbook-market.
58. Hilton J. III, Fischer L., Wiley D., and William L.. Maintaining momentum toward graduation: OER and
the course throughput rate. International Review of Research in Open and Distributed Learning, 17(6),
2016. Available from: http://www.irrodl.org/index.php/irrodl/article/view/2686/3967.
59. Straumsheim, C. Berkeley will delete online content. Inside Higher Ed, 2017. https://www.
insidehighered.com/news/2017/03/06/u-california-berkeley-delete-publicly-available-educational-
content.
60. D. Harley, S.K. Acord, S. Earl-Novell, S. Lawrence, and C.J. King. Assessing the Future Landscape of
Scholarly Communication: An Exploration of Faculty Values and Needs in Seven Disciplines. Center
for Studies in Higher Education, UC Berkeley, 2010. http://escholarship.org/uc/cshe_fsc.
61. The University of British Columbia. Guide to Reappointment, Promotion and Tenure Procedures at
UBC, 2016/17. http://www.hr.ubc.ca/faculty-relations/files/SAC-Guide.pdf.
62. Cornell University. University Mission. https://www.cornell.edu/about/mission.cfm.
63. J.P. Alperin. The public impact of Latin America’s approach to open access. PhD thesis, Stanford Uni-
versity, 2015. https://purl.stanford.edu/jr256tk1194.
64. Young A. and Verhulst S. The Global Impact of Open Data:Key Findings from Detailed Case Studies
Around the World. O’Reilly Media, Inc., 2016. http://www.oreilly.com/data/free/the-global-impact-of-
open-data.csp.
65. Earlham Institute. Embracing innovation through technology. http://www.earlham.ac.uk/impact-story-
embracing-innovation-through-technology.
66. Murray F., Aghion P., Dewatripont M., Kolev J., and Stern S.. Of mice and academics: Examining the
effect of openness on innovation. American Economic Journal: Economic Policy, 8(1): 212–252,
2016. https://doi.org/10.1257/pol.20140062
67. U.S. Department of Energy Office of Science, Office of Biological and Environmental Research. Poli-
cies on Release of Human Genomic Sequence Data (Bermuda-Quality Sequence). Human Genome
Project Information Archive. http://web.ornl.gov/sci/techresources/Human_Genome/research/
bermuda.shtml.
68. Contreras J.L.. Bermuda’s legacy: Policy, patents and the design of the genome commons. Minnesota
Journal of Law, Science & Technology, 12(1): 61–125, 2011. Available from: http://scholarship.
law.umn.edu/mjlst/vol12/iss1/5.
69. Woelfle M., Olliaro P., and Todd M.H.. Open science is a research accelerator. Nature Chemistry, 3:
745–748, 2011. https://doi.org/10.1038/nchem.1149 PMID: 21941234
70. Robertson M.N., Ylioja P.M., Williamson A.E., Woelfle M., Robins M., Badiola K.A., Willis P., Olliaro
P., Wells T.N.C., and Todd M.H.. Open source drug discovery–a limited tutorial. Parasitology, 141(1):
148–157, 2014. https://doi.org/10.1017/S0031182013001121 PMID: 23985301
71. Gowers T. and Nielsen M.. Massively collaborative mathematics. Nature, 461(7266): 879–881,
2009. https://doi.org/10.1038/461879a PMID: 19829354
72. World Health Organization. Developing global norms for sharing data and results during public health
emergencies, 2015. http://www.who.int/medicines/ebola-treatment/blueprint_phe_data-share-results/
en/.
73. Wellcome Trust. Global scientific community commits to sharing data on Zika, 2016. https://wellcome.
ac.uk/press-release/global-scientific-community-commits-sharing-data-zika.
74. Owens B.. Montreal institute going ’open’ to accelerate science. Science, 351(6271): 329,
2016. https://doi.org/10.1126/science.351.6271.329 PMID: 26797995
75. Rouleau G.. Open Science at an institutional level: an interview with Guy Rouleau. Genome Biology,
18(1): 14, 2017. https://doi.org/10.1186/s13059-017-1152-z PMID: 28109193
76. Love B.J.. Do university patents pay off? Evidence from a survey of university inventors in computer
science and electrical engineering. Yale Journal of Law and Technology, 16(2): 285–343, 2014.
Available from: http://digitalcommons.law.yale.edu/yjolt/vol16/iss2/2.
77. W.D. Valdivia. University start-ups: Critical for improving technology transfer. Center for Technology
Innovation at Brookings, 2013. https://www.brookings.edu/research/university-start-ups-critical-for-
improving-technology-transfer/.
78. Open Science Collaboration. Estimating the reproducibility of psychological science. Science, 349(6251):
aac4716, 2015. https://doi.org/10.1126/science.aac4716 PMID: 26315443
79. Prinz F., Schlange T., and Asadullah K.. Believe it or not: how much can we rely on published data on
potential drug targets? Nature Reviews Drug Discovery, 10(9): 712, 2011. https://doi.org/10.
1038/nrd3439-c1 PMID: 21892149
80. Begley C.G. and Ellis L.M.. Drug development: Raise standards for preclinical cancer research.
Nature, 483(7391): 531–533, 2012. https://doi.org/10.1038/483531a PMID: 22460880
81. Baker M.. 1,500 scientists lift the lid on reproducibility. Nature, 533: 452–454, 2016. https://doi.org/
10.1038/533452a PMID: 27225100
82. van Noorden R.. Sluggish data sharing hampers reproducibility effort. Nature, 2015. https://doi.org/10.
1038/nature.2015.17694
83. Yong, E. How reliable are cancer studies? The Atlantic, 2017. https://www.theatlantic.com/science/
archive/2017/01/what-proportion-of-cancer-studies-are-reliable/513485/.
84. Womack R.P.. Research data in core journals in biology, chemistry, mathematics, and physics. PLoS
ONE, 10(12): e0143460, 2015. https://doi.org/10.1371/journal.pone.0143460 PMID: 26636676
85. L.A. Barba. Reproducibility PI Manifesto. figshare, 2012. https://doi.org/10.6084/m9.figshare.104539.
v1.
86. Barba L.A.. The hard road to reproducibility. Science, 354(6308): 142, 2016. https://doi.org/
10.1126/science.354.6308.142 PMID: 27846503
87. Munafò M.R., Nosek B.A., Bishop D.V.M., Button K.S., Chambers C.D., du Sert N.P., Simonsohn U.,
Wagenmakers E-J., Ware J.J., and Ioannidis J.P.A.. A manifesto for reproducible science. Nature
Human Behaviour, 1(0021): 1–9, 2017. https://doi.org/10.1038/s41562-016-0021
88. Kaiser J.. Cancer studies pass reproducibility test. Science, 2017. https://doi.org/10.1126/science.
aan7016
89. Center for Open Science. Preregistration Challenge. https://cos.io/prereg/.
90. Wellcome Trust. We now accept preprints in grant applications, 2017. https://wellcome.ac.uk/news/
we-now-accept-preprints-grant-applications.
91. National Institutes of Health. Reporting preprints and other interim research products. Notice Number:
NOT-OD-17-050, 2017. https://grants.nih.gov/grants/guide/notice-files/NOT-OD-17-050.html.
92. E.C. McKiernan. Being open as an early career researcher. figshare, 2014. https://doi.org/10.6084/
m9.figshare.954994.v1.
93. A. Goben. A personal open access plan. Hedgehog Librarian, 2012. http://hedgehoglibrarian.com/
2012/02/22/a-personal-open-access-plan/.
94. M.A. Smale. Making a Pledge. from the Library of Maura, 2011. https://msmale.commons.gc.cuny.
edu/2011/10/23/making-a-pledge/.
95. S. Wheeler. Sharp practice. Learning with ’e’s, 2011. http://www.steve-wheeler.co.uk/2011/09/sharp-
practice.html.
96. Aleksic J., Alexa A., Attwood T.K., Hong N.C., Dahl M., Davey R., Dinkel H., Förstner K., Grigorov I.,
Hériché J-K., Lahti L., MacLean D., Markie M.L., Molloy J., Schneider M.V., Scott C., Smith-Unna R.,
and Vieira B.M.. The open science peer review oath [version 2; referees: 4 approved, 1 approved with
reservations]. F1000Research, 3: 271, 2014. https://doi.org/10.12688/f1000research.5686.2 PMID:
25653839
97. A. Holcombe. Stronger Pledges. Open Access Pledge. http://www.openaccesspledge.com/?page_
id=21.
98. Solomon D.J. and Björk B-C.. A study of open access journals using article processing charges. Journal
of the Association for Information Science and Technology, 63(8): 1485–1495, 2012. https://
doi.org/10.1002/asi.22673
99. Morrison H., Salhab J., Calvé-Genest A., and Horava T.. Open access article processing charges:
DOAJ survey May 2014. Publications, 3(1): 1–16, 2015. https://doi.org/10.3390/
publications3010001
100. Solomon D. and Björk B-C.. Article processing charges for open access publication—the situation for
research intensive universities in the USA and Canada. PeerJ, 4: e2264, 2016. https://doi.org/10.
7717/peerj.2264 PMID: 27547569
101. The World Bank. Data: Mexico. https://data.worldbank.org/country/mexico.
102. Curb L.A. and Abramson C.I.. An examination of author-paid charges in science journals. Comprehen-
sive Psychology, 1: 01–17, 2012. https://doi.org/10.2466/01.17.CP.1.4
103. Alperin J.P., Fischman G., and Willinsky J.. Open access and scholarly publishing in Latin America:
Ten flavours and a few reflections. Liinc em Revista, 4(2): 172–185, 2008. https://doi.org/10.
18617/liinc.v4i2.269
104. Neylon C.. As a researcher. . .I’m a bit bloody fed up with data management. Science in the
Open, 2017. http://cameronneylon.net/blog/as-a-researcher-im-a-bit-bloody-fed-up-with-data-
management/.
105. Stodden V. and Miguez S.. Best practices for computational science: Software infrastructure and envi-
ronments for reproducible and extensible research. Journal of Open Research Software, 2(1):
e21, 2014. https://doi.org/10.5334/jors.ay
106. Eglen S., Marwick B., Halchenko Y., Hanke M., Sufi S., Gleeson P., Silver R.A., Davison A., Lanyon
L., Abrams M., Wachtler T., Willshaw D.J., Pouzat C., and Poline J-B.. Towards standard practices for
sharing computer code and programs in neuroscience. Nature Neuroscience, 20(6): 770–773,
2017. https://doi.org/10.1038/nn.4550 PMID: 28542156
107. Blischak J.D., Davenport E.R., and Wilson G.. A quick introduction to version control with Git and
GitHub. PLoS Comput Biol, 12(1): e1004668, 2016. https://doi.org/10.1371/journal.pcbi.1004668
PMID: 26785377
108. Ram K.. Git can facilitate greater reproducibility and increased transparency in science. Source Code
for Biology and Medicine, 8: 7, 2013. https://doi.org/10.1186/1751-0473-8-7 PMID: 23448176
109. Kluyver T., Ragan-Kelley B., Pérez F., Granger B., Bussonnier M., Frederic J., Kelley K., Hamrick J.,
Grout J., Corlay S., Ivanov P., Avila D., Abdalla S., Willing C., and Jupyter Development Team. Jupyter
notebooks—a publishing format for reproducible computational workflows. In Loizides F. and
Schmidt B., editors, Positioning and Power in Academic Publishing: Players, Agents and Agendas,
pages 87–90. 2016. https://doi.org/10.3233/978-1-61499-649-1-87
110. CITSADMN Site License Working Group and Conlon, M. A software acquisition proposal. University
of Florida, 1994. http://nersp.nerdc.ufl.edu/~oits/UFRFC03.pdf.
111. University of Washington, Information Technology. UW-IT’s annual budget: FY 2017. https://www.
washington.edu/uwit/2016-partnerships/annual-budget-fy17/.
112. GNU Operating System, Free Software Foundation. What is free software? https://www.gnu.org/
philosophy/free-sw.en.html.
113. Young, B. Open source is here to stay. ZDNet, 2000. http://www.zdnet.com/article/open-source-is-
here-to-stay/.
114. University College London. UCL vision, aims and values. http://www.ucl.ac.uk/about/what/vision-
aims-values.
115. The University of Alabama. Mission & Objectives. https://www.ua.edu/about/mission.
116. Marrero D.G., Hardwick E.J., Staten L.K., Savaiano D.A., Odell J.D., Comer K.F., and Saha C.. Pro-
motion and tenure for community-engaged research: An examination of promotion and tenure support
for community-engaged research at three universities collaborating through a clinical and translational
science award. Clinical and Translational Science, 6(3): 204–208, 2013. https://doi.org/10.1111/
cts.12061 PMID: 23751026
117. Otten J.J., Dodson E.A., Fleischhacker S., Siddiqi S., and Quinn E.L.. Getting research to the policy
table: A qualitative study with public health researchers on engaging with policy makers. Preventing
Chronic Disease, 12: 140546, 2015. https://doi.org/10.5888/pcd12.140546 PMID: 25927604
118. Acord S.K. and Harley D.. Credit, time, and personality: The human challenges to sharing scholarly
work using Web 2.0. New Media & Society, 15(3): 379–397, 2013. https://doi.org/10.1177/
1461444812465140
119. DePaul University, Office of Academic Affairs. Mission Statement. https://offices.depaul.edu/oaa/key-
initiatives/innovation-through-collaboration/Pages/mission-and-goals.aspx.
120. University of Oxford. Strategic Plan 2013–18. https://www.ox.ac.uk/about/organisation/strategic-plan.
121. Soares M.B.. Collaborative research in light of the prevailing criteria for promotion and tenure in acade-
mia. Genomics, 106(4): 193–195, 2015. https://doi.org/10.1016/j.ygeno.2015.07.009 PMID:
26232606
122. Seipel M.M.O.. Assessing publication for tenure. Journal of Social Work Education, 39(1): 79–88,
2003.
123. Wren J.D., Kozak K.Z., Johnson K.R., Deakyne S.J., Schilling L.M., and Dellavalle R.P.. The write
position: A survey of perceived contributions to papers based on byline position and number of
authors. EMBO reports, 8(11): 988–991, 2007. https://doi.org/10.1038/sj.embor.7401095 PMID:
17972896
124. R.C. Schonfeld and R. Housewright. Ithaka S+R Faculty survey 2009: Key strategic insights for librar-
ies, publishers, and societies. 2010. http://www.sr.ithaka.org/wp-content/uploads/2015/08/Faculty_
Study_2009.pdf.
125. Cheverie J.F., Boettcher J., and Buschman J.. Digital scholarship in the university tenure and promo-
tion process: A report on the sixth scholarly communication symposium at Georgetown University
Library. Journal of Scholarly Publishing, 40(3): 219–230, 2009. https://doi.org/10.3138/jsp.40.3.
219
126. Walker R.L., Sykes L., Hemmelgarn B.R., and Quan H.. Authors’ opinions on publication in relation to
annual performance assessment. BMC Medical Education, 10(1): 21, 2010. https://doi.org/10.
1186/1472-6920-10-21 PMID: 20214826
127. The Ohio State University, Office of Academic Affairs. Appointments, Promotion, and Tenure: Criteria
and Procedures for the Department of Emergency Medicine, 2016. https://oaa.osu.edu/assets/files/
governance/college-of-medicine/emergency-medicine/Emergency_Medicine_APT_2016-07-16.pdf.
128. Tulane University, School of Medicine. Guidelines for faculty appointments and promotion. http://
www2.tulane.edu/som/upload/P_H-Guidelines-06-07.pdf.
129. Brembs B., Button K., and Munafò M.. Deep impact: unintended consequences of journal rank. Frontiers
in Human Neuroscience, 7: 291, 2013. https://doi.org/10.3389/fnhum.2013.00291 PMID:
23805088
130. van Dalen H.P. and Henkens K.. Intended and unintended consequences of a publish-or-perish cul-
ture: A worldwide survey. CentER Discussion Paper Series, 2012–003, 2012. https://doi.org/10.2139/
ssrn.1983205
131. Berg J.M., Bhalla N., Bourne P.E., Chalfie M., Drubin D.G., Fraser J.S., Greider C.W., Hendricks M.,
Jones C., Kiley R., King S., Kirschner M.W., Krumholz H.M., Lehman R., Leptin M., Pulverer B.,
Rosenzweig B., Spiro J.E., Stebbins M., Strasser C., Swaminathan S., Turner P., Vale R.D., VijayRaghavan
K., and Wolberger C.. Preprints for the life sciences. Science, 352(6288): 899–901, 2016.
https://doi.org/10.1126/science.aaf9133 PMID: 27199406
132. R. Poynder. The OA Interviews: Bernard Rentier, Rector of the University of Liège. https://www.
richardpoynder.co.uk/Rentier_Interview.pdf.
133. Hicks D., Wouters P., Waltman L., De Rijcke S., and Rafols I.. The Leiden Manifesto for research met-
rics. Nature, 520(7548): 429–431, 2015. https://doi.org/10.1038/520429a PMID: 25903611
134. Austin A.E.. Faculty cultures, faculty values. New directions for institutional research, 19900 (68):0
61–74, 1990. https://doi.org/10.1002/ir.37019906807
135. Boyer E.L.. Scholarship Reconsidered:Priorities of the Professoriate. The Carnegie Foundation for
the Advancement of Teaching, Princeton University Press, 1990. https://eric.ed.gov/?id=ED326149.
136. Boyer E.L.. The scholarship of engagement. Bulletin of the American Academy of Arts and Sciences,
490 (7):0 18–33, 1996. https://doi.org/10.2307/3824459
PLOS Biology | https://doi.org/10.1371/journal.pbio.1002614 October 24, 2017 25 / 25