Open innovation in the public sector: drivers and
barriers for the adoption of Challenge.gov
Ines Mergel
Department of Politics and Public Administration, University of Konstanz, Konstanz, Germany
ABSTRACT
Online Open Innovation (OI) platforms like Challenge.gov are used to post public sector problem statements and to collect and evaluate ideas submitted by citizens, with the goal of increasing government innovation. Quantitative data extracted from contests posted to Challenge.gov and qualitative interviews with thirty-six public managers in fourteen federal departments contribute to the discovery and analysis of intra-, inter-, and extra-organizational factors that drive or hinder the implementation of OI in the public sector. The analysis shows that system-inherent barriers hinder public sector organizations from adopting this procedural and technological innovation. However, when the mandate of the innovation policy aligns with the mission of the organization, it opens opportunities for change in innovation acquisition and standard operating procedures.
KEYWORDS Online Open Innovation platforms; barriers for e-government adoption; government innovation;
crowdsourcing innovations
Introduction
Open innovation (OI) is the process of crowdsourcing solutions to organizational
problems to ensure organizational survival or renewal (Chesbrough 2003). Online
platforms are used to announce product design contests, such as Lego’s design of new
sceneries, or Netflix’s search for a new algorithm to suggest movies to its users. The
knowledge of traditional innovators, such as R&D departments in private sector organizations, is supplemented by bringing external amateurs, who are not part of the organization, into the product design process, using monetary incentives to encourage participation (Chesbrough and Crowther 2006).
Similarly, public sector organizations are using OI approaches to access knowledge
from citizens. For that purpose and as part of the Digital Government Strategy, the
US federal government established a new policy instrument to access knowledge of
external problem solvers, such as citizen scientists or citizen hackers (The White
House 2009). The policy and follow-up executive mandates (The White House 2010)
support the America Competes Act (2007) designed to increase economic develop-
ment and the effectiveness and efficiency of public service delivery. US federal
government agencies are using a shared online platform called Challenge.gov to
post their problem statements and collect ideas from citizens.
However, OI approaches are challenging for public sector organizations: the
traditional innovation acquisition process is highly regulated and follows strict
rules and regulations, while OI processes are by design open and have few rules.
In their analysis of over 3,000 European government agencies, Arundel et al. (2015) provide a three-pronged model of public sector innovation methods: bottom-up, policy-dependent, and knowledge-scanning. Government organizations either work with a group of pre-approved vendors and contractors who respond to requests for proposals, which are then internally vetted before a solution proposed by a vendor is implemented, or innovations are driven by policy mandates. The Affordable Care Act in the United
States for example led to the creation of a new online marketplace, HealthCare.gov, a
technological as well as procedural innovation that was not available in government
before. Selected government agencies, such as the Defense Advanced Research
Projects Agency (DARPA), are designed to create research collaborations with out-
siders, including university partners, to increase their access to innovations (Colatat 2015). These forms of internal innovation creation, which follow the standards of the bureaucratic governance process, are called ‘closed innovation’ (Felin and Zenger 2014). Stepping outside of the formalized innovation acquisition process with its contractual relationships and opening up the innovation process to amateur problem solvers is therefore a challenge for public sector organizations.
This article builds upon the existing OI literature in the private sector by adding a
new empirical context to enhance our understanding of barriers to innovative
e-governance practices. E-governance in this case is analysed to understand a tech-
nological innovation – the Challenge.gov platform – and a managerial innovation – a
change from traditional contract- and grant-based innovation acquisition to crowd-
sourcing of innovative problem solutions. The longitudinal study spans the time from
2010 to 2014 to understand how public managers perceive factors that hinder or
support the implementation process. The result is a multilevel analytical framework
of barriers that work against and drivers that foster the implementation of a techno-
logical and managerial innovation.
The key contributions of this article are to add a new empirical case that has not
been developed in the literature: OI in the context of government; and a multi-
dimensional analytical approach that not only takes in-house barriers of e-govern-
ment in a single organization into account, but also reviews drivers and barriers at
the intra-, inter-, and extra-organizational levels of analysis focusing on all federal
agencies engaged in OI practices.
The current OI literature focusses almost exclusively on private sector organiza-
tions. However, as Meijer (2015) states, the literatures on public innovation and e-governance need to go hand in hand. Both bodies of literature aim to understand
how government organizations can respond to societal challenges, changing demands
by citizens, and the need to provide services in a more efficient and effective manner.
The research questions include: What are the empirical factors that hinder the
implementation of OI practices in public sector organizations? And, What are the
empirical factors that drive the implementation of OI practices in public sector
organizations? To answer these questions, the article first provides a review of the
current state of the OI literature and then applies the OI concept to the public sector
context. The research design outlines the data collection and analysis steps combining
data from the online platform Challenge.gov and interviews with public managers.
The findings are organized along three levels of analysis and provide insights into
intra-, inter-, and extra-organizational factors for OI.
Literature review: OI in the public sector
OI is an umbrella term that describes processes, outcomes, and business models of a
new form of innovation creation. OI approaches appeared first in the private sector
allowing organizations to explore new clientele, market segments, or the invention of
new products and designs. Current OI studies focus predominantly on research and
development-intensive industry organizations (Murray et al. 2012; Villarroel 2013).
Practices and research have only minimally been expanded to the public sector.
The OI process
Chesbrough defined the OI process as a new form of purposive inflows and outflows
of knowledge to accelerate internal innovation and expand the markets for the
external use of innovation (2003). OI approaches are using crowdsourcing and peer
production processes to invite the public to co-create solutions for organizational
problems (Benkler 2006). Using Howe’s ‘wisdom of the crowds’ notion to add citizens’ knowledge in incremental pieces to the overall solution, crowdsourcing
processes need to be supported by an online platform to advertise and collect
solutions from a geographically distributed crowd (Howe 2006; Surowiecki 2004).
Some contest processes expand the innovation collection phase and include crowd
judging or crowd improvement of the selected solutions (Mergel 2015).
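As a rough illustration (not part of the original study, and not Challenge.gov’s actual workflow), the contest process described above can be sketched as a sequence of phases, with the crowd-judging and crowd-improvement steps optional:

from enum import Enum, auto

class ContestPhase(Enum):
    """Illustrative phases of an OI contest as described in the text."""
    ANNOUNCE_PROBLEM = auto()    # post the problem statement on the platform
    COLLECT_SOLUTIONS = auto()   # gather submissions from a distributed crowd
    CROWD_JUDGING = auto()       # optional: crowd evaluation of submissions
    CROWD_IMPROVEMENT = auto()   # optional: crowd refinement of selected solutions
    SELECT_AND_AWARD = auto()    # pick winners and pay out the incentive

# A minimal contest that skips the optional crowd phases:
basic_contest = [
    ContestPhase.ANNOUNCE_PROBLEM,
    ContestPhase.COLLECT_SOLUTIONS,
    ContestPhase.SELECT_AND_AWARD,
]
print([phase.name for phase in basic_contest])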
Factors influencing the adoption of OI processes
Factors influencing the adoption of OI processes in the private sector include the role
of individuals and teams in designing and managing contests (Du, Leten, and
Vanhaverbeke 2014; Ettlie and Elsenbach 2007). While in the private sector, employ-
ees sometimes aim to protect their current inventions, public sector employees are
not hired to constantly search for new markets, ideas for new products, or even to
actively seek out return customers to ensure the survival of the organization.
The openness versus closedness of the search process oftentimes influences the outcomes or the extent of OI adoption. Salge et al. (2013) found that openness in the search process has a curvilinear relationship with the outcomes of OI processes – both too little and too much openness weaken them – and that the definition of the goal of the contest is oftentimes problematic for organizations. If problem solvers do not feel that they can contribute a
meaningful solution, or the solution is too simple to attract the right problem solvers,
the contest might not lead to the expected outcomes (Jeppesen and Lakhani 2010;
Lakhani and Jeppesen 2007; Dahlandera and Piezunkab 2014). In public sector
organizations, this might not even be a problem of designing a sophisticated and
appropriate search process, instead regulatory government organizations in the
United States are prohibited by law to actively seek citizen input and are limited to
certain types of regulated interactions. This can in turn reduce the adoption and
implementation of OI processes in some government agencies.
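To make explicit what ‘curvilinear’ means in this context, a stylized quadratic specification (a hypothetical sketch, not the model estimated by Salge et al. 2013) would be:

\[
\text{Outcome}_i = \beta_0 + \beta_1\,\text{Openness}_i + \beta_2\,\text{Openness}_i^2 + \varepsilon_i,
\qquad \beta_1 > 0,\ \beta_2 < 0,
\]

so that expected outcomes peak at an intermediate level of openness, \(\text{Openness}^{*} = -\beta_1/(2\beta_2)\), and decline when the search process is either too closed or too open.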
At the organizational or system level, existing research has focussed on the general
cultural climate that supports innovation acquisition. Opening the organizational
boundaries and subsequently agreeing to support the initial costs to design and (re)
organize the organizational procedures to allow an OI approach requires a paradigm
shift of top management as well as organizational units involved in the implementa-
tion process (Murray et al. 2012). In the public sector, changes in standard operating
procedures need to be vetted, tested, and approved by top management and flexibility
in adjusting organizational rules rarely exists (DeHart-Davis, Chen, and Little 2013).
Similarly, the existing OI literature looks at these issues from an inter-organiza-
tional perspective in which OI occurs, such as policy development and the need for
implementation of new governance approaches to OI – deciding whether the system as a whole allows an open or closed innovation approach (Felin and Zenger 2014).
These decisions are rarely made bottom-up in the public sector. Instead all processes
and interfaces with the public or other organizations are designed to ensure account-
ability. In government, public managers perceive the existing red tape – a set of rules and regulations governing an organization’s behaviour – as a hindrance to the adoption of innovations (Rainey, Pandey, and Bozeman 1995). However, acquisition procedures following a request-for-proposals process and formal contracts with contractors and vendors lead to a maximum of accountability and reduce risk for
contract managers.
Extra-organizational factors focus on the role of stakeholders located outside the
organization and include the role of users and consumers of OI processes, the
changing needs of society, and recent technological developments. Relevant questions include, for example, whether external problem solvers are willing to participate in the process and whether an existing OI network and community might be interested in taking on problem solving for the organization (West and Lakhani 2008; Piezunka and Dahlander 2015). Roper et al. (2013) have shown that opening the organization to knowledge in-flows can in turn have positive externalities and influence a positive climate and image of the organization. As a result, openness incentivizes problem
solvers to provide their knowledge and help create a solution (Fu 2012). However, the
relationship between the solution seeker and solvers can result in an unbalanced
power relationship with different incentives for each participating party
(Gambardella and Panico 2014).
Other extra-organizational factors – outside the direct influence sphere of the focal organization – include larger societal factors, such as the trend of using crowdsour-
cing and crowdfunding approaches as an acceptable interaction mechanism between
organizations and stakeholders (Howe 2006; Shirky 2008). This aspect is closely
coupled with the development of public participation platforms and with increasing general digital literacy, which allows stakeholders to participate in the process. In addition, societal aspects also include new forms of industry or contractor relationships.
Previous research on OI in the public sector found several important benefits for
public sector organizations. Mergel and Desouza (2013) found that OI approaches are used to increase awareness of changes in policy or to recruit eligible parties and gain their attention for programmes. The main goal is to reach otherwise disconnected
parts of the population who are never in contact with government but might have
valuable insights on how public service delivery should be designed to reach them.
Contests submitted by public managers on Challenge.gov mostly focus on aware-
ness raising for societal problems and the availability of public services and pro-
grammes, public service improvement to create citizen satisfaction in the form of speed
or reliability of public services, as well as knowledge-seeking initiatives or the call for
technical solutions (Mergel et al. 2014). Rarely do OI processes lead to radical or
disruptive innovations in the public sector. The most innovative outcomes are
realized in science and technology-oriented agencies, including NASA, which ran a contest to capture meteors or change their trajectory to avoid collision with Earth
(NASA n.d.).
Other related engagement concepts in the public sector focus on co-production of
public services (Joshi and Moore 2004), co-development of public policy in form of
e-rulemaking processes (Coglianese 2004), or collaborations with the public in
budgeting processes (de Sousa Santos 1998). These procedures invite citizens to create a public good and, through their early participation in governance processes, increase trust, accountability, and ultimately buy-in for the final budget, policy, or service. However, OI processes by design go beyond localized communities who are directly affected by a policy; instead, the recruitment of
problem solvers is designed to access knowledge from mostly nonprofessional pro-
blem solvers who are usually not part of the participation processes. In addition, OI
projects incentivize participation with monetary prizes that can be of smaller
denomination but oftentimes reflect the value an organization would pay for a
contractor or vendor to provide the solution through a contract or grant.
As Lee et al. (2012) state, national-level policies for the implementation of OI approaches have only recently been designed in the public sector, and government organizations are in the very early stages of the implementation process. In
an environment without prior experience of opening up the formal innovation
creation process, with multidimensional innovative practices that include techno-
logical innovations, process innovations, and outcome innovations, it is therefore
important to understand how government organizations perceive the challenges of
a new governance instrument.
Equally important for the development of a theory of OI is an understanding of
the factors that foster the implementation. The research framework presented in this
article explores these factors at three different levels of analysis: (a) extra-organiza-
tional and societal factors; (b) inter-organizational or system-wide factors that influ-
ence government organizations as a whole and in their interactions with each other;
and (c) intra-organizational factors that are individual to each agency, its specific
political context, mission, and organizational culture.
Research design: data collection and analysis
The sample for studying OI in the public sector includes all US federal government
agencies using the crowdsourcing platform Challenge.gov from 2010 to 2014, the first
5 years of its existence. Agencies in this sample are comparable because they all
received the same Presidential mandate, operate in comparable political environ-
ments, and simultaneously have to find ways to implement the mandate. This full-
census approach not only allows the inclusion of the whole population of adopters
but also focusses the data analysis efforts on similar cases that are facing the same
technological, cultural, and organizational circumstances when they are adopting and
implementing a new policy. The range of participating agencies is wide enough to
understand systematic differences between agencies with a variety of missions and
diverse sets of stakeholders. The sample might have limitations, because it does not
include non-adopters.
The data collection was designed in two phases: in the first phase, contests posted by
federal agencies and departments during the first 5 years of existence of Challenge.gov
were extracted from the platform. A research team reviewed problem statements, target
problem solvers, monetary and non-monetary incentives, duration of the contests,
possible indications of different contest phases, and expected outcomes. In iterative
discussions with two research assistants, each contest was analysed, compared, and
categorized according to the researchers’ judgments. Challenges were coded based on
similarities of problem statements and categorized into awareness, service, knowledge,
and technical solutions. The contests were sorted by agency and year posted. The result
of this exploration is that a total of thirty-six agencies used Challenge.gov to post their public management problems, representing fourteen of the fifteen departments of the US federal government. The site itself allowed only for limited data collection covering the data points listed above and did not reveal any information about internal decision-making or strategic thinking behind the use of the site.
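As an illustration of this aggregation step, a minimal sketch is shown below; the record fields and category labels follow the description above and are hypothetical, not the platform’s actual data model:

from collections import Counter
from dataclasses import dataclass

@dataclass
class Contest:
    """One hand-coded contest record; field names are illustrative only."""
    agency: str
    department: str
    year: int
    category: str  # 'awareness', 'service', 'knowledge', or 'technical'

def summarize(contests: list) -> dict:
    """Aggregate coded contests by category, agency, and year posted."""
    return {
        "by_category": Counter(c.category for c in contests),
        "by_agency": Counter(c.agency for c in contests),
        "by_year": Counter(c.year for c in contests),
        "departments": sorted({c.department for c in contests}),
    }

# Toy usage with invented records:
sample = [
    Contest("NASA", "Independent agency", 2011, "technical"),
    Contest("HHS", "Health and Human Services", 2012, "awareness"),
]
print(summarize(sample))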
To compensate for the limited availability of public data, in a second data collec-
tion phase, the agencies that posted contests on Challenge.gov were invited to
participate in a qualitative interview study to elicit the perceptions of public managers
in charge of designing and managing prizes and contests for their agencies. Thirty-six
public managers participated in semi-structured interviews that lasted between 60
and 90 min.
The interview outline was informed by the analysis of the contests and the existing
literature on OI. The questions focus on organizational factors influencing the
decision to follow the Presidential mandate, the internal strategic and managerial
decision-making processes that preceded posting of challenges online, marketing
strategies to reach the desired audience of problem solvers, intended and actual
outcomes, implementation of the contest outcomes, as well as lessons drawn from
the process. Each interview was recorded with the permission of the subjects,
transcribed verbatim, and hand-coded line-by-line.
The narrative analysis was designed to develop a case-oriented understanding of the
phenomenon itself, but from the viewpoint of public managers who are participants in
the social phenomenon. This type of interpretative approach helps to gain a deeper
understanding of intra-organizational decision-making processes, strategic conversa-
tions behind closed doors, managerial implementation problems, and the trial and error that occurs. These data are not publicly observable through an online ethnographic
approach and cannot be experimentally examined or measured using quantitative
metrics, such as intensity, amount, or frequency measures (Denzin and Lincoln
2011). The goal is not to derive causal relationships; instead the narrative analysis of
the interviews with programme managers helps to identify common themes among the
agencies included in the study (Miles, Huberman, and Saldana 2014).
Findings
The findings of how the US federal government agencies adopted the use of Challenge.
gov to post OI contests can be divided into two main sets of factors: (a) drivers for the
adoption of OI approaches; and (b) barriers for the adoption of OI approaches. Each
set of factors is divided into intra-, inter-, and extra-organizational factors.
Finding 1: drivers for OI
Except for NASA and DOD’s DARPA, none of the agencies posting challenges to
Challenge.gov expressed an initial internal need to solve certain types of problems.
As a matter of fact, all agencies said that they have in-house experts and, in cases where they cannot find a solution, they hire contractors. Needs were created through staff of the Office of Science and Technology Policy (OSTP), who went through the secretaries of departments or administrators of
agencies to convince top leadership to test the new policy instrument. The national
priorities of the White House’s Open Government initiative have driven initial
tests, such as mobile application challenges to encourage the use of datasets posted
to Data.gov:
We worked with the Executive Office of the President, which got us involved in the
challenges, and then Safety.data.gov. We are trying to build those bridges within our
organization and outside of our organization, to ideally help the field. It came about in a
very straightforward manner. We are meeting regularly with the CTO’s office of the
President. The then Technology Officer, Aneesh Chopra, told us that he wanted us to do a
challenge and he wanted it out in two weeks.
Findings show that the main driver for the adoption of OI approaches is top-down mandates, which then lead to early experimentation with contests and prizes. Rarely do agencies go beyond the formal mandate to adopt OI, except for science and technology agencies like NASA, which had gained experience inviting professional coders to solve technology problems using the OI platform TopCoder. All other agencies jumped on the bandwagon as a result of several follow-up executive orders and memos, such as the Innovator’s Toolkit (The White House 2010). However, before any of them was able to post and promote their contests on Challenge.gov, they had to overcome institutional, legal, managerial, and cultural barriers. This process can take up to 2 years before an agency is finally able to post an OI contest.
Intra-organizational factors
The intra-organizational drivers emerge at two different stages of the process: (1)
before the policy was officially published, and (2) after the policy instrument was
available to agencies. Pre-implementation, few agencies used OI approaches proac-
tively to initiate external searches for problem solvers outside the traditional grants
and contracts instruments. Only one agency was already interested in exploring
opportunities to invite problem solvers into the organization:
We wanted to use innovations, or use the tools and resources available to us to do things that
are innovative to solve some of the [problems], meet some of our strategic goals, that we may
not be able to get at conveniently or expeditiously through more traditional mechanisms.
As a matter of fact, most government employees experienced the need to access
external knowledge free from the constraining context of the existing rules and
regulations.
Top-down management decision
In most of the agencies, the pressure to adopt OI approaches was pushed down from
the top of the agency to the implementers. Even the simple request to populate the
Challenge.gov platform came from top management, as one public manager
points out:
I’m not sure if it came from our immediate supervisor, or the supervisor’s supervisor: but it
came down that Challenge.gov is launching; we need to have challenges to be there when it
launches, because we were one of the first challenges on Challenge.gov. So it was told us that
we needed to do a challenge, and to think of a challenge.
This shows that adoption was not an emergent bottom-up, experimental process; instead, public managers were told to align their procedures with the policy requests. The extent to which agencies then started to internally experiment with a crowdsourcing process is driven by the mission – the core task of an agency. As an example,
a public management problem occurs that influences all subagencies and teams
within a large department and still cannot be solved internally. As a result, an idea
emerges, needs top-down approval to be posted on Challenge.gov, and then manifests
itself in an actual contest. One public manager describes the process in a public health
agency:
Actually the assistant secretary was the one who initiated both of them. She is the sole
approval authority within [our agency], so she is the only one who can sign off on anybody
doing a prize challenge in [our agency]. The most recent Twitter challenge, the now trending
challenge hashtag ‘Health in My Community’, was proposed by our operational data analytics
team that’s called ‘The Fusion Cell.’ They had spoken with state, local, regional, tribal and
territorial representatives over the course of several months, investigating ways to utilize the
different data sources that were available via various social networking mechanisms.
Another public manager added: ‘I mean the impetus came from the White House to
our administrator to us. Our administrator was very supportive of us using
[Challenge.gov].’
Strategic alignment with the organizational mission
For most agencies, support for the organizational mission and alignment with their own innovation strategy are key drivers for using an OI instrument.
Mission alignment ranges from reputational gains, for example being seen as an
innovator, to creating awareness and educating the public about available public
programmes. One manager explains: ‘[Our agency] benefits just in general from
running competitions and getting our name out there and [being] associated with
things that are innovative and working with startups and early stage companies.’
Agencies did not have opportunities to fulfil this part of their strategic innovation
plans using the existing instruments. The Prizes and Contests policy allowed them to
reach out in innovative ways.
Connect to new communities of problem solvers
The traditional innovation acquisition instruments limit government’s access to problem solver communities that are not already pre-approved on acquisition schedules, or to solutions that cannot simply be provided through an existing long-term contract. Agencies stated that they were in a holding pattern: knowing that they had
to reach out to emergent communities of problem solvers to get access to potentially
non-standard ideas, they did not have an instrument that allowed them to interact
with these problem solvers. One public manager stated the need to improve the
current innovation practices:
I think there is just an overall embrace in general about looking at new ways of doing work
and having outside people help us with the work that we do.
OI approaches provide government with innovative ways to reach out to the aca-
demic community and especially student problem solvers who might have already
been working on similar projects. The other larger stakeholder group includes soft-
ware developers who were encouraged to collaborate with government to create
mobile applications using open data.
Inter-organizational drivers
The General Services Administration (GSA), the agency in charge of providing the
Challenge.gov platform, recognized early that the Presidential mandate was not
immediately implemented. GSA initiated roadshows, phone conferences, and webi-
nars to provide training and best practices that helped to initiate intra-organizational
conversations about prizes and challenges. In collaboration with OSTP, federal
agencies were trained to adopt contests, as one programme manager explained:
OSTP set up a center of excellence at [agency] for challenges and prizes and we worked with
[name] in particular for the Big Data ideation challenge. It is a government-wide opportunity
that I see OSTP has taken the lead in making that all of the agencies understand that it’s legal
to do competition and prizes, where probably before we would have felt we didn’t have the
right or the authority to do such a thing. They clarified the rulings, the legalities of all of that,
and then they gathered best practices. This was trying to educate the folks at the agencies
about competitions and prizes.
Existing technological platform Challenge.gov
In addition to providing procedural guidance, GSA was also in charge of setting up
the shared online platform Challenge.gov. A central approach to solving the technological problems for all participants encouraged agencies to use the platform, knowing that legal barriers regarding Paperwork Reduction Act review had already been prescreened and preapproved. One public manager explained that ‘we have resources now and connectivity that we could not have envisioned five years ago.’ The platform is made available at no cost to the agencies.
Organizational mimicry
Agencies that are adopting prizes and challenges more slowly than others reported that
they tend to mimic already existing behaviour from agencies that were able to jump
onto the bandwagon earlier:
Outside of [the interviewee’s agency], NASA has done some fairly innovative things, and
GSA. Other government agencies that I have seen as being innovative, or doing [open]
innovation, or using [open] innovation to solve their problems: the office of the National
Coordinator is very forward leading in that area.
Mimicry does not only flow from the outside in (mimicking NASA’s or GSA’s innovative behaviour); within a larger department, public managers also copy each other’s behaviour and learn from each other.
Extra-organizational and societal drivers
Extra-organizational or societal drivers include factors that occur outside government
as an institution. Public managers are picking up general technological or behavioural
trends that allow them to rethink their internal needs or existing mechanisms, which
might lack opportunities to initiate innovations, or even to reach out to those parts of
their stakeholders they usually cannot reach through the existing channels. With the
previous wave of e-government, using social media to reach out to citizens, public managers have already had positive experiences collaborating with citizens and feel comfortable structuring these outreach mechanisms through a central platform. A
public manager explained his agency’s positive experiences that drove the Challenge.
gov approach:
We ran probably the largest and one of the earliest sort of video contests, during the H1N1
flu issue. We ran a contest where we solicited videos from the public to talk about prevention
of H1N1. A significant amount take on that and participation of the public.
Another public manager explained the cultural shift he observed and how his
organization picked up the general trends:
This is a new construct: the whole idea of ideation and innovation as a practice or a cultural
norm is. There may be elements of it within our culture, we don’t formally recognize, or don’t
formally encourage, but we’re moving [into OI], since now that there is awareness, there’s the
move to formally embrace it, encourage it and ‘inculturate’it.
The more experience agencies gain with crowdsourcing mechanisms to engage large numbers of problem solvers, the more they recognize the shift among citizens who
want to be in contact with the government. One public manager explains:
We saw that there was a demand and a desire for students to engage with the State
Department, but we weren’t able to meet that demand through the virtual internship
program. We thought let’s look at other ways that we can have a lot of the college students
engaging with the State Department. Looking at Challenge.gov and other sites, we thought
hey, using a challenge type thing is a way to have even more students be involved, and aware
of the work that we’re doing.
Other agencies explained that the broad participation of problem solvers who had never been in touch with government showed them the slack capacity that exists among external problem solvers and indicated a broad societal acceptance of free or low-prized contributions by citizens to help solve government problems.
Finding 2: barriers for OI
Barriers for the adoption of online OI approaches include legal barriers, uncertainty
about the process and its outcomes, technological barriers to design crowdsourcing
processes, and most importantly cultural factors that prevent or delay the adoption
decisions. Agencies were instructed to find money in their existing budgets to
conduct prize contests. The monetary requirements include dedicated personnel,
such as department-level points of contact, and prize money to pay for the winning
solutions. In order to get to the point where agency representatives interviewed for
this research project launched contests, the institutional barriers had to be addressed.
Some of the barriers are rooted in the specific context of government organizations;
other barriers are inherently connected to the perceptions and tasks of individuals
involved in the OI process. Working through these barriers delayed the process for years, as one interviewee explains:
We had to figure out an internal mechanism for making [challenges] happen. That’s taken
over a year. But, I think it’s a year well spent. The America Competes Act supposedly
authorizes federal agencies to give prizes. But many agencies don’t feel confident in it that
actually does provide the right coverage for agencies. Our legal staff definitely did not feel that
it was appropriate.
Intra-organizational barriers
Intra-organizational barriers refer to factors that occur inside each organization and
can be solved by the organization itself. These factors include legal, cultural, techni-
cal, and institutional barriers.
Legal barriers
Public managers perceived the absence of a legal framework as risky. One manager
mentioned that prizes and contests ‘gave everything kind of an air of uncertainty’ and
that the America Competes Act only provides a vague umbrella framework that is
still open for interpretation. Agencies coped with the vagueness and absence of
existing rules by borrowing rule structures from agencies that had already imple-
mented prizes and contests and had set a precedent. The existing rules were then
adapted to the local contexts.
Examples of legal uncertainty focus on management and targeting of external
audiences. These processes are closely regulated through the existing ‘cookie’
policy (tracking of online user behaviour), measurement, and surveys. Especially when it comes to personally identifiable information collected by a government agency, the privacy provisions of Section 208 of the E-Government Act (2002) require agencies to conduct a privacy impact assessment. These provisions needed to be
adapted to allow new technologies and citizen interactions. However, agencies
that target specific audiences struggle with the implementation. One public man-
ager explains:
We had to figure out how kids could be involved. Could they submit something? Do they
have to get parental permission? At what age? The second part of that was, we had to
customize the entry form on Challenge.gov to ask questions about age to be able to support
that, because Challenge.gov, assumes that you are of a certain age.
The second aspect of legal considerations includes intellectual property (IP) rights.
While IP issues are arranged by contract in the traditional innovation acquisition
processes, these issues had to be re-evaluated for the use of voluntary submissions of
knowledge by citizens. In the private sector, several models of co-ownership of
submitted solutions or shared patents have evolved (see, for example, Belderbos et al. 2014). In the public sector, however, one public manager explained:
There is a debate going on right now about whether or not, and if someone wins a prize and
the government pays them however much money for their best technology solution, if that’s a
direct pathway for that person to be able to work with government. Can that be used for a
sole source justification, can we just get the technology to use as a license, or do we actually
have to release another contract and compete for the solution that the government actually
buys? There’s a lot of disagreement about what the government can do with the technology,
especially if we don’t write in the rules that there’s a government right to the intellectual
property. But often times, if it’s bigger market simulation prizes, people won’t participate if
the government is going to own their IP.
These legal barriers prevent government organizations from proactively seeking out more complex solutions, especially if they are not able to actually implement the results of
the contest.
Cultural factors
The organizational culture factors include several different aspects of the OI process:
(a) type of agency and political context, (b) acceptance of external innovations, and
(c) the lack of top-management support and buy-in.
Individual-level factors depend on the type of agency; for example, science and technology agencies are more likely to abandon the idea of inviting external non-professional problem solvers into their innovation process. The R&D-
oriented teams are trained (and hired) to develop answers in-house and might
feel that it is their core job to know it all. The result is oftentimes that solutions
from non-professional problem solvers are not accepted (Katz and Allen 1982).
One public manager in a science agency described the general rejection of the new
process:
We’re an R&D and technology development organization. People here believe that they are
the best engineers in the world, and so if they can’t solve it, nobody can.
Another public manager adds that it is difficult for government employees not to be able to fulfil the requirements of their job; this significant cultural change in the way innovations are acquired challenges employee attitudes:
It is a big shift, because in some ways it is admitting and acknowledging maybe that we the
government don’t have all the answers and that we can benefit from other people’s perspec-
tive. It is a cultural shift for people to think that way.
One public manager explained how his agency worked to explore the cultural
barriers. Instead of changing the mindsets of those people who were against the
implementation of prizes and contests, he says:
Actually, we didn’t really work around the culture. We basically found the people who were
leading the way, and promoted their data the most and encouraged others to follow the same
path.
Technological barriers
Public managers pointed to the fact that an innovative approach comes with initial
resistance due to the newness of the technology itself, and new ways of defining
and soliciting input for public sector problems: ‘So they look easy, but they are not simple.’ The standard way to communicate the need for solutions is directly tied to the existing RFP (request for proposals) process, using language that is industry standard or terms only known to professionals. However, problem statements in the OI process need to be written in plain language – not only closed ended enough to make them understandable to amateur problem solvers, but also open ended enough to allow for innovations the agency has not thought about
itself. OI provides a new infrastructure and reverses the highly structured contract
process:
The more fuzzier aspect, the fungible aspect of prize challenges that isn’t addressed in the
[department’s] guidance is what goes into developing a problem statement, and what goes
into developing a successful challenge. That’s far more challenging.
Previous research on OI processes in the private sector has shown that the degree of
openness in defining problems is crucial for the success of a contest. Salge et al. (2013) showed that too much openness does not support the process and that a guided approach is oftentimes better. The technology itself was not flexible enough to
support these steps:
One of the unexpected things is the Challenge.gov site itself is very rigid, in terms of what you
can put in there. I thought we were going to be able to have a system where people actually
entered their plans online. Instead, what we ended up with was a system where we have a
Microsoft Word document as a template. They complete their template separately, and then
they upload that. So that was a surprise to me, for the way the Challenge.gov worked. The
technical limitations, word limits, letter limits.
Uncertainty about innovation outcomes
Public managers confronted with the use of contests and challenges need to adapt to
a new organizing process that challenges their current standard operating structure,
such as contract management. In the standard acquisition process, expectations,
framing, contractual and legal statements are clearly designed and have been codified
as organizational knowledge in handbooks and operating manuals.
In addition to the concerns about the design elements of an OI process, public
managers are apprehensive about innovation outcomes. Public managers, trained to
explicitly define the deliverables for a contract or grant, now have to accept that OI
is designed to find answers for problems that do not have a predefined solution. In
combination with uncertainty about the boundaries and capabilities of the solver
community, public managers therefore expressed high degrees of uncertainty about
the innovation outcomes:
We didn’t know how to solve the problem. Not that we could have created an RFP, where we
were confident would have been the right way to solve the problem. We knew the problem
that needed to be solved was proliferating the use and availability of Blue Button Personal
Health Records. What we didn’t know was what the best way to go about doing that. We
wouldn’t have been able to say to industry: Here’s what we’re looking for: This particular way
to proliferate the availability of use.
Institutionalization barriers
After agencies overcame policy, process, and legal barriers, organizational responsi-
bilities needed to be redefined to institutionalize OI in the existing organizational
structure. One public manager explains how difficult it was for his agency to start to
incorporate contests into the standard operating procedures:
This is because the concept is not to set up an office of innovation, and not have a place that
does innovation, and do this with more of an [institutionalization], and making it part of the
[department’s] culture. There is no office, and there’s also no chief innovator, and there’s no
one who hashes it out. There is no concrete structure that people can go to and say oh good,
the Office of Innovation helped me with this.
This ad hoc process shows how agencies are struggling with the institutionalization and
the lack of organizational structure in early phases of the adoption process. Previous
research has shown that organizational structure will eventually follow strategies (or
policies in the government context) (Burgelman 1983). The initial lack of organizational
structure has led to dedicated organizational roles in other parts of the government
system. As an example, the City of Boston and the City of Philadelphia have institutio-
nalized an Office of Innovation and a Chief Innovation Officer role.
Similar to institutionalization of organizational roles, the organizational culture is
slowly changing from a closed innovation paradigm to an open paradigm where
experimentation is part of a trial and error process. One public manager explains that
while there is top management support, lower ranks might not be on board and
hinder institutionalization:
The top levels of leadership that would be the assistant secretary and the deputy assistant
secretary; there is support, and enthusiasm. The senior leadership below them and then some
of the leadership in the different management levels however getting them to embrace this
construct and this culture of innovation, which is a little different than some of them may be
used to. They, being innovative and doing innovative things implies a certain amount of
inherent risk, and they are by definition risk adverse. That is counter intuitive to the way their
structure operates.
As a result, public managers avoid experimentation. Government’s existing culture
does not allow what one interview partner called a safe environment ‘to take smart
risks’ and he called for ‘a place to take risks, a place where failing is ok. Don’t punish
people for failing if you told them you wanted them to take a risk. You can’t
encourage smart risk taking and then punish failure.’
Inter-organizational barriers
Hurdles that cross different government organizations occur only when two or more agencies have to actively collaborate, and interview partners reported barriers mostly in the initial set-up. Legal barriers extend beyond intra-organizational needs to start using OI and have to be resolved in a time-consuming manner. Collaborative
contests with shared promotion and outcomes include, for example, the First Lady’s collaboration with the Department of Agriculture to run the Apps for Healthy Kids challenge, or the Department of Health and Human Services’ collaboration with the Environmental Protection Agency on their My Air, My Health challenge. One
public manager explained the process:
Interagency bureaucratic challenges of doing an Interagency Memorandum of
Understanding, an MOU, have proven to be quite challenging. Each agency gets its own
funds; it is charged with investing in the programs and projects that will help it achieve its
mission. Everyone believes that if we could work better across agencies that we would be
better coordinated and we might more quickly overcome some of the challenges that we have
in terms of clean energy and wellbeing and so forth. But there are still a lot of bureaucratic
hurdles that are just there. I want to simplify the process so that any agency that is interested
in doing a Big Data challenge can. I could make the process as simple as possible for them to
engage, rather than it being a horrendous process. So every time we want to do a new contest,
we have to set up a whole new MOU framework.
These inter-organizational barriers highlight the need for directions from the Office
of Management & Budget as agencies gain more experience and recognize opportu-
nities for inter-organizational collaboration.
Extra-organizational and societal factors
Extra-organizational barriers are far less prevalent than extra-organizational drivers. Barriers in this context mostly focus on mindfulness of how changes in
internal processes and government experimentation with innovative policies
might be perceived by the public. In traditional procurement processes, account-
ability is built into the process with exact guidelines, reporting structures, and
accountability measures that are designed to keep a close watch on the appropriateness of managerial actions:
We as federal employees are good stewards of the taxpayers’money that we’re entrusted with
using; especially if we’re using it in a contract mechanism to procure goods and services from
outside the federal government. That being said, it’s not a clean line of sight. It’s not a straight
line between Point A and Point B as far as how we do [OI].
Furthermore, monetary incentives and the publicness of the selection and judging
process provide grounds for concern:
How we manage money is [regulated in the America Competes Act], specifically how we can
use appropriated funds as a cash prize in a contest. It also says we can work with other
agencies and departments to do this, and it says we can partner not just within the federal
government, but with nonfederal partners to do that. So that makes it more clear cut for us. It
helps make the line a little straighter between Point A and Point B. Competes actually made
challenges easier to use because it more clearly defined the rules for us as federal employees
using you know, taxpayer money.
Discussion and conclusions
This article provides empirical evidence for factors driving and hindering government organizations in adopting and implementing a new policy instrument, called Prizes and Challenges, in the US federal government. To gain a deeper understanding of how internal strategic and managerial decision-making processes are influenced by these factors and ultimately lead to the outcomes observable on Challenge.gov, this article provides a framework on three dimensions: intra-, inter-, and extra-organizational factors. The data include agencies involved in this form of innovation acquisition and add to our understanding of how OI online practices diffuse in the public
sector.
OI was introduced as a new umbrella concept for inviting external, non-profes-
sional problem solvers to help government find solutions for problems they weren’t
able to solve internally or with the help of the standard innovation instruments, grants and contracts. The current OI literature focusses on private sector organizations (for
a review, see West and Bogers 2014). Based on one embedded case study – the use of the online platform Challenge.gov – and a policy instrument, ‘Prizes and Contests’, of the US federal government, the barriers and drivers for the implementation and use
of the platform are analysed. This article shows the complexity of issues that public sector organizations, as opposed to private sector organizations, have to deal with when they are confronted with a political mandate to innovate. In addition, OI approaches do not only need a crowdsourcing platform; public managers also need to be empowered to take risks and step outside the risk-free request-for-proposal process of traditional acquisition routines.
Barriers occur on the inter- and extra-organizational levels, making it increasingly complex for government organizations to adopt a new technology. The summary overview in Table 1 shows the need to study the resulting implementation strategies of those agencies that did not reject the adoption of OI. The drivers provide insights about the changing needs of government organizations for a pathway to access innovative ideas from outside of government. Existing innovation acquisition instruments, as part of the closed innovation paradigm, limit government organizations’ ability to involve those problem solvers who are not part of the formal acquisition process. Open advertising possibilities and collection of the crowdsourced innovations allow gov-
ernment to use e-governance mechanisms that were not previously available.
This article also adds to the literature on co-production and co-design of public
services (Joshi and Moore 2004; Osborne, Radnor, and Strokosch 2016; Bovaird 2007). While previous research on co-production has mostly focussed on inclusion
of citizens in the form of public participation processes to gain their support, trust, and insights in structured decision-making processes, OI opens a new conduit to include external knowledge into the problem-solving processes. OI is, however, significantly different from previous waves of co-production and leads to innovative ideas and
solutions by providing monetary incentives and using online platforms to broaden
the inclusion of ideas and not only local participation.
In addition to the existing OI literature, previous research on e-government has
mostly focussed on internal barriers or drivers. This article shows that external barriers and drivers are outside the control of a single agency yet equally important in adoption and implementation decisions for an e-government innovation.
The findings highlight the opportunity to review what the innovation outcomes
are and what their potential impact is on the innovation process itself and the
broader innovation culture in government. Most of the current e-government and
OI research focusses on the diffusion and adoption decision; however, there is no
research that looks at short-term innovation products and long-term changes in the
organization. It is important to study the outcomes of the OI process beyond the
number of solutions submitted by citizens during the crowdsourcing process,
which can only indicate interest in a given topic, but says little about the value of
the innovation. Indirect outcomes, such as legitimization of an innovative acquisition
process, public value creation, or the changing climate towards external innovations
in government, can be studied to understand the extent of the impact on government.
Limitations
This research is clearly limited by its focus on a specific layer of the US federal
government’s executive branch situated in a specific political climate: the Obama administration’s push for an agile government with the use of new technol-
ogies. As one of the first studies on OI in the public sector, the framework developed
here can help researchers design their own OI inquiries or conduct comparative public
administration research. The inclusion of a variety of government agencies with diverse
sets of missions and mandates contributes to our understanding of how government
agencies implement OI given their own political context, differences in top-manage-
ment sentiments towards innovation, and diverse organizational cultures.
The interview partners selected for this study do not only represent successful cases. The nature of the study helps to understand what makes OI approaches unsuccessful for some government actors, as opposed to others who, given the alignment of the OI policy instrument and e-government platform with their organizational mission, implemented them very successfully.
In conclusion, OI – even several years after it was officially introduced through a Presidential mandate in the US federal government – is still in its early stages. Each
agency has to work through not only legal barriers, but also cultural internal barriers.
Table 1. Summary of findings: drivers and barriers of open innovation in the public sector.

Intra-organizational factors
Drivers:
Pre-policy time frame:
● Internal experimentation
Post-policy time frame:
● Top-down management decision
● Strategic alignment with mission
● Necessity to change existing innovation acquisition framework
● Recognizing unsolvable internal public management problems
● Opportunity to connect with new community of problem solvers
Barriers:
Type of agency:
● Adoption speed
● Regulatory barriers (extent to which agencies are allowed direct citizen interaction)
Legal aspects:
● Properties of submitted solutions
● Collection of PII through contests
● Time lag from decision to implementation: (a) working through organizational barriers: 2 years; (b) simple → complex public management problems
Cultural aspects:
● NIH syndrome
● Inertia: waiting out policy waves, leadership changes (‘trickle down’)
Procedural aspects:
● Understanding crowdsourcing processes
● Defining public management problems
● Designing contest phases
● Recruiting judges
● Monetary/non-monetary contests

Inter-organizational factors
Drivers:
● GSA & OSTP guidance, best practices, innovators toolkit, roadshows, webinars
● Organizational mimicry of existing practices
● Existing technological platform to promote challenges and collect ideas
Barriers:
● Legal: MOUs

Extra-organizational factors
Drivers:
● Introduction of new policy instrument through Presidential mandate; national priorities (e.g. economic development)
● Digital Government Strategy
● Societal acceptance of crowdsourcing processes
● Recruiting of ‘unreachables’: weak ties to otherwise disconnected citizens
Barriers:
● Expectations of professional problem solvers vs. general citizenry
● Power distance/cultural distance: we vs. them mentality

NIH: Not invented here; PII: personally identifiable information.
Only in cases where there is clear alignment with the organizational mission and openness in the organizational culture do agencies attempt to adopt (oftentimes low-risk) prizes and contests. While some agencies have waited for a Presidential mandate like this one to allow them to innovate in an open format, for other agencies the mandate is clearly a burden, and they reject a sustainable implementation of OI.
Disclosure statement
No potential conflict of interest was reported by the author.
Notes on contributor
Ines Mergel is full professor of public administration at the Department of Politics and Public
Administration at the University of Konstanz. Her research focusses on digital transformation,
innovative use of new technologies, and networked governance. She currently serves as the
Associate Editor of Government Information Quarterly.
ORCID
Ines Mergel http://orcid.org/0000-0003-0285-4758
References
Arundel, A., L. Casali, and H. Hollanders. 2015. “How European Public Sector Agencies Innovate: The Use of Bottom-Up, Policy-Dependent and Knowledge-Scanning Innovation Methods.” Research Policy 44 (7): 1271–1282. doi:10.1016/j.respol.2015.04.007.
Belderbos, R., B. Cassiman, D. Faems, B. Leten, and B. Van Looy. 2014. “Co-Ownership of Intellectual Property: Exploring the Value-Appropriation and Value-Creation Implications of Co-Patenting with Different Partners.” Research Policy 43 (5): 841–852. doi:10.1016/j.respol.2013.08.013.
Benkler, Y. 2006. The Wealth of Networks: How Social Production Transforms Markets and Freedom. New Haven, CT: Yale University Press.
Bovaird, T. 2007. “Beyond Engagement and Participation: User and Community Coproduction of Public Services.” Public Administration Review 67 (5): 846–860. doi:10.1111/j.1540-6210.2007.00773.x.
Burgelman, R. A. 1983. “A Model of the Interaction of Strategic Behavior, Corporate Context, and the Concept of Strategy.” Academy of Management Review 8 (1): 61–70.
Chesbrough, H. 2003. Open Innovation: The New Imperative for Creating and Profiting from Technology. Cambridge, MA: Harvard Business Press.
Chesbrough, H., and A. K. Crowther. 2006. “Beyond High Tech: Early Adopters of Open Innovation in Other Industries.” R&D Management 36 (3): 229–236. doi:10.1111/j.1467-9310.2006.00428.x.
Coglianese, C. 2004. “E-Rulemaking: Information Technology and the Regulatory Process.” Administrative Law Review 56 (2): 353–402.
Colatat, P. 2015. “An Organizational Perspective to Funding Science: Collaborator Novelty at DARPA.” Research Policy 44 (4): 874–887. doi:10.1016/j.respol.2015.01.005.
Congress, 110th. 2007. “America COMPETES Act.” In H.R. 2272, edited by 110th Congress. Washington, DC: Library of Congress.
Dahlander, L., and H. Piezunka. 2014. “Open to Suggestions: How Organizations Elicit Suggestions through Proactive and Reactive Attention.” Research Policy 43 (5): 812–827. doi:10.1016/j.respol.2013.06.006.
de Sousa Santos, B. 1998. “Participatory Budgeting in Porto Alegre: Toward a Redistributive Democracy.” Politics & Society 26 (4): 461–510. doi:10.1177/0032329298026004003.
DeHart-Davis, L., J. Chen, and T. D. Little. 2013. “Written versus Unwritten Rules: The Role of Rule Formalization in Green Tape.” International Public Management Journal 16 (3): 331–356. doi:10.1080/10967494.2013.825193.
Denzin, N. K., and Y. S. Lincoln. 2011. “Introduction: The Discipline and Practice of Qualitative Research.” In The Sage Handbook of Qualitative Research, edited by N. K. Denzin and Y. S. Lincoln, 1–19. Thousand Oaks, CA: Sage.
Du, J., B. Leten, and W. Vanhaverbeke. 2014. “Managing Open Innovation Projects with Science-Based and Market-Based Partners.” Research Policy 43 (5): 828–840. doi:10.1016/j.respol.2013.12.008.
E-Government Act of 2002. Public Law 107–347, December 17, 2002. Washington, DC: 107th U.S. Congress.
Ettlie, J. E., and J. M. Elsenbach. 2007. “The Changing Role of R&D Gatekeepers.” Research-Technology Management 50 (5): 59–66.
Felin, T., and T. R. Zenger. 2014. “Closed or Open Innovation? Problem Solving and the Governance Choice.” Research Policy 43 (5): 914–925. doi:10.1016/j.respol.2013.09.006.
Fu, X. 2012. “How Does Openness Affect the Importance of Incentives for Innovation?” Research Policy 41 (3): 512–523. doi:10.1016/j.respol.2011.12.011.
Gambardella, A., and C. Panico. 2014. “On the Management of Open Innovation.” Research Policy 43 (5): 903–913. doi:10.1016/j.respol.2013.12.002.
Howe, J. P. 2006. “The Rise of Crowdsourcing.” Wired Magazine Online 14 (6). https://www.wired.com/2006/06/crowds/
Jeppesen, L. B., and K. R. Lakhani. 2010. “Marginality and Problem-Solving Effectiveness in Broadcast Search.” Organization Science 21 (5): 1016–1033. doi:10.1287/orsc.1090.0491.
Joshi, A., and M. H. Moore. 2004. “Institutionalised Co-Production: Unorthodox Public Service Delivery in Challenging Environments.” Journal of Development Studies 40 (4): 31–49. doi:10.1080/00220380410001673184.
Katz, R., and T. J. Allen. 1982. “Investigating the Not Invented Here (NIH) Syndrome: A Look at the Performance, Tenure, and Communication Patterns of 50 R&D Project Groups.” R&D Management 12 (1): 7–20. doi:10.1111/radm.1982.12.issue-1.
Lakhani, K. R., and L. B. Jeppesen. 2007. “Getting Unusual Suspects to Solve R&D Puzzles.” Harvard Business Review 85 (5).
Lee, S. M., T. Hwang, and D. Choi. 2012. “Open Innovation in the Public Sector of Leading Countries.” Management Decision 50 (1): 147–162. doi:10.1108/00251741211194921.
Meijer, A. 2015. “E-Governance Innovation: Barriers and Strategies.” Government Information Quarterly 32 (2): 198–206. doi:10.1016/j.giq.2015.01.001.
Mergel, I. 2015. “Opening Government: Designing Open Innovation Processes to Collaborate with External Problem Solvers.” Social Science Computer Review 33 (5): 599–612.
Mergel, I., and K. Desouza. 2013. “Implementing Open Innovation in the Public Sector: The Case of Challenge.gov.” Public Administration Review 73 (6): 882–890. doi:10.1111/puar.2013.73.issue-6.
Mergel, I., S. I. Bretschneider, C. Louis, and J. Smith. 2014. “The Challenges of Challenge.gov: Adopting Private Sector Business Innovations in the Federal Government.” In Proceedings of the 2014 47th Hawaii International Conference on System Sciences (HICSS), 2073–2082. Washington, DC: IEEE.
Miles, M. B., A. M. Huberman, and J. Saldana. 2014. Qualitative Data Analysis: A Methods Sourcebook. 3rd ed. Thousand Oaks, CA: Sage.
Murray, F., S. Stern, G. Campbell, and A. MacCormack. 2012. “Grand Innovation Prizes: A Theoretical, Normative, and Empirical Evaluation.” Research Policy 41 (10): 1779–1792. doi:10.1016/j.respol.2012.06.013.
NASA. n.d. “Asteroid Initiative: NASA’s Asteroid Redirect Mission and Grand Challenge.” http://www.nasa.gov/mission_pages/asteroids/initiative/-.U-T1bPldXTo
Osborne, S. P., Z. Radnor, and K. Strokosch. 2016. “Co-Production and the Co-Creation of Value in Public Services: A Suitable Case for Treatment?” Public Management Review 18 (5): 639–653. doi:10.1080/14719037.2015.1111927.
Piezunka, H., and L. Dahlander. 2015. “Distant Search, Narrow Attention: How Crowding Alters Organizations’ Filtering of Suggestions in Crowdsourcing.” Academy of Management Journal 58 (3): 856–880.
Rainey, H. G., S. Pandey, and B. Bozeman. 1995. “Research Note: Public and Private Managers’ Perceptions of Red Tape.” Public Administration Review 55 (6): 567–574. doi:10.2307/3110348.
Roper, S., P. Vahter, and J. H. Love. 2013. “Externalities of Openness in Innovation.” Research Policy 42: 1544–1554. doi:10.1016/j.respol.2013.05.006.
Salge, T. O., T. Farchi, M. I. Barrett, and S. Dopson. 2013. “When Does Search Openness Really Matter? A Contingency Study of Health-Care Innovation Projects.” Journal of Product Innovation Management 30 (4): 659–676. doi:10.1111/jpim.12015.
Shirky, C. 2008. Here Comes Everybody: The Power of Organizing without Organizations. New York: Penguin.
Surowiecki, J. 2004. The Wisdom of Crowds. New York, NY: Knopf Doubleday.
The White House. 2009. The Open Government Initiative. Edited by Executive Office of the President. Washington, DC: White House.
The White House. 2010. Guidance on the Use of Challenges and Prizes to Promote Open Government. Edited by Office of Management and Budget. Washington, DC: Executive Office of the President.
Villarroel, J. A. 2013. “Strategic Crowdsourcing: The Emergence of Online Distributed Innovation.” In Leading Open Innovation, edited by A. S. Huff, K. M. Moeslein, and R. Reichwald, 171–200. Cambridge, MA: The MIT Press.
West, J., and K. R. Lakhani. 2008. “Getting Clear about Communities in Open Innovation.” Industry & Innovation 15: 223–231. doi:10.1080/13662710802033734.
West, J., and M. Bogers. 2014. “Leveraging External Sources of Innovation: A Review of Research on Open Innovation.” Journal of Product Innovation Management 31 (4): 814–831. doi:10.1111/jpim.2014.31.issue-4.