*Please cite this paper as: McBride, K., Aavik, G., Kalvet, T., and Krimmer, R. (2018). "Co-creating an Open Government Data Driven Public Service: The Case of Chicago's Food Inspection Forecasting Model". 2018 51st Hawaii International Conference on System Sciences (HICSS). IEEE, Forthcoming.
Co-creating an Open Government Data Driven Public Service:
The Case of Chicago’s Food Inspection Forecasting Model
Keegan McBride, Gerli Aavik, Tarmo Kalvet, Robert Krimmer
Tallinn University of Technology, Tallinn, Estonia
{Keegan.McBride | Gerli.Aavik | Tarmo.Kalvet | Robert.Krimmer@ttu.ee}
Abstract
Large amounts of Open Government Data (OGD)
have become available and co-created public services
have started to emerge, but there is only limited
empirical material available on co-created OGD-
driven public services. To address this shortcoming
and explore the concept of co-created OGD-driven
public services the authors conducted an exploratory
case study. The case study explored Chicago’s use of
OGD in the co-creation of a predictive analytics
model that forecasts critical safety violations at food
serving establishments. The results of this exploratory work yielded new insights into co-created OGD-driven public services and led to the identification of six factors that appear to play a key role in allowing an OGD-driven public service to be co-created. The results of this initial work also provide valuable new information that can be used to aid in the development and improvement of the authors' conceptual model for understanding co-created OGD-driven public services.
1. Introduction
In current e-Government literature, there are two
topics that have been receiving increased interest and
focus: open government data (OGD) and co-creation
[1]. Increasing evidence is appearing on OGD’s
benefits and potential [2] as well as the barriers
preventing its usage [3], [4]. The second topic, co-
creation, emerges from the concept of coproduction,
brought into the spotlight by Elinor Ostrom in 1972
[5]. A previous paper has linked these two topics and
discussed the idea of a “co-created OGD-driven public
service” [6]. This concept emerges from a new
understanding of what a public service is, “public
services are any services which are offered to the
general public with the purpose of developing public
value, regardless of the role that the public sector plays
in the process” [6], [7].
The co-created OGD-driven public service has two main components. Firstly, when talking about the co-creation of new public services, co-creation may be understood as the involvement of outside, non-typical stakeholders in the initiation, design, implementation, and evaluation of the public service [6]. There is a difference between coproduction and co-creation; this was highlighted in a recent work which stated that "all public services are coproduced, but not all public services are co-created" [8]. In co-created OGD-driven public services, the process through which co-creation takes place must be understood; for this purpose, the framework put forth by [6] is used to provide an initial understanding.
Another interesting concept that allows a bridge to
be built between the concepts of OGD and co-creation
is the notion of Government as a Platform (GaaP).
GaaP as a means for understanding the relationship
between OGD and co-creation was brought forth by
Linders in [9]. The core idea behind GaaP is that governments hold and generate large amounts of data, that disseminating this data is becoming less difficult, and that this data can aid and drive the creation of new and innovative activities [9]. In the GaaP model, the government provides OGD, and this data may be used or exploited by any actor or stakeholder to create public value. This use and exploitation of the data may be understood as co-creation: the government provides the data and, if the resulting applications produce public value, a new public service has been driven by OGD and co-created.
There has been a clear increase in interest in the
topics of co-creation and OGD, and some authors have
worked on further conceptualizing the relationship
between the two ideas and have provided some understanding of how co-created OGD-driven public services may come into being [10]. However, currently there is limited empirical work that examines co-created OGD-driven public services in the real world. There are two reasons for this: first, it is a relatively new concept, and second, there is a general lack of real-world examples of co-created OGD-driven public services. This is an interesting research gap, and it was explored further by conducting an analysis of data-analytics and OGD programs; an empirical example has the potential to strengthen the current understanding of co-created OGD-driven public services. Through this analysis and exploration, an interesting example emerged. The service involved multiple stakeholders
(city governmental agencies, private sector
companies, NGOs, and citizens), was developed using
open source code and is still freely available, heavily
utilized OGD, and convincingly produced public
value. Additionally, previous work has been done on
OGD in the selected city that demonstrated the
effectiveness of the OGD portal there [11]. This
combination of factors suggested that the service could be characterized as a co-created OGD-driven public service, and it was thus selected for further analysis.
This paper aims to explore Chicago's use of OGD in a new predictive analytics model that allows Chicago's Department of Public Health (CDoPH) to forecast critical safety violations at food serving establishments. Through this exploration, new insight has been gained that can later be used to further develop the understanding of co-created OGD-driven public services. The importance and relevance of this
case was summed up in a succinct manner by Tom
Schenk, the Chief Data Officer of Chicago, in a report
he authored on the service: “collaboration was a key
component of this project… and each variable used
in the model was available on Chicago’s open data
portal” [12]. Later in the report it was stated that “the
portal was an effective tool to allow collaborative
research”, and that “this project was able to leverage
Chicago’s key data assets: its large volume of data, the
transparency and size of its open data portal, and its
ability and willingness to conduct research to improve
city services, introduce savings, and increase
engagement with Chicago-area businesses” [12]. The
stakeholders involved in this collaborative effort were
Chicago’s Department of Innovation and Technology
(CDoIT), members of Allstate Insurance’s Data
Science Team, CDoPH, the Civic Consulting Alliance
(CCA), and, finally, citizens, who have also played a role in shaping the new public service.
In order to better understand the process of co-creating OGD-driven public services, an exploratory case study was conducted. This paper presents the case, reflects on the process, and discusses how the findings from the case extend and support the current understanding of co-created OGD-driven public services. The paper is structured as follows. Section 2 provides a brief overview of the methodology used to conduct the case study; this is followed by a presentation of the case in section 3. Section 4 discusses the findings that have emerged from the case. During the discussion, initial propositions are also put forth to reflect back on the current theory and our understanding of co-created OGD-driven public services. Finally, in section 5, the paper is concluded and avenues for future research are put forth.
2. Methodology and Conceptual Model
In the previous section, the case was briefly
introduced. It was stated that the model utilizes
multiple sources of OGD, and that collaboration
between many different stakeholders was key for this
model to be completed and implemented. It has also
been said that the OGD portal is what allowed these
different stakeholders to come together and exploit
OGD to co-create this new OGD-driven public
service. For these reasons, the Chicago food predictive
analytics model was selected as the case for this paper.
This holistic exploratory case study [13] aims to explore the process through which the co-created OGD-driven public service moved from ideation through development and into its current stage. Though this may be defined as a critical and unique case, it is still only a single case and therefore offers limited generalizability. However, it should still allow an initial study to be conducted that provides insight into the inner workings of a co-created OGD-driven public service.
For the initial understanding of OGD-driven public service co-creation, the framework presented by [6] will be used. The aim of this paper is to explore a co-created OGD-driven public service and gain new insight, but the model is presented as it provides a starting point for looking at co-created OGD-driven public services. Observing the process of the case at hand allows new insights to be gained regarding the factors that influence the co-creation of OGD-driven public services, potentially provides new insights into the conceptual understanding of co-created OGD-driven public services, and sheds light on the different roles stakeholders played in this process. This new insight may then be used in future development and improvement of the model. In order to gain initial insight into the case, newspaper articles, source code, and a report on the model's GitHub page were consulted. With an initial foundational understanding in place, semi-structured interviews were conducted to delve into the case and better understand the dynamics at play.
Six semi-structured interviews were conducted with stakeholders representing different parties: one person was interviewed from each of CDoIT, the CCA, Allstate, and Montgomery County, while two members of the CDoPH were interviewed. These interviews were conducted during April and May 2017 over the phone or through Skype and lasted between 15 and 40 minutes each; all interviews were recorded and then transcribed. The first interview conducted was with Tom Schenk, the main person behind the case, and then, using snowball sampling, other interviewees were selected. The interviewee from the Montgomery County
Department of Innovation was selected due to the
county’s relationship with the project (Montgomery
County implemented Chicago’s code with the help of
a private sector partner), though they were not directly
involved in the initial model development.
The interview questions aimed to provide a better understanding of the interviewee's role in the project, how they got involved, how the process unfolded, and what did and did not go well; at the end, participants were asked to add any comments that were not discussed during the interview. The responses from the interviewees are presented and discussed in section 4; commonly mentioned themes and facts are further used to draft initial propositions on what seems to influence the success of a co-created OGD-driven public service, as well as what factors seem to be needed to allow OGD-driven public service co-creation to take place.
2.1 Conceptual Model
A recent paper [6] has proposed that public services can be created through an innovation process based on the ideas of co-production and agile development. The model, shown in Figure 1, argues that in an environment where OGD and tools for data analytics, exploitation, and co-production are made widely available, any actor can take the lead and initiate or co-create data-driven services that create public value. To do so, it is important to focus on the service user, be agile, develop quickly, listen to the service user, and be able to adapt quickly to changing needs. This service innovation process can be summarized through four points:
1. The government and citizens should be
partners at all stages from ideation to creation
to implementation of the new data-driven
public service.
2. There should be an initial release of the public
service at an early stage, or an ‘MVP’ of the
public service, which allows the cycle to be
started as quickly as possible.
3. The public service should be able to respond to
user feedback from the initial launch.
4. User input should be sought and utilized at all
stages of the public service creation.
The model argues that the traditional government-driven, top-down, waterfall-like method of public service production no longer meets the needs and expectations of citizens, and that new collaborative and data-driven approaches are needed. The model follows a four-phase cycle of open government data driven co-initiation, co-design, co-implementation, and co-evaluation.
Figure 1: Conceptual Model [6]
3. The Case
This section presents the case of the Chicago predictive food analytics model. It starts in section 3.1 by presenting the relevant contextual information surrounding the case. Following this, the process of developing and implementing the model is discussed in two stages, the initial iteration (3.2) and the second iteration (3.3); these sections also cover the role of stakeholders, the development process, and the role of OGD and co-creation. The final aspect of the case, presented in 3.4, is the impact of the new co-created OGD-driven public service and potential directions for the future.
3.1. Context
The context surrounding the case must be
presented so that the case may be better understood.
When looking at the relevant contextual factors for this case, there seem to be four main variables: access to a functioning OGD portal, previous experience in the realm of predictive analytics, a grant received from the Bloomberg Philanthropies Mayors' Challenge, and a law requiring the inspection of establishments that serve food. These factors form the core contextual foundation for the case, and their importance is presented in the following paragraphs.
Chicago’s OGD portal was initially developed in
2010, but in 2012 its importance was reinforced by an
order issued by Mayor Rahm Emanuel. This order
stated that Chicago must establish and maintain an
OGD portal, and that every city agency must “make
available online… at a level of granularity acceptable
to DOIT (Department of Innovation and Technology),
all appropriate datasets and associated metadata under
such agency’s control” [14]. When discussing the
motivation for this executive order, the Mayor
explained that OGD could be used to "create applications that will improve service delivery and lead to greater quality of service for residents and more public engagement in City government". The OGD portal has since grown rapidly; it currently provides access to over 550 datasets and to applications built by the city and private developers, offers tutorials on how the available data may be exploited or analyzed, provides tools that allow for easy visualization of data, and has been visited over 38 million times [15]. The portal is run and maintained by the CDoIT.
In the introduction it was discussed how the
concept of GaaP allows us to understand the
relationship between OGD and co-creation.
Interestingly, this was also pointed out by Brett
Goldstein, former Chief Data Officer of Chicago,
who stated that the idea of GaaP is a core part of
the success of the Chicago OGD portal as they are able
to “be the platform… and support the innovative ideas
cultivated by various communities” [16].
Bloomberg Philanthropies organized a
competition that would “inspire American cities to
generate innovative ideas that solve major challenges
and improve city life – and that ultimately can be
shared with other cities to improve the wellbeing of
the nation” [17]. The City of Chicago entered this
competition and was awarded a grant for one million
USD to develop a new “SmartData” platform that
would allow government agencies easier access to
predictive analytics tools; one condition of this grant
was that all software developed would be open source
[18]. Specifically, Chicago was selected to “create an
open-source platform to harness the power of data to
understand underlying trends and better direct limited
resources” [17]. This grant provided the CDoIT
funding to begin to undertake more ambitious OGD-
driven predictive analytics models.
One of the initial models that emerged from the CDoIT was a model that could be used to predict when and where outbreaks of rodents would occur so that these outbreaks could be stopped pre-emptively [19]. This model was developed in cooperation with Carnegie Mellon University's Event and Pattern Detection Laboratory and was then put into production by Chicago's Department of Streets and Sanitation [19]. The model was well known throughout the city's government agencies and was cited by some of the interviewees as one reason they were willing to participate in and allow Chicago's predictive food analytics model to be developed.
The final contextual factor to present is the legal requirement to inspect establishments that serve food. The CDoPH's Food Protection Division is required to perform inspections of establishments that serve food; this authority comes from the City of Chicago's Food Service Sanitation Municipal Code and the Rules and Regulations promulgated by the Chicago Board of Health [20]. At the time of writing this article there were around 16,000 food establishments in the City of Chicago (there were over 15,000 when the predictive food analytics model was initially developed) [20]. These establishments have different requirements, but, generally, food establishments within the city must be inspected twice a year to make sure that they are in compliance with the aforementioned regulations on food safety. When inspections are carried out, one of the most important findings is whether a critical violation is taking place. Critical violations are those that have a high chance of starting or spreading foodborne illnesses; the presence of a critical violation leads to a failure, and the violation must then be fixed and the establishment re-inspected
and reapproved by the CDoPH [12]. The results of
these food inspections are also freely available on the
Chicago OGD portal.
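To illustrate how openly this inspection data is exposed, the following minimal Python sketch (not part of the project's own R code base) pulls a handful of records from the portal's public API. The dataset identifier and the field names shown are assumptions made for illustration, assuming the portal's Socrata SODA interface.

```python
import requests

# Assumed identifier for the public "Food Inspections" dataset on
# data.cityofchicago.org, exposed through a Socrata-style SODA endpoint.
FOOD_INSPECTIONS_URL = "https://data.cityofchicago.org/resource/4ijn-s7e5.json"

# Request a small sample of records; "$limit" is the SODA paging parameter.
response = requests.get(FOOD_INSPECTIONS_URL, params={"$limit": 5})
response.raise_for_status()

for record in response.json():
    # Field names are illustrative of what the dataset exposes and may differ.
    print(record.get("dba_name"), record.get("results"), record.get("inspection_date"))
```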
3.2. Initial Development
The City of Chicago and the CDoIT wanted to
continue to expand their use of OGD and predictive
analytics, thus increasing the efficiency of some agencies' day-to-day operations and providing increased public value. In order to do this, an initial list of
potential use cases where OGD-driven predictive
analytics capabilities could be used was drafted in
2014. Though the City of Chicago was interested in
OGD and predictive analytics, the CDoIT still lacked
a full data science team, thus outside help was needed.
In order to find this outside help, the CDoIT reached
out to a local organization, the CCA. The CCA is an
organization that aims to improve the quality of life in
Chicago by bringing together stakeholders from
public, private, and non-profit sectors to work on new
and innovative solutions for problems facing the city;
the CCA roughly provides “fifteen million USD in pro
bono services every year” [21]. The CCA had relations
with the data science team at Allstate Insurance and
approached them with the list of potential use cases
from the City of Chicago. The members of the Allstate
Data Science Team had experience with the Chicago
OGD portal and knew that there were large amounts of
data on food inspections within the city. The members
also had a direct interest in the topic of food safety as
they lived in the City of Chicago and thought it would
be interesting to try to improve the food safety of the
food serving establishments within the City. Thus,
they got back to the CCA and said that they would be
willing to work on developing a predictive analytics model for the CDoPH's food inspections.
The policy that allowed members of Allstate's Data Science Team to participate in this pro bono project is quite interesting; the company's "bluelight" policy allowed employees to spend up to 10% of their working time on pro bono data science projects [22]. The logic behind this policy is that working on non-typical or new projects will boost employees' skill sets and expose them to new tools and technologies, ultimately benefiting Allstate, the employees, and the partner(s) receiving their assistance.
In 2014, when the initial development began, the City of Chicago had over 15,000 food establishments and 36 food inspectors to inspect them. These establishments generally needed to be inspected twice a year (some less often, some more), which translates to roughly one inspector for every 470 food establishments. Due to the high workload and lack of optimization, many critical food violations were going unnoticed or were being detected too late to stop or prevent outbreaks from starting or spreading [12]. Though there was a logic to how food inspectors were assigned, the CDoIT believed that this process could be improved and made more effective through the adoption of an OGD-driven predictive analytics model. Though originally hesitant, the head of the CDoPH was willing to test a newly developed model as she had heard about the success of a previous model, the one mentioned in section 3.1.
To begin, the CCA organized meetings between the relevant parties (Allstate, the CDoIT, and the CDoPH) and acted as project manager. At these meetings, the business requirements of the CDoPH were discussed and presented to the developers and data scientists. Allstate's team ended up using multiple variables from Chicago's OGD portal and constructed a General Linear Model that would allow the highest-risk food establishments to be inspected first. In essence, the model works by predicting which food establishments are the most likely to have a critical food violation and then assigns these establishments to be inspected first; previously these assignments had been made following a business- and risk-based approach, but the process still seemed somewhat random and inefficient. However, due to a misunderstanding of one variable, the first iteration of the model ended up being incorrect and needed to be adjusted.
This failure ended up being a major learning point for all involved stakeholders and emphasized the importance of communicating early on as well as continuously throughout development.
3.3. Second Iteration
Though the first implementation was not
successful, it was improved upon and the
misunderstanding was addressed by the CDoIT and
CDoPH. This second attempt at the predictive
analytics model is open source and the code is freely
available on GitHub.
The model was tested over a two-month period (September and October 2014); during this time, inspection assignments were given out following normal operations, but the model was run in parallel to see how its results would compare. After the testing had been completed and validated, the model was made operational by February 2015. Many different predictive features were tested, but currently the following nine predictors are utilized by the model:
1. “Establishments that had previous critical or
serious violations.
2. Three-day average high temperature.
3. Nearby garbage and sanitation complaints.
4. The type of facility being inspected.
5. Nearby burglaries.
6. Whether the establishment has a tobacco
license or has an incidental alcohol
consumption license.
7. Length of time since last inspection.
8. The length of time the establishment has been
operating.
9. And the assigned Inspector.” [12].
The new code also tested different predictive classification models, such as random forest, in an attempt to get better results.
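To make the shape of this classification task concrete, the sketch below shows, in Python rather than the project's R code, how a classifier of this kind might be fit on features analogous to the nine predictors listed above. The column names and the data are invented for illustration; they are not the City's actual features, code, or results.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

# Hypothetical training table: one row per past inspection, with a binary
# target indicating whether a critical violation was found.
inspections = pd.DataFrame({
    "past_critical": [1, 0, 0, 1, 0, 1],
    "avg_high_temp_3d": [78.0, 65.0, 71.0, 90.0, 55.0, 82.0],
    "nearby_sanitation_complaints": [4, 0, 1, 6, 0, 3],
    "facility_type": ["restaurant", "grocery", "restaurant",
                      "restaurant", "school", "grocery"],
    "days_since_last_inspection": [400, 180, 210, 520, 150, 365],
    "critical_violation_found": [1, 0, 0, 1, 0, 1],  # target
})

features = inspections.drop(columns=["critical_violation_found"])
target = inspections["critical_violation_found"]

# One-hot encode the categorical column, pass numeric columns through,
# and fit a random-forest classifier on the result.
model = Pipeline([
    ("encode", ColumnTransformer(
        [("cat", OneHotEncoder(handle_unknown="ignore"), ["facility_type"])],
        remainder="passthrough")),
    ("clf", RandomForestClassifier(n_estimators=200, random_state=0)),
])
model.fit(features, target)

# Risk scores for the same rows; in practice these would be new, upcoming inspections.
risk = model.predict_proba(features)[:, 1]
print(risk)
```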
The predictive food analytics model uses the
aforementioned predictors to classify which food
establishments are the most likely to have a critical
food violation. The individual in charge of assigning
food inspectors to establishments accesses the
predictions through a Shiny Application (Shiny is a
package in R that allows for easy development of web
pages and user interfaces). Food inspectors are then assigned to the establishments flagged by the model. The CDoIT GitHub page for the predictive food analytics model provides Figure 2 to better demonstrate how the model works. In essence, food establishments with the highest risk of critical violations are inspected first.
Figure 2: Optimized food inspection process [12]
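As a small illustration of the prioritization logic described above, the following sketch (invented data, not the CDoIT's Shiny application) shows establishments being reordered so that the highest predicted risk is inspected first.

```python
# Hypothetical risk scores produced by a classifier like the one sketched earlier.
establishments = [
    {"name": "Establishment A", "risk": 0.31},
    {"name": "Establishment B", "risk": 0.82},
    {"name": "Establishment C", "risk": 0.57},
]

# Highest predicted risk goes to the top of the inspection schedule.
schedule = sorted(establishments, key=lambda e: e["risk"], reverse=True)
for rank, e in enumerate(schedule, start=1):
    print(rank, e["name"], round(e["risk"], 2))
```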
3.4. Impact and Future
In 2014, the model was trained and evaluated over a two-month period. The results from the model were compared to the results of the actual food inspections occurring at the same time. This comparison revealed the clear advantage the data-driven model would have had over the traditional approach: it would have allowed critical food violations to be found, on average, 7.5 days earlier [12]. This, in turn, would allow potential foodborne illness outbreaks to be prevented, or their severity limited, as the responsible violations would be caught and addressed earlier. However, the improvement of the process did not stop there. As the second attempt was open source, citizens and outside stakeholders have also been able to get involved. The best example of this is as follows: one individual made a pull request on Feb 3, 2017 demonstrating that an XGBoost model found critical violations, on average, 7.79 days earlier, representing an improvement over the model currently in use. Four days later, the Chief Data Officer of Chicago commented on the pull request, initiated a code review, and stated, "If the results hold, we will incorporate your contributions to the model that drives food inspections in the city. Thank you and we will be in touch soon" [24]. This provides a clear
demonstration of how outside stakeholders are able to
play a role in the co-creation of OGD-driven public
services. The model is still in use by the CDoPH today
and it is still actively maintained by the CDoIT, and
stakeholders are still able to suggest improvements to
the model through GitHub.
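The metric behind both comparisons above, the average number of days earlier that a critical violation is discovered, can be sketched as follows. The dates below are invented for illustration and are not the study's data.

```python
from datetime import date

# For each critical violation: the date it was actually found under the
# traditional schedule, and the (earlier) date the model's ordering would
# have surfaced it. Invented example values.
violations = [
    {"found_traditional": date(2014, 10, 20), "found_model": date(2014, 10, 12)},
    {"found_traditional": date(2014, 10, 25), "found_model": date(2014, 10, 18)},
    {"found_traditional": date(2014, 11, 2),  "found_model": date(2014, 10, 26)},
]

days_earlier = [
    (v["found_traditional"] - v["found_model"]).days for v in violations
]
print(sum(days_earlier) / len(days_earlier))  # average days gained, ~7.3 here
```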
One result of the code being open is that it has been possible for other stakeholders to take, adapt, and change the code for their own uses. The best-known example of this is that of Montgomery County, Maryland. Montgomery County hired Open Data Nation, a private sector data analytics company, to take Chicago's code and adapt it for Montgomery County's needs. However, this trial has been stopped for political reasons. This is an interesting fact, and the reasons why the model is able to work in Chicago, but not in other areas, will be discussed in the next section of this paper.
This case is truly interesting as it is one of the only examples the authors found that seems to represent a co-created OGD-driven public service. It is also a service that is able to continue to provide value moving into the future. As more data is generated, the model is likely to become more accurate in predicting critical violations. It will be interesting to follow up on this case in the future to see how the co-creation of the service progresses as well as how the accuracy improves over time.
4. Discussion
This section discusses what seems to be relevant, and what does not, with regard to the current understanding of co-created OGD-driven public services. The discussion aims to reflect back on the conceptual model by comparing it to what emerged in the case. Furthermore, propositions for co-created OGD-driven public services are put forth.
In the interviews conducted for this case, stakeholders highlighted a multitude of factors that allowed the co-created OGD-driven public service to be implemented; many interviewees also stated that the process seemed to be a "perfect storm". This "perfect storm" consisted of external funding, motivated stakeholders, innovative leaders, proper communication channels, an existing OGD portal, and an agile development approach that accepted that mistakes would be made throughout development.
External Funding
The City of Chicago had received a grant from Bloomberg Philanthropies to develop a "SmartData" platform. It was confirmed by the Chief Data Officer, Tom Schenk, that external funding had allowed the CDoIT to actively pursue more projects, such as the project that this case focuses on. Though external funding does appear to be an active driver for the co-creation of OGD-driven public services, it should be explored further to see what effect it has when a government agency is not the main driving force behind the service.
Motivated Stakeholders
The model was co-created by numerous different
stakeholders representing different sectors; Table 1
presents all stakeholders and their role in the project.
Stakeholder | Role in Project
CDoIT | Co-creator of model; maintains open government data portal
CDoPH | Service user
Allstate Insurance Data Science Group | Co-creator of model
CCA | Initial project manager; organized Allstate-CDoIT communication
Citizens | Model improvements and pull requests on GitHub
Table 1: Chicago predictive food analytics model co-creators and their roles. Source: Authors.
The model was developed in close cooperation
between the CDoIT, Allstate, and CDoPH; this
interaction was brokered by the CCA. Interestingly,
one interviewee stated that the role of a mediating
stakeholder, the CCA, seemed to be quite important.
In the interviewee’s experience, public and private
sector organizations sometimes clash due to
organizational differences, but the CCA was able to
work as a mediator, helping to build a bridge and develop relations between the private and public sectors. The final group of stakeholders is the citizens. As
there has been citizen input that improved the
efficiency of the current model, it does appear that
there is interest and motivation to play a role in the co-
creation of OGD-driven public services.
Innovative Leaders
While conducting interviews, two names were consistently mentioned as playing a critical role in the success of the project: Tom Schenk (Chief Data Officer of Chicago) and Gerrin Butler (Director of Food Protection for the City of Chicago). Tom was said to be the main driving force behind the model and intended for the code to be open source from the moment the idea was conceived. Gerrin was the actor who agreed to go ahead with Tom's plan for data-driven food analytics. Gerrin did not initially understand what a data analytics model was or how it would work and improve current operations, but was willing to try and played an active role throughout. Without the work and willingness of leaders to push for and try new things, this case would not have been possible.
Proper Communication Channels
There were two iterations of development for the predictive model. The first one failed due to a miscommunication between the CDoPH and Allstate about how the process of food inspections worked. This was noted, and in the second iteration there was a strong emphasis on appropriate communication between parties so that everything could be understood. One interesting part of this communication was how technical and non-technical requirements and terminologies were understood and translated by the different involved parties. On the CDoPH side, a list or annex of technical terms was developed so that technical conversations could be followed. On the development side, the requirements were asked for multiple times, and a member of the CDoPH team who had experience in data analytics was able to effectively translate the current process into one more understandable for the data analysts working on the project.
Existing OGD Portal
Chicago has had an OGD portal since 2010, but it
was improved greatly and made a legal requirement in
2012. It was stated by multiple interviewees that the
OGD portal allowed them to come up with the idea for
the new co-created OGD-driven public service, and
that there were no noticeable issues with data quality.
It is also important to note that a majority of the open
data sets that were used in the development of the
model were freely accessible to all on the OGD portal.
Thus, the OGD portal allowed a new service to be conceived, and the service could then be created through the exploitation of the high-quality and easily exploitable OGD sources available on the portal.
Agile Development
Though the service did not follow traditional agile
development methodology, some aspects of agile
development were present. The service was developed
and tested constantly, improvements were made and
tested throughout development, and if mistakes were
made, they were learned from and used to improve the
service quickly.
The Conceptual Model
Based on the aforementioned factors, there does appear to be room for improvement in the model proposed in section 2.1. One of the first things to address is that the process of co-creating OGD-driven public services appears to vary depending on the sector of the stakeholder(s) initiating the service. In the case at hand, external funding was one of the major drivers, whereas this may not necessarily be true if a citizen or a company takes the lead in developing the service; this should be explored in further research. The role and importance
of communication and networks is not currently
highlighted in the framework, but from this case, it
does appear that communication and understanding
between different stakeholders has a large effect on
how well the co-creation of an OGD-driven public
service goes. However, the case also seems to validate
some aspects of the model. When looking at the case
it does appear to follow the co-initiation, co-design,
co-implementation, and co-evaluation cycle. The
model also proposed that OGD might act as a catalyst
to drive co-creation of OGD-driven public services;
this also seems to be supported by this case. The
Allstate team specifically chose the subject for this
case as there was OGD available, and this data was
easily accessible, exploitable, and of high quality.
Propositions
1. In an environment where open government
data and tools for data analytics, exploitation
and co-creation are made widely available, any
actor can take the lead and initiate or co-create
data-driven services that create public value.
2. When OGD is released and maintained, it
allows the Government to act as a platform.
This platform allows OGD sets to be exploited
and leads to increased levels and occurrences
of co-created OGD-driven public services.
3. A “perfect storm” consisting of sufficient
resources, innovative leaders, motivated
stakeholders, and access to OGD allows for
effective execution of co-created OGD-driven
public services.
4. Co-created OGD-driven public services appear to have the potential to drive increased levels of efficiency in traditionally slow or outdated processes.
5. Government as a Platform appears to be a
bridge that allows for the concepts of co-
creation and OGD to be merged together. If the
government makes data available, and this data
is used to create a new public service, then at a
minimal level there will always be co-creation
between the government and the actor exploiting the data for the OGD-driven public service.
From the case, six different factors were highlighted that seem to play an important role in the co-creation of OGD-driven public services. After presenting these factors, they were used to reflect back on our current understanding of co-created OGD-driven public services and the model provided in section 2.1. After this reflection, five propositions were put forth that deal with how the co-creation of OGD-driven public services occurs, how the idea of GaaP leads to the co-creation of OGD-driven public services, and what benefits a co-created OGD-driven public service may have.
5. Conclusion
The aim of this paper was to present the case of Chicago's predictive food analytics model so that new insights into the concept of co-created OGD-driven public services could potentially emerge. In section 3.1, the context surrounding the case was presented, specifically the roles played by external funding, a functioning OGD portal, and previous experience with OGD-based predictive analytics. These three factors seem to have played an instrumental role in laying the foundation for the co-creation of OGD-driven public services in the city of Chicago. Chicago appears to have a 'platform' that is based on its OGD portal; this government platform thus allows for the exploitation of OGD and the co-creation of new OGD-driven public services.
On the private sector side, an interesting policy
was discovered. Allstate Insurance's "bluelight" policy allowed its employees to participate in pro bono data science work, thus providing the opportunity for its staff to engage in co-creation with the City of Chicago. The role of Allstate also
seems to demonstrate that there is interest from those
with experience in data science to participate in pro
bono work and in the co-creation of new OGD-driven
public services.
Section 4 provided a discussion on the findings
from the case. Firstly, six factors were outlined as
playing a key role in allowing the co-creation of an
OGD-driven public service to take place: external
funding, motivated stakeholders, innovative leaders,
proper communication channels, an existing OGD
portal, and agile development practices. These factors
were then used to reflect back on one proposed conceptual model for understanding the process of co-creating OGD-driven public services. These reflections allow for potential improvements to the conceptual model to be made, but they also allow some preliminary validation of the model to take place. It does appear that the idea of co-created OGD-driven public services has merit and does exist in the real world. The way in which the service was developed in Chicago also seems to match the four stages proposed in the conceptual model. The final part of the discussion was the presentation of some initial propositions on co-created OGD-driven public services. These propositions may be briefly summarized as follows: the availability of OGD and tools for data analytics has the potential to enable the co-creation of OGD-driven public services; governments releasing OGD act as a platform from which the co-creation of new and innovative OGD-driven public services may take place; and the idea of GaaP does appear to allow the topics of co-creation and OGD to be merged together.
Though the case presented in this paper represents an empirical example of a co-created OGD-driven public service, it only represents one possible combination of stakeholder roles, as a governmental agency still played a major role. Since the notion of a co-created OGD-driven public service implies that the government need not play an active role in development, any examples of co-created OGD-driven public services where a non-traditional stakeholder plays a leading role could provide valuable insight into the formulation of the understanding of co-created OGD-driven public services. Additionally, this case study only looks at one type of co-created OGD-driven public service (a data analytics model); other types of services may exist (such as web or mobile applications built on OGD), and further research should be conducted on the different types of co-created OGD-driven public services.
The exploratory case study that was conducted for
this paper provides an initial empirical case on a co-
created OGD-driven public service and aims to
advance and encourage research into the topic of co-
creation of OGD-driven public services. The case
demonstrates that there is a link between co-creation
and OGD, and that this link may enable or drive a
change in the current understanding of public services.
Furthermore, the case also demonstrates that there is a
relationship between GaaP and OGD and that this
relationship is likely to encourage or enable co-
creation. This paper provides an initial stepping-stone
on the topic of co-created OGD-driven public services
and, as such, proposes that future research into the
topic is needed. Potential avenues of future research include solidifying the definition of a co-created OGD-driven public service, empirical work focusing on different types of co-created OGD-driven public services, studies that aim to understand the effect that different stakeholders, when acting as the leading service developer, have on the co-creation process of OGD-driven public services, and examination of how the idea of GaaP influences our understanding of co-created OGD-driven public services and the bridge between OGD and co-creation.
Acknowledgements. This work was supported by the
European Commission (OpenGovIntelligence H2020 grant
693849) and Estonian Research Council (PUT773,
PUT1361).
6. References
[1] G. Galasso, G. Garbasso, G. Farina, T. Kalvet, F.
Mureddu, D. Osimo, and P. Waller, Analysis of the
value of new generation of eGovernment services
(SMART 2014/066). 2016.
[2] M. Janssen, Y. Charalabidis, and A. Zuiderwijk,
“Benefits, Adoption Barriers and Myths of Open
Data and Open Government,” Inf. Syst. Manag.,
vol. 29, no. 4, pp. 258–268, 2012.
[3] A. Zuiderwijk, M. Janssen, S. Choenni, R. Meijer,
and R. S. Alibaks, "Socio-technical Impediments
of Open Data,” Electron. J. Electron. Gov., vol.
10, no. 2, pp. 156–172, 2012.
[4] M. Toots, K. McBride, T. Kalvet, and R.
Krimmer, “Open Data as Enabler of Public
Service Co-creation : Exploring the Drivers and
Barriers,” in Proceedings of the 2017
International Conference for E-Democracy and
Open Government (CeDEM 2017). IEEE
Computer Society, pp. 102-112, 2017.
[5] E. Ostrom, “Metropolitan reform: Propositions
derived from two traditions,” Soc. Sci. Q., pp.
474–493, 1972.
[6] M. Toots, K. McBride, T. Kalvet, R. Krimmer, E.
Tambouris, E. Panopoulou, E. Kalampokis, and K.
Tarabanis, “A Framework for Data-Driven Public
Service Co-Production,” in Electronic
Government: Proceedings of the 16th IFIP WG 8.5
International Conference, EGOV 2017, M. Janssen
et al., Eds. Springer, pp. 264-275, 2017.
[7] European Commission, “A vision for public
services,” p. 16, 2013.
[8] K. McBride, “Government as a Platform:
Exploiting Open Government Data to Drive Public
Service Co-Creation,” Tallinn University of
Technology, 2017.
[9] D. Linders, “From e-government to we-
government: Defining a typology for citizen
coproduction in the age of social media,” Gov. Inf.
Q., vol. 29, no. 4, pp. 446–454, 2012.
[10] V. Lember, “The role of new technologies in co-
production,” in Co-production and co-creation:
engaging citizens in public service delivery., T.
Brandsen, T. Steen, and B. Verschuere, Eds.
Routledge, forthcoming in 2018.
[11] M. Kassen, “A promising phenomenon of open
data: A case study of the Chicago open data
project,” Gov. Inf. Q., vol. 30, no. 4, pp. 508–513,
2013.
[12] T. Schenk, “Food Inspection Forecasting - City of
Chicago.” [Online]. Available:
https://chicago.github.io/food-inspections-
evaluation/. [Accessed: 21-Apr-2017].
[13] R. K. Yin, Case study research: Design and
methods, Rev. Newbury Park, Calif.: Sage
publications, 2013.
[14] Open Data Executive Order (No. 2012-2), City of
Chicago, 2012.
[15] “City of Chicago Developers,” City of Chicago
Open Data Developer Portal. [Online]. Available:
http://dev.cityofchicago.org/. [Accessed: 13-Jun-
2017].
[16] B. Goldstein, “Open Data in Chicago: Game On,”
Beyond Transparency, 2013. [Online]. Available:
http://beyondtransparency.org/chapters/part-
1/open-data-in-chicago-game-on/. [Accessed: 13-
Jun-2017].
[17] “Bloomberg Philanthropies Announces Mayors
Challenge Winners Providence, Chicago, Houston,
Philadelphia, and Santa Monica,” Bloomberg
Philanthropies, 2013. [Online]. Available:
https://www.bloomberg.org/press/releases/bloomb
erg-philanthropies-announces-mayors-challenge-
winners-providence-chicago-houston-
philadelphia-and-santa-monica/. [Accessed: 13-
Jun-2017].
[18] Ash Center Mayors Challenge Research Team,
“Chicago’s SmartData Platform - Pioneering Open
Source Municipal Analytics,” Data-Smart City
Solutions, 2014. [Online]. Available:
http://datasmart.ash.harvard.edu/news/article/chica
go-mayors-challenge-367. [Accessed: 13-Jun-
2017].
[19] S. Thornton, “Using Predictive Analytics to
Combat Rodents in Chicago | Data-Smart City
Solutions,” Data-Smart City Solutions, 2013.
[Online]. Available:
http://datasmart.ash.harvard.edu/news/article/using
-predictive-analytics-to-combat-rodents-in-
chicago-271. [Accessed: 13-Jun-2017].
[20] “Restaurant Inspection,” City of Chicago, 2017.
[Online]. Available:
https://www.cityofchicago.org/city/en/depts/cdph/
provdrs/inspections_and_permitting/svcs/food_pro
tection_program.html. [Accessed: 13-Jun-2017].
[21] “Civic Consulting Alliance,” CCA, 2017. [Online].
Available: http://www.ccachicago.org/. [Accessed:
13-Jun-2017].
[22] S. Thornton, “Delivering Faster Results with Food
Inspection Forecasting - Chicago’s Analytics-
Driven Plan to Prevent Foodborne Illness,” Data-
Smart City Solutions, 2015. [Online]. Available:
http://datasmart.ash.harvard.edu/news/article/deliv
ering-faster-results-with-food-inspection-
forecasting-631. [Accessed: 13-Jun-2017].