International Journal of Digital Earth, Vol. 3, No. 3, September 2010, 231–241
ISSN: 1753-8947 (Print), 1753-8955 (Online)
DOI: 10.1080/17538941003759255
Published online: 15 April 2010.
Crowdsourcing geographic information for disaster response: a research frontier
Michael F. Goodchild* and J. Alan Glennon
Center for Spatial Studies, University of California, Santa Barbara, CA 93106-4060, USA
(Received 18 January 2010; final version received 7 March 2010)
Geographic data and tools are essential in all aspects of emergency management:
preparedness, response, recovery, and mitigation. Geographic information created
by amateur citizens, often known as volunteered geographic information, has
recently provided an interesting alternative to traditional authoritative information
from mapping agencies and corporations, and several recent papers have provided
the beginnings of a literature on the more fundamental issues raised by this new
source. Data quality is a major concern, since volunteered information is asserted
and carries none of the assurances that lead to trust in officially created data.
During emergencies time is of the essence, and the risks associated with volunteered
information are often outweighed by the benefits of its use. An example is discussed
using the four wildfires that impacted the Santa Barbara area in 2007–2009,
and lessons are drawn.
Keywords: emergency management; volunteered geographic information;
crowdsourcing; Web 2.0; neogeography; wildfire; Santa Barbara
1. Introduction
Recent disasters have drawn attention to the vulnerability of human populations and
infrastructure, and the extremely high cost of recovering from the damage they have
caused. Examples include the Wenchuan earthquake of May 2008, Hurricane Katrina
in August 2005, and the Indian Ocean tsunami of December 2004. In all of
these cases impacts were severe, in damage, injury, and loss of life, and were spread over
large areas. In all of these cases modern technology has brought reports and images to
the almost immediate attention of much of the world’s population, and in the Katrina
case it was possible for millions around the world to watch the events as they unfolded
in near-real time. Images captured from satellites have been used to create damage
assessments, and digital maps have been used to direct supplies and to guide the
recovery effort, in an increasingly important application of Digital Earth.
Nevertheless it has been clear in all of these cases that the potential of such data,
and of geospatial data and tools more generally, has not been realized, that the benefits
of such technology have fallen far short of expectation, and that research is needed on
several key issues if the situation is to improve. In many cases people living far from the
impacted area have been better informed through the media than those managing and
carrying out the relief effort. The impacted zone often loses power, Internet
*Corresponding author.
© 2010 Taylor & Francis
connections, and computing capabilities, creating a donut pattern of access to relevant
information. A recent report of the US National Research Council (NRC 2007) has
documented these problems in detail, based on extensive discussions with responders
and emergency managers, and has made a series of recommendations for improving
the situation and for needed research.
The report's central conclusion is that geospatial data and tools should be an essential
part of all aspects of emergency management from planning for future events, through
response and recovery, to the mitigation of future events. Yet they are rarely recognized
as such, because society consistently fails to invest sufficiently in preparing for future
events, however inevitable they may be. Moreover, the overwhelming concern in the
immediate aftermath of an event is for food, shelter, and the saving of lives. It is widely
acknowledged that maps (and all forms of geospatial data) are essential in the earliest
stages of search and rescue, that evacuation planning is important, and that overhead
images provide the best early source of information on damage; yet the necessary
investments in resources, training, and coordination are rarely given sufficient priority
either by the general public or by society's leaders. (NRC 2007, p. 2)
This paper focuses on a specific and rapidly evolving area of geospatial data and tools,
a subset of social networking and user-generated web content that has been termed
volunteered geographic information (VGI; Goodchild 2007) and that is the focus of an
emerging body of research. The experience of a series of recent wildfire events in Santa
Barbara is used to examine the key issues associated with VGI and its potential role in
disaster management. The first major section of the paper provides a brief review of the
field of VGI, a survey of its evolving literature, and its relationship to more widely
recognized topics. The next section examines the specific issues of data quality in this
context, drawing on research on data quality in VGI and examining its relevance
to disaster management. This is then followed by a detailed discussion of the wildfire
disasters, and the lessons that can be learned from them. The paper closes with
a discussion of a vision for the future of VGI, and community activity more broadly, in
disaster management.
2. Volunteered geographic information (VGI)
Until recently virtually all geographic information was produced in the form of maps
and atlases, by mapping agencies and corporations, and dispersed as paper copies to
users (researchers, consultants, and members of the general public) through a
system of retail distribution. The geospatial technologies that began to appear in the
1960s did little to change this set of arrangements, since their major impacts were
on the acquisition of raw data through new and more efficient instruments, its
semi-automated compilation, and its use in such systems as GIS. The transition
from paper-based to digital dissemination, from paper map sheets to tapes and
eventually internet distribution, left most of the arrangements intact.
In the early 1990s, however, new technologies were emerging that would
fundamentally change these arrangements, creating what has been termed a post-
modern era (Goodchild et al. 2007) of geographic information production. First, it
became possible for the average citizen to determine position accurately, without
the professional expertise that had previously been limited to trained surveyors. This
could be done using a simple GPS, or by finding locations using one of a number of
services that became available on the Internet: conversion of a street address using
a geocoding service, reading a cursor position from an accurately registered map
or image provided by a service such as Google Maps, or converting a place name
to coordinates using a gazetteer service.
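The gazetteer route can be sketched minimally. The Python fragment below is illustrative only: the lookup table is a hypothetical stand-in for a real gazetteer service, not data drawn from any actual product.

```python
# Minimal gazetteer lookup: place name -> (latitude, longitude).
# The table below is a hypothetical stand-in for a real gazetteer service.
GAZETTEER = {
    "santa barbara, ca": (34.4208, -119.6982),
    "los padres national forest": (34.7500, -119.3000),
}

def lookup(place_name):
    """Return (lat, lon) for a place name, or None if unknown."""
    return GAZETTEER.get(place_name.strip().lower())

coords = lookup("Santa Barbara, CA")
```

A real service would add fuzzy matching and disambiguation among identically named places, but the essential operation is the same: a name in, coordinates out.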
Second, it became possible for anyone to gain the ability to make maps from
acquired data, and to employ the kinds of cartographic design skills previously
possessed only by trained cartographers. Google's MyMaps service, for example,
allows anyone to produce a decent-looking map from custom data, and OpenStreetMap
will render raw data provided by the user into a cartographically acceptable
street map.
Central production of geographic information had been sustained over the
decades by two factors: the need for expertise in map-making and the high capital
cost of mapping equipment. By the turn of the new century both of these arguments
had essentially disappeared: the cost of entry into map-making had fallen to no
more than the cost of a simple PC, and the role of the expert had been replaced
by GPS, mapping software, and other technologies (Goodchild 2009).
At the same time individuals with in some cases no expertise in the mapping
sciences were suddenly able to perform many of the functions that had previously been
the preserve of experts. The term neogeography was coined (Turner 2006) to describe
this phenomenon, which can be defined as the breaking down of the traditional
distinctions between expert and non-expert, in the specific context of the creation of
geographic information, since all of the traditional forms of expertise can now be
acquired through the use of technology. The default settings of mapping software, for
example, now embody the recommendations of professionals, so that software rather
than education or instructional manuals becomes the means by which those
recommendations are disseminated and employed.
Many web sites have emerged in the past few years to encourage and facilitate
the actions of neogeographers. In essence these sites make it possible for the user-
generated content that increasingly dominates the Web to include digital material
that satisfies the requirements of geographic information; in other words, it is
formed of facts about specified locations on or near the Earth's surface. Nevertheless
the content is asserted, in contrast to the authoritative output of the
traditional mapping agencies and corporations. It is likely not subject to any form
of quality control, and the element of trust that accompanies the products of a
mapping agency is typically missing. Popular sites include Flickr and its
georeferenced photographs, the OpenStreetMap project described earlier, Wikimapia
and its large collection of user-described features, and numerous sites that collect
georeferenced observations of plant, animal, and bird sightings. Moreover, it is
increasingly common for the content of Twitter, Facebook, and many other social
networking sites to be georeferenced.
VGI is closely related to the concept of crowdsourcing (Howe 2008), which has
acquired two somewhat distinct meanings. On the one hand it can refer to the
proposition that a group can solve a problem more effectively than an expert, despite
the group's lack of relevant expertise; advocates of crowdsourcing cite many examples
where this proposition appears to be true. On the other hand, and more relevant to
VGI, is the notion that information obtained from a crowd of many observers is likely
to be closer to the truth than information obtained from one observer. Wikipedia
illustrates this meaning, since it relies on the principle that allowing people to edit an
entry can produce an accurate encyclopedia, and empirical evidence appears to
support this (Giles 2005). One implication of the crowdsourcing principle is that
information in which many people have an interest will be more accurate than
information that is of interest only to a few, which in the case of georeferenced
Wikipedia entries (a form of VGI) suggests that information about minor features in
remote areas will be less accurate than information about major features in heavily
populated areas. This pattern of data quality is sharply different from that of
traditional mapping agencies, which use quality control procedures to guarantee
uniform quality of a given product.
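The crowd-size effect can be illustrated with a toy simulation, under the strong assumption that contributors' errors are independent and zero-mean (real contributors may share biases, which this sketch ignores): averaging more independent reports of a coordinate brings the estimate closer to the truth.

```python
import random

def crowd_estimate(true_value, n_observers, error_sd, rng):
    """Average n noisy, independent observations of one true coordinate."""
    reports = [true_value + rng.gauss(0, error_sd) for _ in range(n_observers)]
    return sum(reports) / len(reports)

rng = random.Random(42)
true_x = 100.0
# Error of a small crowd versus a large one, same per-observer noise.
small_crowd_error = abs(crowd_estimate(true_x, 3, 10.0, rng) - true_x)
large_crowd_error = abs(crowd_estimate(true_x, 300, 10.0, rng) - true_x)
# With many observers the averaged error is typically much smaller,
# shrinking in proportion to 1/sqrt(n).
```

This is exactly why, in the argument above, minor features in remote areas, with small interested crowds, should be expected to carry larger errors than major features in populated areas.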
Academic interest in VGI dates from 2007, when a Specialist Meeting in Santa
Barbara brought together an international group of experts to discuss the state of
knowledge and develop a research agenda; the topic was also covered
in a book by Scharl and Tochtermann (2007). A special issue of
GeoJournal appeared in 2008 with a collection of research papers, and other
publications and research grants have followed. Several key issues form the skeleton
of a research agenda:
- What types of geographic information are most suited to acquisition through the efforts of volunteers, and how is this related to the issue of subject matter?
- What factors determine the quality of VGI, how can quality be measured, and what steps can be taken to improve it?
- What techniques can be developed for synthesizing VGI and conflating it with other data, including authoritative data, and what methods are appropriate for its analysis?
- Who creates VGI, and what are its impacts on society?
All of these questions have a broader context. For example, the question of who creates
VGI is related to the broader question of volunteerism in society, and why certain
people are willing to devote time and effort to tasks for which they receive no monetary
reward. Quality questions are related to broader questions of crowdsourcing and
collective intelligence, but also must be studied in the context of the special nature of
geographic information and what is already known about measuring and modeling its
quality (Guptill and Morrison 1995). Societal impacts should be addressed within the
broader context of participatory GIS and the role of information in empowering
individuals and communities. Budhathoki (2009, personal communication) has
developed a conceptual framework for VGI research that places it within many of
these broader contexts, and points to key references.
3. The quality of volunteered geographic information (VGI)
As noted above, geographic information can be defined as information linking a
property to a location on or near the Earth's surface, and perhaps a time. Because many
of these components must be measured, and because the potential amount of such
information is infinite, it is inevitable that all geographic information be subject to
uncertainty. While early literature on the topic (Goodchild and Gopal 1989)
emphasized accuracy, implying the existence of a truth to which a given item of
information could be compared, more recently the emphasis has been on uncertainty,
reflecting the impossibility of knowing the truth about many aspects of the geographic
world. Research over the past two decades has focused on the sources, measurement,
and modeling of uncertainty; on its propagation into the products of analysis and
modeling; and on the relative merits of the theoretical frameworks of probability and
fuzzy sets (Zhang and Goodchild 2002). Standards of data quality exist for many of the
products of the mapping agencies and corporations (Guptill and Morrison 1995), and
data quality is an important component of metadata.
Quality is perhaps the first topic that suggests itself to anyone encountering VGI
for the first time. If the providers of VGI are not experts, and if they operate under no
institutional or legal frameworks, then how can one expect the results of VGI creation
and publication to be accurate? Similar concerns are often expressed regarding many
other types of information provided by amateurs, reflecting the deep association
society makes between qualifications, institutions, and trust.
Nevertheless there are several grounds for believing that the quality of VGI can
approach and even exceed that of authoritative sources. First, there is evidence that the
crowdsourcing mechanism works, at least in some circumstances. In the case of
Wikipedia, for example, research has shown that accuracy is as high as that of more
traditional encyclopedias, according to some metrics (Giles 2005). Mention has
already been made of geographic effects on the crowdsourcing mechanism, an
argument that leads one to suppose that less important features, and features in little-
known areas of the planet, would be less accurately described, and preliminary results
from research on Wikimapia appear to bear this out. This topic is revisited below.
Second, geographic information is remarkably rich in context. Information about a
location x can always be compared to other information that is likely to be available
about the same location from authoritative sources, and to information about
the surroundings of x. Tobler's First Law (Sui 2004) tells us that any location is likely
to be similar to its surroundings, so information that appears to be inconsistent with
the known properties of the location itself or of its surroundings can be subject to
increased scrutiny. Wikipedia, for example, uses elaborate mechanisms for flagging
and checking contributions that appear dubious, and these mechanisms are likely to be
more effective for geographic information than for many other kinds. Companies that
create and market street centerline databases for vehicle navigation, and that rely
increasingly on volunteered corrections and updates, have developed elaborate, fully
automated mechanisms for detecting doubtful contributions. Formalization of such
methods would be a suitable topic for future VGI research, since it could lead to the
ready availability of error-checking tools.
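A minimal sketch of such an automated check, assuming only that each contribution carries a location and a numeric attribute. The function, its parameters, and the tolerance value are hypothetical illustrations of the Tobler-style consistency idea, not any company's actual mechanism:

```python
import math

def flag_inconsistent(new_point, accepted_points, k=3, tolerance=50.0):
    """Flag a contribution whose attribute value departs sharply from
    its k nearest accepted neighbors (a Tobler-style consistency check).

    Points are (x, y, value) tuples; returns True when the new point
    should be held for manual scrutiny.
    """
    if not accepted_points:
        return False  # nothing to compare against
    nearest = sorted(
        accepted_points,
        key=lambda p: math.hypot(p[0] - new_point[0], p[1] - new_point[1]),
    )[:k]
    neighborhood_mean = sum(p[2] for p in nearest) / len(nearest)
    return abs(new_point[2] - neighborhood_mean) > tolerance

# A reported elevation of 900 m amid accepted values near sea level is flagged.
accepted = [(0, 0, 12.0), (1, 0, 15.0), (0, 1, 9.0), (50, 50, 14.0)]
suspect = flag_inconsistent((0.5, 0.5, 900.0), accepted)   # True
plausible = flag_inconsistent((0.5, 0.5, 13.0), accepted)  # False
```

A production system would weight neighbors by distance, condition on feature type, and fold in contributor history, but the core move, comparing an assertion against its geographic context, is the same.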
Third, discussions of geographic information quality (Guptill and Morrison 1995)
emphasize the importance of completeness or currency as a dimension of quality: the
degree to which the data are true and report all existing features at the time of use.
Unfortunately traditional methods of map-making by government agencies, which
required expert teams to travel to every part of the area and were constantly subject to
funding constraints, led to lengthy delays in the updating of maps, and as a result the
average map may have been years or even decades out of date by the time it was used.
By contrast VGI may be produced much more quickly, and may capture changes in the
landscape almost as fast as they occur. In comparing VGI with authoritative sources,
therefore, one is often comparing current data with much older data. Moreover the
technologies of measurement, especially measurement of position, have improved
greatly over the past decade, and the expectations of users have risen accordingly. Thus
a map made in 1980 at a scale of 1:24,000 may have a published positional accuracy of
12 m, but may pale in comparison with VGI acquired in 2009 using a differential GPS
with a positional accuracy of 1 m.
Studies of Wikimapia conducted by my group are detecting what may be the first
case of a VGI project life-cycle. Wikimapia's mantra is 'Let's describe the whole world',
which it does by enabling volunteers to identify significant features of interest on the
Earth's surface, and to provide descriptions and links to other information. In effect,
Wikimapia is a crowd-sourced gazetteer, the traditional authoritative form of place-
name index (Goodchild and Hill 2008). But in contrast to gazetteers, Wikimapia
has no limits to the richness of description that can be associated with a feature,
allows representation of a feature's full extent instead of a single point, and
accommodates both officially recognized (gazetted) features and unofficial ones. At
the time of writing the number of entries in Wikimapia exceeded 11 million, much
larger than authoritative gazetteers (see, for example, the products of the US Board on
Geographic Names).
Despite these benefits, once Wikimapia had reached a sufficient size and visibility,
it began to attract erroneous and sometimes malicious content. The complexity of the
project and the obscurity of many features made it difficult for crowdsourcing
mechanisms to work to correct errors. For example, it is tempting to look for an
unnamed feature in a remote part of the world and to give it a name, perhaps naming
it after oneself. In the Santa Barbara area, an entry was made in mid 2009 outlining
the nine-hole Ocean Meadows Golf Course, incorrectly identifying it as the 18-hole
Glen Annie Golf Course, and giving the feature a detailed description that matches
the latter and not the former. The distance between the two features is approximately
2 km. Although the entry was edited once in late 2009, the position had not been
corrected at the time of writing.
As the volume of errors increases and crowdsourcing mechanisms fail to assure
quality, the reputation of the site begins to deteriorate. Eventually the motivation to
maintain the site is impacted and the site fails. It seems that Wikimapia is now entering
this phase of decline and it will be interesting to see how it fares in the next few years.
Wikipedia, on the other hand, appears to have sufficient mechanisms in place to
avoid this problem. Wikipedia entries are reviewed by a hierarchy of volunteers who
employ well-defined criteria that are appropriate to crowdsourcing. Each entry is
assessed in relation to the size of the interested crowd, as represented by the number of
contributors and editors, and if that number is too small and interest fails to
materialize the entry is deleted as unimportant. Moreover each entry is assessed in
terms of general, permanent interest; entries describing newly coined terms
(neologisms), for example, are actively discouraged. By contrast the size of the crowd
interested in a Wikimapia entry describing a small feature on the Earth's surface will
often be very small and Wikimapia makes no effort to use geographic context to assess
the validity of entries. Thus, there may be no one sufficiently interested in the Glen
Annie Golf Course, or similar examples worldwide, to correct the kind of error
identified earlier, and no basis for doubting that such a golf course could exist at the
identified location. The difficulties experienced by Wikimapia seem to be due at least
in part to its focus on geographic features.
It is important to recognize that the quality of geographic information may have
different effects, depending on the use to which the information is to be put. For any
given level of uncertainty, there will be some applications for which uncertainty is not
an issue and some for which it is. A 15-m error in the location of an electrical
transformer, for example, will have little effect on the operations of the utility that owns
it, except when the error results in it being assigned to the wrong property and thus to
land owned by someone else. Similarly, a 15-m error in the position of a street may have
little effect on an in-vehicle navigation system, but will be glaring if the street is
superimposed on a correctly registered satellite image.
Suppose the existence of an event at a location, in other words a time-dependent
item of geographic information, is critical in determining a response. For example, the
event might be a chemical spill that requires evacuation of the surrounding
neighborhood. Two types of errors may exist in this situation: a false positive, in
other words a false rumor of a spill, or a false negative, in other words an absence of
information about the existence of the spill. The information is also time-critical, and a
delay in its availability amounts in effect to a false negative. To reduce the chance
of errors the information can be checked, by requiring independent verification or by
waiting for more accurate and more authoritative information to become available.
But this takes time.
In such situations decision-makers, including those residents who may need to
evacuate, are faced with a choice between acting on less reliable information and
waiting for more reliable information. Each has its costs, but in general acting
unnecessarily, in response to a false positive, is likely to have smaller costs than not
acting if the emergency turns out to be real; in other words, false negatives are likely
more costly and less acceptable than false positives. The next section explores these
arguments in the context of a series of wildfire emergencies that impacted the Santa
Barbara area in 2007–2009.
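The asymmetry between the two error types can be expressed as a simple expected-cost comparison. The sketch below uses purely illustrative numbers, and real decisions involve far richer considerations, but it captures why acting on an unverified report is often rational:

```python
def should_act_now(p_event, cost_unnecessary_action, cost_missed_event):
    """Decide whether to act on unverified information.

    p_event: estimated probability the report is true (not a false positive).
    cost_unnecessary_action: cost of acting when the report was false.
    cost_missed_event: cost of not acting when the event was real.
    """
    expected_cost_acting = (1 - p_event) * cost_unnecessary_action
    expected_cost_waiting = p_event * cost_missed_event
    return expected_cost_acting < expected_cost_waiting

# Illustrative numbers: even a 10% chance that a reported fire is real
# justifies evacuating when the missed-event cost dwarfs the cost of an
# unnecessary evacuation.
act = should_act_now(p_event=0.10,
                     cost_unnecessary_action=1_000,
                     cost_missed_event=1_000_000)
```

Because the cost of a missed event (loss of life) can exceed the cost of an unnecessary action (an inconvenient evacuation) by orders of magnitude, even low-probability volunteered reports can tip the balance toward acting.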
4. Volunteered geographic information (VGI) and the Santa Barbara wildfires of 2007–2009
Wildfire has always been a part of life in Southern California, but from July 2007 to
May 2009 a series of four large and damaging fires occurred in rapid succession. The
Zaca Fire was ignited in July 2007 and burned for 2 months, consuming 120,000
hectares largely in the Los Padres National Forest, before finally being brought under
control. Although the fire threatened the city of Santa Barbara, especially should a
strong wind have developed from the north, in the end no inhabited structures were
destroyed and the fire never reached the city, though the costs of fighting the fire ran
into the tens of millions. The long duration of the fire created ample opportunity for
the setting up of information kiosks, news releases, and other ways by which the
agencies dealing with the fire communicated with the general public.
Santa Barbara lies to the south of a large wilderness area, and many homes are
built in close proximity to areas covered by the local scrub, known as chaparral. The
lack of wildfires in the immediate vicinity in the past 40 years had allowed large
amounts of combustible fuel to accumulate. Moreover, many chaparral species contain
oils that make them highly flammable, especially after a series of comparatively dry
winters. In July 2008 the Gap Fire ignited in the hills immediately north of the city,
threatening many homes at the western end of the urbanized area. This fire was
brought under control in 7 days, but led to evacuation orders for a number of
homes. The short duration of the fire and the severity of the threat meant that
approaches used to inform the public during the Zaca Fire were no longer adequate
and instead numerous postings of VGI, using services such as Flickr, provided an
alternative to official sources.
Traditionally the community at large has remained comparatively mute in such
emergencies. Citizens rely on official agencies to manage the response, to issue and
enforce evacuation orders, and to issue authoritative information. But the ease with
which volunteers can create and publish geographic information, coupled with the
need for rapid dissemination, has created a very different context. Agencies that are
responsible for managing information are often under-funded and under-resourced,
and compelled to wait while information can be verified, whereas volunteers are today
equipped with digital cameras, GPS, digital maps, and numerous other resources.
Multiply the resources of the average empowered citizen by the population of the city
and the result is an astounding ability to create and share information.
In November 2008 the Tea Fire ignited in the hills behind Santa Barbara, and
spread extremely rapidly, driven by a strong, hot Santa Ana wind from the northeast.
VGI immediately began appearing on the web in the form of text reports, photographs,
and video. Although search services such as Google take several days to find and
catalog information from all but the most popular sites, the community had by this
point learned that certain local sites and repositories were effective ways of
disseminating information. These included sites run by the local newspapers and by
community groups, as well as services that immediately update their own catalogs,
making it possible for users to find new content quickly rather than wait for Google's
robots to find and catalog it. Moreover, citizens were able to access and interpret
streams of data from satellites with comparatively fine temporal and spatial resolution,
such as Moderate Resolution Imaging Spectroradiometer (MODIS). Several volun-
teers realized that by searching and compiling this flow of information and
synthesizing it in map form, using services such as Google Maps, they could provide
easily accessed and readily understood situation reports that were in many cases
more current than maps from official sources. The Tea Fire burned for 2 days and
destroyed 230 houses.
In May 2009 the Jesusita Fire ignited, again in the chaparral immediately adjacent
to the city, burning for 2 days and consuming 75 houses. Several individuals and
groups immediately established volunteer map sites, synthesizing the VGI and official
information that was appearing constantly. For example, the officially reported
perimeter of the fire was constantly updated based on reports by citizens. By the end of
the emergency there were 27 of these volunteer maps online, the most popular of which
had accumulated over 600,000 hits and had provided essential information about the
location of the fire, evacuation orders, the locations of emergency shelters, and much
other useful information (Figure 1).
In all of these activities it is clear that users were conscious of the need to balance
the rapid availability of VGI with the unverified nature of much of its content. A
homeowner who evacuated based on information from an online map created by a
volunteer might be responding to a false positive, and by waiting for official, verified
information might have avoided the need to evacuate altogether. But on the other
hand the delay in acquiring information from official sources might have made the
difference literally between life and death.
Several lessons can be learned from the experience of these four fires. First, due to
lack of resources, the need to verify, and imperfect communication, authoritative
information is much slower to appear than VGI. Citizens constitute a dense network of
observers that is increasingly enabled with the devices and software needed to acquire,
synthesize, and publish information. Second, asserted information is more prone to
error, and there were many instances during the fires of false rumors being spread
through Web sites. These probably led to errors on synthesized maps and to many
unnecessary evacuations. Crowdsourcing mechanisms may have led to correction in
some of these cases, but not all.
Third, it is clear from these experiences that the costs of acting in response to false
positives are generally less than the costs of not acting in response to false negatives.
Lack of information is often the origin of false negatives, as when a resident fails to
receive an evacuation order, or an agency fails to verify and report a new area of fire. In
essence, emergencies such as this create a time-critical need for geographic information
that is quite unlike the normal, sedate pace with which geographic information was
traditionally acquired, compiled, and disseminated. VGI, with its dense network of
observers, is ideally suited to fill the need for near-real time information.
Fourth, in practice it is difficult during emergencies for citizens to distinguish
between authoritative and asserted information. During the fire, VGI data collection
and dissemination had no consistent or expected form. For example, local authorities,
news media outlets, and community members all used Google MyMaps and Twitter to
offer fire information, whatever its source. Each map or Twitter post was bounded by
the user-interface choices of the original Google and Twitter software designers, and
thus possessed very similar visual characteristics. An elaborate system of feature-level
metadata would be needed in order to distinguish information based on its source.

Figure 1. Screen shot of one of the Web map sites created by amateurs during the Jesusita
Fire of May 2009, showing information synthesized from a wide range of online sources,
including Tweets, MODIS imagery, and news reports.
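One possible shape for feature-level metadata of the kind called for above is sketched below. The record structure, field names, and source categories are hypothetical illustrations, not an existing standard.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum

class SourceType(Enum):
    AUTHORITATIVE = "authoritative"   # e.g. a county fire agency
    ASSERTED = "asserted"             # e.g. a Tweet or a MyMaps marker

@dataclass
class FeatureMetadata:
    """Hypothetical per-feature metadata distinguishing asserted from
    authoritative information; all field names are illustrative only."""
    source_type: SourceType
    source_name: str                  # who contributed the feature
    observed_at: datetime             # when the observation was made
    verified: bool = False            # has an agency confirmed it?

report = FeatureMetadata(
    source_type=SourceType.ASSERTED,
    source_name="@local_resident",
    observed_at=datetime(2009, 5, 6, 18, 30, tzinfo=timezone.utc),
)
print(report.source_type.value, report.verified)  # prints: asserted False
```

A map interface could then render asserted and authoritative features differently, letting users weigh each feature by its provenance rather than by visual style alone.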
Finally, because of the maps' similar appearances, map popularity appears to have
been bolstered by perceptions about the size and energy of a map's underlying
community. During the Jesusita Fire, the most popular online VGI channels were
those that provided both information and a parallel interactive discussion forum. The
characteristic chaotic nature of the crowdsourced maps and their associated discussion
boards may have yielded the appearance of a more active, uninterrupted channel, thus
bolstering their popularity.
5. Conclusion
Agencies are inevitably stretched thin during an emergency, especially one that
threatens a large community with loss of life and property. Agencies have limited staff,
and limited ability to acquire and synthesize the geographic information that is vital to
effective response. On the other hand, the average citizen is equipped with powers of
observation, and is now empowered with the ability to georegister those observations,
to transmit them through the Internet, and to synthesize them into readily understood
maps and status reports. Thus the fundamental question raised by this paper is: How
can society employ the eyes and ears of the general public, their eagerness to help, and
their recent digital empowerment, to provide effective assistance to responders and
emergency managers?
Many aspects of the data quality problem need further research. As noted earlier,
an important item for further research is the formalization of rules that permit
contributed geographic information to be assessed against its geographic context, and
the prototyping of software tools that would implement these rules. Research is also
needed to interpret what is known about trust and volunteerism in the specific context
of crowdsourced geographic information, to devise appropriate mechanisms and
institutions for building trust in volunteer sources.
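As one illustration of how such contextual rules might be formalized (the thresholds, coordinates, and helper logic below are entirely hypothetical, not taken from the paper), a contributed fire report could be checked for plausibility against the last known fire perimeter:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def plausible_fire_report(report_lat, report_lon, perimeter_points,
                          max_spread_km=5.0):
    """Flag a contributed report as plausible if it falls within a
    hypothetical maximum spread distance of any known perimeter point."""
    return any(
        haversine_km(report_lat, report_lon, lat, lon) <= max_spread_km
        for lat, lon in perimeter_points
    )

# Perimeter points near Santa Barbara (coordinates are illustrative).
perimeter = [(34.47, -119.70), (34.48, -119.68)]
print(plausible_fire_report(34.46, -119.71, perimeter))  # near the perimeter
print(plausible_fire_report(35.50, -118.00, perimeter))  # far away
```

A real system would combine several such rules, for instance also checking land cover, wind direction, and consistency with other recent reports, and would score rather than simply accept or reject each contribution.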
The recent experience of the Santa Barbara fires suggests that a community can
indeed contribute effectively. There are risks, of course, and more research is urgently
needed to understand and minimize them. Not discussed in this paper, but also of
critical importance, is the role of the citizen in those parts of the world that lie beyond
the 'digital divide', where the Internet and its services are largely unavailable. It is clear,
however, that society has now entered a new era where geographic information will not
only be used by all, but created by all, or at least by a dense and distributed network of
observers. This leads to an entirely new vision for Digital Earth, one of a dense
network of distributed, intelligent observers who are empowered to create geographic
information, and particularly the types of geographic information that remote sensing
and other large-scale acquisition systems are unable to produce. Protocols and
institutions will be needed to ensure that the result is as reliable and useful as possible.
Acknowledgements
This work is supported by grants from the US National Science Foundation and the US Army
Research Office.
Notes on contributors
Michael F. Goodchild is Professor of Geography, University of California, Santa Barbara,
CA, USA and Director of Center for Spatial Studies. He received his BA degree from
Cambridge University in Physics in 1965 and his PhD in Geography from McMaster
University in 1969. He was elected as a member of the National Academy of Sciences and
Foreign Fellow of the Royal Society of Canada in 2002, and member of the American
Academy of Arts and Sciences in 2006, and in 2007 he received the Prix Vautrin Lud. His
current research interests include geographic information science, spatial analysis, and
uncertainty in geographic data.
J. Alan Glennon is a doctoral candidate in the Department of Geography at the University of
California, Santa Barbara, CA, USA and a Research Associate in its Center for Spatial
Studies. He received a master's degree from Western Kentucky University. His research
concerns the representation and analysis of networks in GIS, and in addition he is a leading
authority on geysers.
References
Giles, J., 2005. Special report: internet encyclopaedias go head to head. Nature, 438, 900–901.
Goodchild, M.F., 2007. Citizens as sensors: the world of volunteered geography. GeoJournal,
69 (4), 211–221.
Goodchild, M.F., 2009. Neogeography and the nature of geographic expertise. Journal of
Location Based Services, 3 (2), 82–96.
Goodchild, M.F., Fu, P., and Rich, P., 2007. Sharing geographic information: an assessment of
the geospatial one-stop. Annals of the Association of American Geographers, 97 (2), 249–265.
Goodchild, M.F. and Gopal, S., 1989. Accuracy of spatial databases. London: Taylor and
Francis.
Goodchild, M.F. and Hill, L.L., 2008. Introduction to digital gazetteer research. International
Journal of Geographical Information Science, 22 (10), 1039–1044.
Guptill, S.C. and Morrison, J.L., 1995. Elements of spatial data quality. New York: Elsevier.
Howe, J., 2008. Crowdsourcing: why the power of the crowd is driving the future of business.
New York: McGraw-Hill.
National Research Council, 2007. Successful response starts with a map: improving geospatial
support for disaster management. Washington, DC: National Academies Press.
Scharl, A. and Tochtermann, K., 2007. The geospatial web: how geobrowsers, social software,
and the web 2.0 are shaping the network society. London: Springer.
Sui, D.Z., 2004. Tobler's first law of geography: a big idea for a small world? Annals of the
Association of American Geographers, 94 (2), 269–277.
Turner, A., 2006. Introduction to neogeography. Sebastopol, CA: O'Reilly.
Zhang, J-X. and Goodchild, M.F., 2002. Uncertainty in geographical information. New York:
Taylor and Francis.