Chapter 3
Crowdsourcing Fundamentals: Definition and Typology
Enrique Estellés-Arolas1, Raúl Navarro-Giner and Fernando González-Ladrón-de-Guevara2
Abstract Crowdsourcing is a problem-solving and task realization model that is being increasingly used. Thanks to the possibility of harnessing collective intelligence through the Internet, crowdsourcing initiatives allow people, for example, to find a solution to a complex chemical problem, get images tagged, or get a logo designed. Due to its success and usefulness, more and more researchers have focused their interest on this concept. This attention has shown that the concept of crowdsourcing has no clear boundaries: although over time the concept has been better explained, some authors describe it differently, propose different types of crowdsourcing initiatives, or even use contradictory crowdsourcing examples. In this chapter, an integrated definition and typology, developed in 2012, are analyzed to check whether they are still valid today or whether they need a reformulation.
Keywords: crowdsourcing, typology, definition, crowd, task, web, collective intelligence
3.1. Introduction
The development of Web 2.0 has led to the emergence of new models for business, communication, personal relationships, learning, etc. One of these models, related to business and innovation, is known as crowdsourcing.
The term "crowdsourcing" was coined in 2006 by the American journalist Jeff Howe. In a first attempt to conceptualize the term, Howe (2006) defined it as "the act of a company or institution taking a function once performed by employees and outsourcing it to an undefined (and generally large) network of people in the form of an open call. This can take the form of peer-production (when the job is performed collaboratively), but is also often undertaken by sole individuals".
1 Enrique Estellés-Arolas (✉) & Raúl Navarro-Giner - Catholic University of Valencia - Valencia, Spain
2 Fernando González-Ladrón-de-Guevara - Technical University of Valencia - Valencia, Spain
Thanks to the collaborative nature of Web 2.0, crowdsourcing allows a person, institution or company to benefit from the work, ideas or wisdom of the Internet crowd. This crowd, usually heterogeneous, can be formed by amateurs, volunteers, experts, companies, etc. (Howe, 2008), who may or may not belong to a specific user community (Brabham, 2012). The work of this crowd is rewarded in some way, either tangible (money, prizes, etc.) or intangible (recognition, entertainment, prestige, etc.).
This model, which was born in the business environment, has evolved and
spread. Currently, crowdsourcing is being used for different purposes in fields as
diverse as medicine (King et al., 2013) or geography (See et al., 2014).
The problem is that the wide use of crowdsourcing has led many people to use the term to refer to any initiative in which a large number of people are recruited through an open call, usually distributed through the Internet (Howe, 2008; Brabham, 2008; Estellés-Arolas & González-Ladrón-de-Guevara, 2012a; Littman & Suomela, 2014).
For this reason, the boundaries of what is or is not crowdsourcing are sometimes not completely clear. An example is the case of Wikipedia: considering it a crowdsourcing platform has both defenders (Bazilian et al., 2012; Ghani & Zakaria, 2013) and detractors (Brabham, 2013; Estellés-Arolas & González-Ladrón-de-Guevara, 2012a). The proliferation of different crowdsourcing definitions and typologies does not help either.
To alleviate this situation, in 2012 Estellés-Arolas and González-Ladrón-de-Guevara carried out a literature review with the objective of stating an integrative crowdsourcing definition (2012a) and an integrative crowdsourcing typology (2012b).
Though both the typology and the definition proposed by Estellés-Arolas and González-Ladrón-de-Guevara are correct and useful, the concept of crowdsourcing continues to evolve and to be applied in new areas. This situation makes it necessary to review both integrative proposals to test their validity.
For that purpose, this chapter contains the results obtained by repeating the literature review carried out in Estellés-Arolas and González-Ladrón-de-Guevara (2012a) in order to find new definitions and new typologies published since 2012. The aim is to check whether the definition and typology proposed remain valid or need to be reformulated.
It is true that the crowdsourcing definition and typology mentioned above are not the most used within the literature; the most widely used are Howe's (2008) and Brabham's (2008). However, it counts in their favour that both the definition and the typology seek consensus by integrating different proposals.
3.2. Theoretical Background
3.2.1. Towards an integrated definition
In 2012, Estellés-Arolas and González-Ladrón-de-Guevara (2012a) gathered different crowdsourcing definitions through a literature review. The purpose of their research was to extract all the elements that would allow distinguishing between crowdsourcing and any other Internet initiative.
After analyzing more than 200 documents, they found more than 40 different
definitions. The authors identified within these definitions eight fundamental ele-
ments that any crowdsourcing initiative must contain. These elements are:
1. There is a clearly defined crowd (E1).
2. There exists a task with a clear goal (E2).
3. The recompense received by the crowd is clear (E3).
4. The crowdsourcer is clearly identified (E4).
5. The compensation to be received by the crowdsourcer is clearly defined (E5).
6. It is an online assigned process of participative type (E6).
7. It uses an open call of variable extent (E7).
8. It uses the Internet (E8).
As a result of this research, its authors developed a definition of crowdsourcing which, although wordy, defines the concept in detail. The definition is as
follows: "Crowdsourcing is a type of participative online activity in which an in-
dividual, an institution, a non-profit organization, or company proposes to a group
of individuals of varying knowledge, heterogeneity, and number, via a flexible
open call, the voluntary undertaking of a task. The undertaking of the task, of var-
iable complexity and modularity, and in which the crowd should participate bring-
ing their work, money, knowledge and/or experience, always entails mutual bene-
fit. The user will receive the satisfaction of a given type of need, be it economic,
social recognition, self-esteem, or the development of individual skills, while the
crowdsourcer will obtain and utilize to their advantage that what the user has
brought to the venture, whose form will depend on the type of activity undertak-
en." (Estellés-Arolas & González-Ladrón-de-Guevara, 2012a)
3.2.2. Towards an integrated typology
Later, Estellés-Arolas & González-Ladrón-de-Guevara (2012b) conducted another literature review, this time searching for different crowdsourcing typologies. They obtained six documents that reported a task-based typology (Reichwald & Piller, 2006; Howe, 2008; Brabham, 2008; Kleeman et al., 2008; Geerts, 2008; Burger-Helmchen & Penin, 2010). After comparing the different typologies (Codina, 1997; Pinto-Molina et al., 2007), an integrated typology was formulated. It comprises five main types:
1. Crowdcasting. Contest-like crowdsourcing initiatives, where a problem or a task is proposed to the crowd, and whoever solves it first or does it best is rewarded (e.g.: Innocentive).
2. Crowdcollaboration. Crowdsourcing initiatives in which communication between individuals of the crowd occurs, while the initiator of the initiative stays on the sidelines. Two subtypes can be found, which differ in their ultimate goal:
   - Crowdstorming. Massive online brainstorming sessions, in which different ideas are raised and the crowd can support those ideas with their comments and votes (e.g.: IdeaJam).
   - Crowdsupport. In this case, the customers themselves solve the doubts and problems of other customers, so there is no need to contact the official customer support (e.g.: Get Satisfaction).
3. Crowdcontent. In these crowdsourcing tasks, the crowd uses its labor and knowledge to create or find content of various types, but not in a competitive way. Three subtypes can be found:
   - Crowdproduction. Initiatives where the crowd must create content, as is done individually when translating short pieces of text or tagging images (e.g.: Amazon Mechanical Turk).
   - Crowdsearching. Crowdsourcing initiatives where the crowd searches for content on the Internet for any purpose (e.g.: Peer to Patent Review).
   - Crowdanalyzing. Initiatives where the crowd searches not on the Internet but inside multimedia documents such as videos or images (e.g.: Stardust@home).
4. Crowdfunding. In crowdfunding initiatives, an individual, organization or company seeks funding from the crowd in exchange for a reward (e.g.: Kickstarter).
5. Crowdopinion. In this case, the objective is to learn the users' opinions about a particular issue or product through votes, comments, tags or even the sale of shares (e.g.: ModCloth, Intrade).
3.3. Methodology
3.3.1 Regarding the definition
The methodology used to verify whether the integrated crowdsourcing definition is still valid consists of three steps: a systematic review of the literature to find documents that include crowdsourcing definitions (as in Estellés-Arolas & González-Ladrón-de-Guevara (2012a)), the identification of the definition elements following Tatarkiewicz's approach (1980), and the comparison with the current integrated definition.
First, the systematic review of the literature is carried out again following the approach of Delgado (2010), based on Petitti (2000) and Egger et al. (2008). Five databases have been selected (SAGE, IEEE, ScienceDirect, Emerald and ACM), and documents with the word 'crowdsourcing' in the title, abstract or keywords have been consulted. Only those documents with an original definition of crowdsourcing, published from 2012 on, are selected.
Finally, the definitions found are analyzed. Using Tatarkiewicz's (1980) approach, it is checked whether any of the 8 parameters published in the original article appears or whether, instead, any new characteristic should be taken into consideration.
3.3.2 Regarding the typology
Regarding the typology, a similar literature review has been performed. The same databases have been consulted, but the search criteria have been modified. In this case, any document containing the term "crowdsourcing" in its title, abstract or keywords has been selected, provided that the terms "typology" or "taxonomy" (either or both) also appear in the same fields.
Once the documents were obtained, those containing a general typology different from the ones previously found were selected. These typologies are compared to the typology proposal of Estellés-Arolas & González-Ladrón-de-Guevara (2012b), updating it if necessary.
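The modified search criteria can be sketched as a boolean filter over document metadata. This is only an illustration of the selection rule stated above; the field names are hypothetical, and the actual searches were run inside each database's own interface.

```python
def matches_typology_search(doc):
    """True if 'crowdsourcing' AND at least one of 'typology' / 'taxonomy'
    appear in the title, abstract, or keywords (hypothetical metadata fields)."""
    fields = " ".join(
        [doc["title"], doc["abstract"], " ".join(doc["keywords"])]
    ).lower()
    return "crowdsourcing" in fields and ("typology" in fields or "taxonomy" in fields)

# A document matching both criteria is selected for further analysis.
doc = {"title": "A task-based typology of crowdsourcing initiatives",
       "abstract": "...", "keywords": []}
matches_typology_search(doc)  # True
```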
3.4. Results
This section presents the results obtained from both literature reviews: the search for new definitions and the search for new typologies.
3.4.1 Results on the definition
After searching the five databases previously cited, a total of 777 documents were retrieved, including journal papers, book reviews, books and proceedings (Table 3.1). It should be noted that the most numerous documents found are proceedings, a fact that coincides with the results of the original literature review performed by Estellés-Arolas & González-Ladrón-de-Guevara (2012a).
Table 3.1. Summary table of the literature review
[Table content not recoverable from the source: per-database document counts for SAGE, IEEE, ScienceDirect, Emerald and ACM.]
Among all the documents found, only 28 (3.86%) contain a definition that does not explicitly cite others such as Howe's (2008) or Brabham's (2008). These definitions can be seen in Table 3.2.
Table 3.2. New crowdsourcing definitions found through the literature review. Each entry gives the source, followed by its definition ("Crowdsourcing is..."). Publication years missing from the source are omitted; truncated quotes are marked with "…".

(Folorunso & Mustapha, 2014): "referred to as human computation, a methodology that lets humans process tasks which are difficult to implement in software"

(Lee, Park, & Park): "the practice of obtaining needed services, ideas, or content by soliciting contributions from a large group of people, particularly from an online community, rather than from traditional employees or suppliers."

(Satzger, Psaier, Schall, & Dustdar, 2013): "a new paradigm for performing computations in Web-based environments by utilizing the capabilities of human workers. The idea of crowdsourcing is sometimes referred to as human computation, a methodology that lets humans process tasks which are difficult to implement in software. Such tasks include transcription of documents, reviewing of articles or evaluating the quality of ranking algorithms."

(Sutherlin, 2013): "the technological union of humans and …"

(Ambati, Vogel, & Carbonell, 2012): "the process of farming out tasks to a large user population on the Internet. These tasks broadly belong to the language or vision community, where for a number of tasks it is either impossible or challenging and time-consuming for computers to complete them, whereas only requires a few seconds for a human to complete"

(Sprugnoli, 2013): "the process of segmenting a complex task into smaller work units and distributing these among a large pool of non-expert workers, usually via the web"

(Pedersen et al.): "a collaboration model enabled by people-centric web technologies to solve individual, organizational, and societal problems using a dynamically formed crowd of people who respond to an open call for participation"

(Roopa, Iyer, & Rangaswamy, 2013): "a technique wherein a task is outsourced to a distributed group of people (crowd). Thus crowdsourcing is a collaborative or distributed problem solving model. Problems are broadcast to an unknown group of people asking for solutions. Users (crowd) submit the solutions. The solutions are consolidated by the 'crowdsourcer'. The crowd may be rewarded monetarily, with prizes, with extra talk time or some other form of recognition. In some cases, the reward could be just intellectual satisfaction"

(Wu, Zhong, Tan, Horner, & Yang): "a process that involves outsourcing tasks to a distributed group of people, which is normally much cheaper than hiring experts"

(Zeinalipour-Yazti et al., 2013): "a distributed problem solving model where a population of undefined size engages in the solution of a complex problem for monetary or ethical (i.e., intellectual satisfaction) benefit through an open call"

(Parvanta, Roth, & Keller, 2013): "a problem-solving approach that taps the knowledge, energy and creativity of a global, online community"

(Brabham, Ribisl, Kirchner, & Bernhardt, 2014): "an online, distributed, problem-solving, and production model that uses the collective intelligence of networked communities for specific purposes"

(Marjanovic, Fry, & Chataway, 2012): "under-researched type of open innovation that is often enabled by the web"

(Lampe, Zube, Lee, Park, & Johnston, 2014): "online communities that could help with issues of managing information and users, including the ability to solicit small contributions from a large number of users to help provide important meta-data about people or information"

(Britton, Level, & Gardner, 2013): "distributed problem-solving technique leveraging the efforts of a group, known as 'the crowd.' A project is defined and volunteers are invited to contribute to its accomplishment. The volunteers are dispersed and may not even be members of the organization"

(Soleymani & Larson, 2013): "human computation techniques that exploit human intelligence and also take advantage of a large population of contributors. Crowdsourcing is frequently facilitated by crowdsourcing platforms where crowd-members can find and carry out microtasks in exchange for a small …"

(Perera & A. Perera): "a process of outsourcing tasks of an organization to general public, where the term 'crowd' equals to 'general public'"

(Gupta & Sharma): "the act of outsourcing tasks, traditionally performed by staff or a contractor, to an undefined large group of people or crowd"

(Azzam & Jacobson, 2013): "Paid recruitment of an independent global workforce for the objective of working on a specifically defined task or set of tasks"

(Demartini, Difallah, & Cudré-Mauroux, 2013): "term used to define those methods to generate or process data asking to a large group of people to complete small tasks. It is possible to categorize different crowdsourcing strategies based on the different types of incentives used to motivate the crowd to perform such tasks"

(Schumaker, 2013): "another form of market efficiency where groups of individuals perform forecasts on provided information and results are averaged for use as a predictive tool"

(Raford, 2014): "Large-scale collective intelligence systems"

(Geiger & Schader, 2014): "an umbrella term for approaches that harness the diverse potential of large groups of people via an open call for contribution over the Web. Using information technology as a facilitator, crowdsourcing organizations implement socio-technical systems to channel the contribution of human workforce, knowledge, skills, or perspectives into the generation of digital information products and services. Such crowdsourcing information systems have recently gained in popularity for a variety of organizational functions such as problem solving, knowledge aggregation, content generation, and large-scale data processing"

(Stanley, Winschiers-Theophilus, Onwordi, & Kapuire, 2013): "rooted in the process of asking others to help you with a problem that you cannot resolve on your own. This may be due to limited resources, skills, or time constraints"

(King, Gehl, Grossman, & Jensen, 2013): "Collective effort"

(Tong, Cao, & Chen, 2014): "a service has a common framework: each employer (a.k.a. the task publisher) poses a task, and then this task is responded or finished by many different and unknown crowd employees. Thus, the 'task-response pairs' is the unique structure of crowdsourcing data"

(Stol & Fitzgerald): "an emerging and promising approach which involves delegating a variety of tasks to an unknown workforce: the crowd"

(Chiu, Liang, & Turban, 2014): "can be viewed as a method of distributing work to a large number of workers (the crowd) both inside and outside of an organization, for the purpose of improving decision making, completing cumbersome tasks, or co-creation of designs and other …"
3.4.2 Results on the typology
The typology literature review results have been much less numerous. In fact, after consulting the same databases, barely 40 documents were retrieved, of which only one provides a new general typology.
Typologies focused on specific areas were found: Linders (2012) states a typology of the crowdsourcing initiatives that can be carried out in e-government, and Gomes et al. (2012) state a crowdsourcing typology focused on musical scenarios.
The only document that provides a general typology is the one proposed by Geiger et al. (2012). These authors describe an information system-based typology that could support crowdsourcing initiatives. It consists of 4 types:
1. Crowd processing, where the crowd produces a large amount of homogeneous contributions with equal value. Some examples are reCAPTCHA, other micro-tasking tasks like the ones that can be found in AMT, or the tasks done in citizen science projects (e.g.: Galaxy Zoo).
2. Crowd rating, where the crowd also produces a large amount of contributions with equal value. In this case, the value that emerges from the total of contributions is sought. This is the case of votes, reviews and opinions (e.g.: eBay's reputation system). It would also include prediction markets (e.g.: Hollywood Stock Exchange).
3. Crowd solving initiatives seek value from heterogeneous contributions, where each contribution has its own qualitative properties. These crowd solving initiatives look for alternative or complementary solutions to a given task or problem (e.g.: Goldcorp Challenge, Netflix Prize or Innocentive).
4. Crowd creation initiatives, finally, seek the collective value arising from the accumulation and relation of contributions. In this case, each contribution is also important towards the creation of a collective result (e.g.: …).
3.5. Discussion
3.5.1 Regarding the definition
Comparing the definitions found with the integrated definition proposed by Estellés-Arolas and González-Ladrón-de-Guevara (2012a), no relevant difference can be found. All of them meet some of the 8 proposed elements, and those aspects that do not accord with the elements refer to specific applications or particular visions of crowdsourcing.
Definitions built around other concepts or models, such as open innovation (Marjanovic, Fry, & Chataway, 2012), human computation (Demartini, Difallah, & Cudré-Mauroux, 2013; Satzger, Psaier, Schall, & Dustdar, 2013) or collective intelligence (Raford, 2014), can be found. Other definitions focus on specific crowdsourcing types, such as crowdproduction using microtasking (Sprugnoli, 2013; Demartini, Difallah, & Cudré-Mauroux, 2013; Lampe, Zube, Lee, Park, & Johnston, 2014) or the use of crowd contests for complex problem solving (Zeinalipour-Yazti et al., 2013; Stanley, Winschiers-Theophilus, Onwordi, & Kapuire, 2013).
It is important to highlight that two definitions, those of Roopa, Iyer & Rangaswamy (2013) and Geiger & Schader (2014), are highly general and in fact meet almost all elements of the integrated definition.
Regarding the elements, it is important to notice that almost all of the definitions refer to a crowd (E1) that undertakes a task (E2). Other elements have been taken into account much less: 9 definitions refer to the use of the Internet to carry these initiatives out (E8) and 8 of them refer to a process that involves individual online participation (E6).
The remaining elements are reflected even less in the definitions found. This indicates that, although those elements allow crowdsourcing identification, they are not considered fundamental by the authors.
3.5.2 Regarding the typology
In this case, the 4 general types Geiger et al. (2012) propose can be integrated into the types contemplated by Estellés-Arolas and González-Ladrón-de-Guevara (2012b). In fact, there is a direct correlation between both typologies. Geiger's Crowd rating, Crowd creation and Crowd solving correspond with Estellés-Arolas' Crowdopinion, Crowdproduction and Crowdcasting. In the case of Geiger's Crowd processing, this type corresponds with Estellés-Arolas' Crowdsearching and Crowdanalyzing.
3.5.3 Regarding the literature review results
Comparing this literature review with the one performed by Estellés-Arolas & González-Ladrón-de-Guevara (2012a) has allowed a limited study of the evolution of crowdsourcing as a research topic.
First of all, the difference between the number of publications found in 2012 (209) and the number found in 2014 (777) should be highlighted: applying the same criteria and consulting the same databases, 3.7 times as many documents were found. The increase is also significant for conference papers (127 in 2012 and 587 in 2014, a 4.6-fold increase) and for journal papers (68 in 2012 and 173 in 2014, a 2.5-fold increase).
These data show that crowdsourcing has gone from being an emerging issue, which in 2012 still did not receive much attention, to an established one. This also indicates a consolidation of scientific research on the subject.
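As a quick sanity check, the growth factors above can be recomputed from the document counts reported in the text; the script itself is only an illustration, not part of the original study.

```python
# Document counts reported in the two literature reviews: (2012 review, present review).
counts = {
    "total":       (209, 777),
    "proceedings": (127, 587),
    "journal":     (68, 173),
}

# Growth factor = new count / old count; e.g. 777 / 209 is roughly 3.7.
growth = {kind: new / old for kind, (old, new) in counts.items()}

for kind, factor in growth.items():
    print(f"{kind}: x{factor:.1f}")
```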
Another fact that supports this statement is that, in the first literature review, 19.13% (40 of 209) of the documents found used original definitions. In the present review, this percentage has fallen to 3.86% (28 of 777). This shows that authors are less interested in defining and conceptualizing crowdsourcing and more interested in researching its concrete applications.
Regarding the definitions found, it is important to highlight some aspects.
1. Firstly, the vast majority of authors use already existing definitions, mainly Howe's (2006) and Brabham's (2008). The Estellés-Arolas & González-Ladrón-de-Guevara (2012a) definition, because of its integrative nature, is also used, although to a lesser extent.
2. Secondly, some documents do not include any definition of the term. Some, like Monahan & Mokos (2013) or Su et al. (2013), omit the crowdsourcing definition when mentioning it. It is assumed that the topic already has its own identity or has become popular enough.
3. Occasionally, the verb "crowdsource" is used naturally (Garrido & Faria, 2012; Rana et al., 2014; Kalantari et al., 2014). Although the verb does not exist as such in the dictionary, it is frequently used to denote the action of using crowdsourcing. This shows that the use of crowdsourcing is increasingly widespread.
Concerning definitions which are not based on any other, these typically arise from the need to define crowdsourcing from the point of view of a specific task.
Some authors define crowdsourcing by relating it to other concepts or models. King et al. (2013) define it as a collective effort, referring to tasks where everyone's contribution is necessary. Others focus on the use of crowdsourcing for co-creation activities (Gatautis & Vitkauskaite, 2014), open innovation (Feller et al., 2012; Ren et al., 2014), collective intelligence (Garrido & Faria, 2012; Filippi et al., 2013; Raford, 2014) or human computation (Satzger, 2013).
Crowdsourcing is also understood, for example, as a tool for customers' participation in product development (Djelassi & Decoopman, 2013), public participation (Hildebrand et al., 2013), e-government (Linders, 2012), citizen science (Harvey et al., 2014), data collection (Armstrong et al., 2012), search (Ren et al., 2014) or microtasking (Chen et al., 2014).
It is also important to note that the literature review carried out by Estellés-Arolas and González-Ladrón-de-Guevara (2012a) found several papers addressing the theoretical basis of crowdsourcing. Brabham (2008) analyzes and studies the motivations that move the crowd to participate; Geiger, Seedorf & Schader (2011) propose a taxonomy of crowdsourcing activities; Schenk & Guittard (2009) study what kind of tasks can be performed using crowdsourcing; and so on.
In the literature review carried out in this work, there are practically no such documents. Most of the documents found study the application of crowdsourcing in some specific activity or area. Schriner & Oerther (2014) study it as a way to fight poverty. Brabham et al. (2014) analyze it in the area of public health and the medical domain. Related to this field, King et al. (2013) study the use of crowdsourcing in skin self-examination for detecting cancer.
Other applications are establishing the fingerprint of past sea level changes (Rovere et al., 2012), validating data to generate global land-cover maps (See et al., 2014), urban surveillance (Monahan & Mokos, 2013) and animal identification for ecological monitoring and conservation (Duyck et al., 2014).
3.6. Conclusions
Crowdsourcing refers to a problem-solving and task-completion model which involves the participation of the Internet crowd. It represents just one of the many ways to harness collective intelligence. Its use has spread increasingly and it is now employed in many areas (medicine, biology, astronomy, etc.), with business, the area in which it was born, remaining the one in which it has been used the most.
The popularity of crowdsourcing has led different authors to define and conceptualize it in different ways, even proposing different typologies and definitions. In 2012, Estellés-Arolas and González-Ladrón-de-Guevara suggested, through a literature review, an integrated definition of crowdsourcing based on 8 elements (2012a) and also an integrated crowdsourcing typology (2012b). It is a wordy definition, but it defines the concept in depth. The same applies to the typology.
In the present work the same literature review has been carried out. The objective is to see whether the definition and the typology proposed remain valid. Both have been specifically chosen because they share the intention to seek consensus on what crowdsourcing is.
The results of the literature review indicate that both the definition and the typology remain useful and relevant. Firstly, none of the 28 new definitions found identifies a new differentiating element. Regarding the typology, only one new general typology has been found, and it integrates seamlessly into the 2012b typology.
It is true that there is a limitation resulting from the limited number of databases consulted and from the use of restrictive search criteria. Despite this, the literature review, compared to the one conducted by Estellés-Arolas and González-Ladrón-de-Guevara (2012a and 2012b), has partially revealed the development of crowdsourcing in the scientific field.
Crowdsourcing is clearly a burgeoning research field that now receives increased attention and that has passed from theoretical approaches to the systematic study of its concrete applications in a wide number of fields.
References
Ambati, V., Vogel, S., & Carbonell, J. (2012). Collaborative workflow for crowdsourcing translation. In Proceedings of the ACM 2012 conference on Computer Supported Cooperative Work (pp. 1191–1194). ACM. doi:10.1145/2145204.2145382
Armstrong, A. W., Harskamp, C. T., Cheeney, S., Wu, J., & Schupp, C. W. (2012). Power of
crowdsourcing: Novel methods of data collection in psoriasis and psoriatic arthritis. Journal of
the American Academy of Dermatology, 67(6), 1273-1281.e9. doi:10.1016/j.jaad.2012.05.013
Azzam, T., & Jacobson, M. R. (2013). Finding a Comparison Group Is Online Crowdsourcing a
Viable Option? American Journal of Evaluation, 34(3), 372–384.
Bazilian, M., Rice, A., Rotich, J., Howells, M., DeCarolis, J., Macmillan, S., … Liebreich, M.
(2012). Open source software and crowdsourcing for energy analysis. Energy Policy, 49, 149-
153. doi:10.1016/j.enpol.2012.06.032
Brabham, D. C. (2008). Crowdsourcing as a model for problem solving an introduction and cases.
Convergence: the international journal of research into new media technologies, 14(1), 75-90.
Brabham, D. C. (2012). A Model for Leveraging Online Communities. The participatory cultures
handbook, 120.
Brabham, D. C. (2013). Crowdsourcing. Mit Press.
Brabham, D. C., Ribisl, K. M., Kirchner, T. R., & Bernhardt, J. M. (2014). Crowdsourcing Applications for Public Health. American Journal of Preventive Medicine, 46(2), 179–187.
Britton, C. J., Level, A. V., & Gardner, M. A. (2013). Crowdsourcing: divide the work and share
the success. Library Hi Tech News, 30(4), 1–5. doi:10.1108/LHTN-03-2013-0017
Burger-Helmchen, T., & Pénin, J. (2010). The limits of crowdsourcing inventive activities: What do
transaction cost theory and the evolutionary theories of the firm teach us. In Workshop on Open
Source Innovation, Strasbourg, France (pp. 1-26).
Chen, C., White, L., Kowalewski, T., Aggarwal, R., Lintott, C., Comstock, B., … Lendvay, T. (2014). Crowd-Sourced Assessment of Technical Skills: a novel method to evaluate surgical performance. Journal of Surgical Research, 187(1), 65-71. doi:10.1016/j.jss.2013.09.024
Chiu, C.-M., Liang, T.-P., & Turban, E. (2014). What can crowdsourcing do for decision support?
Decision Support Systems, 65, 40–49. doi:10.1016/j.dss.2014.05.010
Codina, L. (1997). Una propuesta de metodología para el diseño de bases de datos documentales (parte II). El profesional de la información, 6(12), 20-26.
Delgado, M. (2010) Revisión sistemática de estudios: Metaanálisis. Signo, Barcelona
Demartini, G., Difallah, D. E., & Cudré-Mauroux, P. (2013). Large-scale linked data integration using probabilistic reasoning and crowdsourcing. The VLDB Journal, 22(5), 665-687.
Djelassi, S., & Decoopman, I. (2013). Customers’ participation in product development through
crowdsourcing: Issues and implications. Industrial Marketing Management, 42(5), 683-692.
Duyck, J., Finn, C., Hutcheon, A., Vera, P., Salas, J., & Ravela, S. (2014). Sloop: A Pattern Re-
trieval Engine for Individual Animal Identification. Pattern Recognition.
Egger, M., Smith, G. D., & Altman, D. (Eds.). (2008). Systematic reviews in health care: meta-
analysis in context. John Wiley & Sons.
Estellés-Arolas, E., & González-Ladrón-de-Guevara, F. (2012a). Towards an integrated crowdsour-
cing definition. Journal of Information science, 38(2), 189-200.
Estellés-Arolas, E., & González-Ladrón-De-Guevara, F. (2012b). Clasificación de iniciativas de
crowdsourcing basada en tareas. El profesional de la información, 21(3), 283-291.
Feller, J., Finnegan, P., Hayes, J., & O’Reilly, P. (2012). ‘Orchestrating’ sustainable crowdsourcing:
A characterisation of solver brokerages. The Journal of Strategic Information Systems, 21(3),
216-232. doi:10.1016/j.jsis.2012.03.002
Fienen, M. N., & Lowry, C. S. (2012). Social.Water—A crowdsourcing tool for environmental data
acquisition. Computers & Geosciences, 49, 164-169. doi:10.1016/j.cageo.2012.06.015
Filippi, F., Fusco, G., & Nanni, U. (2013). User Empowerment and Advanced Public Transport So-
lutions. Procedia - Social and Behavioral Sciences, 87, 3-17. doi:10.1016/j.sbspro.2013.10.590
Folorunso, O., & Mustapha, O. A. (2014). A fuzzy expert system to Trust-Based Access Control in
crowdsourcing environments. Applied Computing and Informatics.
Garcia Martinez, M., & Walton, B. (2014). The wisdom of crowds: The potential of online commu-
nities as a tool for data analysis. Technovation, 34(4), 203-214.
Garrido, P., & Faria, N. (2012). MODSSO–A Manager-centric Global Decision Support System for
Organizations. Procedia Technology, 5, 616-624. doi:10.1016/j.protcy.2012.09.068
Gatautis, R., & Vitkauskaite, E. (2014). Crowdsourcing Application in Marketing Activities. Proce-
dia - Social and Behavioral Sciences, 110, 1243-1250. doi:10.1016/j.sbspro.2013.12.971
Geerts, S. (2009). Discovering crowdsourcing: theory, classification and directions for use. Unpublished Master of Science in Innovation Management thesis, Eindhoven University of Technology.
Geiger, D., & Schader, M. (2014). Personalized task recommendation in crowdsourcing information systems: Current state of the art. Decision Support Systems, 65, 3-16.
Geiger, D., Rosemann, M., Fielt, E., & Schader, M. (2012). Crowdsourcing Information Systems-
Definition, Typology, and Design. In Proceedings of the International Conference on Informa-
tion Systems (ICIS 2012) Atlanta, Ga.
Geiger, D., Seedorf, S., Schulze, T., Nickerson, R., & Schader, M. (2011) Managing the Crowd:
Towards a Taxonomy of Crowdsourcing Processes. In Proceedings of the Seventeenth Americas
Conference on Information Systems
Ghani, A. T. A., & Zakaria, M. S. (2013). Business-IT Models Drive Businesses Towards Better
Value Delivery and Profits Making. Procedia Technology, 11, 602-607.
Gomes, C., Schneider, D., Moraes, K., & de Souza, J. (2012, October). Crowdsourcing for music:
Survey and taxonomy. In Systems, Man, and Cybernetics (SMC), 2012 IEEE International Con-
ference on (pp. 832-839). IEEE.
Gupta, D. K., & Sharma, V. (2013). Exploring crowdsourcing: a viable solution towards achieving
rapid and qualitative tasks. Library Hi Tech News, 30(2), 14–20. doi:10.1108/LHTN-01-2013-
Hamilton, J. F. (2014). Historical forms of user production. Media, Culture & Society, 36(4), 491–
507. doi:10.1177/0163443714523812
Harvey, D., Kitching, T. D., Noah-Vanhoucke, J., Hamner, B., Salimans, T., & Pires, A. M. (2014).
Observing Dark Worlds: A crowdsourcing experiment for dark matter mapping. Astronomy and
Computing, 5, 35-44. doi:10.1016/j.ascom.2014.04.003
Hildebrand, M., Ahumada, C., & Watson, S. (2013). CrowdOutAIDS: crowdsourcing youth pers-
pectives for action. Reproductive Health Matters, 21(41), 57-68. doi:10.1016/S0968-
Howe, J. (2006). The rise of crowdsourcing. Wired magazine, 14(6), 1-4.
Howe, J. (2008). Crowdsourcing: How the power of the crowd is driving the future of business.
Random House.
Kalantari, M., Rajabifard, A., Olfat, H., & Williamson, I. (2014). Geospatial Metadata 2.0 – An ap-
proach for Volunteered Geographic Information. Computers, Environment and Urban Systems,
48, 35-48. doi:10.1016/j.compenvurbsys.2014.06.005
King, A. J., Gehl, R. W., Grossman, D., & Jensen, J. D. (2013). Skin self-examinations and visual identification of atypical nevi: Comparing individual and crowdsourcing approaches. Cancer Epidemiology, 37(6), 979-984. doi:10.1016/j.canep.2013.09.004
Kleemann, F., Voß, G. G., & Rieder, K. (2008). Un(der)paid innovators: The commercial utilization of consumer work through crowdsourcing. Science, Technology & Innovation Studies, 4(1), 5-26.
Lampe, C., Zube, P., Lee, J., Park, C. H., & Johnston, E. (2014). Crowdsourcing civility: A natural
experiment examining the effects of distributed moderation in online forums. Government In-
formation Quarterly, 31(2), 317–326. doi:10.1016/j.giq.2013.11.005
Lee, S., Park, S., & Park, S. (2014). A Quality Enhancement of Crowdsourcing based on Quality
Evaluation and User-Level Task Assignment Framework. Big Data and Smart Computing
(BIGCOMP), 60 - 65. doi: 10.1109/BIGCOMP.2014.6741408
Linders, D. (2012). From e-government to we-government: Defining a typology for citizen coproduction in the age of social media. Government Information Quarterly, 29(4), 446-454.
Littmann, M., & Suomela, T. (2014). Crowdsourcing, the great meteor storm of 1833, and the foun-
ding of meteor science. Endeavour, 38(2), 130-138. doi:10.1016/j.endeavour.2014.03.002
Marjanovic, S., Fry, C., & Chataway, J. (2012). Crowdsourcing based business models: In search of
evidence for innovation 2.0. Science and Public Policy, 39(3), 318–332.
Monahan, T., & Mokos, J. T. (2013). Crowdsourcing urban surveillance: The development of ho-
meland security markets for environmental sensor networks. Geoforum, 49, 279-288.
Parvanta, C., Roth, Y., & Keller, H. (2013). Crowdsourcing 101: A Few Basics to Make You the Leader of the Pack. Health Promotion Practice, 14(2), 163-167.
Pedersen, J., Kocsis, D., Tripathi, A., Tarrell, A., Weerakoon, A., Tahmasbi, N., … de Vreede, G.-J.
(2013). Conceptual Foundations of Crowdsourcing: A Review of IS Research (pp. 579–588).
IEEE. doi:10.1109/HICSS.2013.143
Perera, I., & A. Perera, P. (2014). Developments and leanings of crowdsourcing industry: implica-
tions of China and India. Industrial and Commercial Training, 46(2), 92–99. doi:10.1108/ICT-
Petitti, D. B. (2000). Meta-analysis, Decision Analysis and Cost-Effectiveness Analysis. New York: Oxford University Press.
Pinto-Molina, M., Alonso-Berrocal, J. L., … Doucer, A. (2007). Análisis cualitativo de la visibilidad de la investigación de las universidades españolas a través de sus páginas web. Revista española de documentación científica, 27(3), 345-370.
Raford, N. (2014). Online foresight platforms: Evidence for their impact on scenario planning & strategic foresight. Technological Forecasting and Social Change.
Rana, R., Chou, C. T., Bulusu, N., Kanhere, S., & Hu, W. (2014). Ear-Phone: A context-aware noi-
se mapping using smart phones. Pervasive and Mobile Computing.
Reichwald, R. & Piller, F. (2006) Interaktive wertschöpfung. Open innovation, individualisierung
und neue formen der arbeitsteilung. Wiesbaden: Gabler Verlag.
Ren, J., Nickerson, J. V., Mason, W., Sakamoto, Y., & Graber, B. (2014). Increasing the crowd’s
capacity to create: how alternative generation affects the diversity, relevance and effectiveness
of generated ads. Decision Support Systems, 65, 28-39. doi:10.1016/j.dss.2014.05.009
Roopa, T., Iyer, A. N., & Rangaswamy, S. (2013). CroTIS-Crowdsourcing Based Traffic Informa-
tion System (pp. 271–277). IEEE. doi:10.1109/BigData.Congress.2013.43
Rovere, A., Raymo, M. E., O’Leary, M. J., & Hearty, P. J. (2012). Crowdsourcing in the Quater-
nary sea level community: insights from the Pliocene. Quaternary Science Reviews, 56, 164-
166. doi:10.1016/j.quascirev.2012.09.014
Satzger, B., Psaier, H., Schall, D., & Dustdar, S. (2013). Auction-based crowdsourcing supporting skill management. Information Systems, 38(4), 547-560.
Schriner, A., & Oerther, D. (2014). No Really, (Crowd) Work is the Silver Bullet. Procedia Engi-
neering, 78, 224-228. doi:10.1016/j.proeng.2014.07.060
Schumaker, R. P. (2013). Machine learning the harness track: Crowdsourcing and varying race his-
tory. Decision Support Systems, 54(3), 1370–1379. doi:10.1016/j.dss.2012.12.013
See, L., Schepaschenko, D., Lesiv, M., McCallum, I., Fritz, S., Comber, A., … Obersteiner, M.
(2014). Building a hybrid land cover map with crowdsourcing and geographically weighted re-
gression. ISPRS Journal of Photogrammetry and Remote Sensing.
Schenk, E. & Guittard, C. (2009) Crowdsourcing: what can be crowdsourced to the Crowd and
Why? Technical Report.
Soleymani, M., & Larson, M. (2013). Crowdsourcing for multimedia research (pp. 1111–1112).
ACM Press. doi:10.1145/2502081.2502234
Sprugnoli, R., Moretti, G., Fuoli, M., Giuliani, D., Bentivogli, L., Pianta, E., ... & Brugnara, F.
(2013, May). Comparing two methods for crowdsourcing speech transcription. In Acoustics,
Speech and Signal Processing (ICASSP), 2013 IEEE International Conference on (pp. 8116-
8120). IEEE. doi:10.1109/ICASSP.2013.6639246
Stanley, C., Winschiers-Theophilus, H., Onwordi, M., & Kapuire, G. K. (2013). Rural communities
crowdsource technology development: a Namibian expedition (pp. 155–158). ACM Press.
Stol, K.-J., & Fitzgerald, B. (2014). Two’s company, three’s a crowd: a case study of crowdsour-
cing software development (pp. 187–198). ACM Press. doi:10.1145/2568225.2568249
Su, A. I., Good, B. M., & van Wijnen, A. J. (2013). Gene Wiki Reviews: Marrying crowdsourcing
with traditional peer review. Gene, 531(2), 125. doi:10.1016/j.gene.2013.08.093
Sutherlin, G. (2013). A voice in the crowd: Broader implications for crowdsourcing translation du-
ring crisis. Journal of Information Science, 39(3), 397–409. doi:10.1177/0165551512471593
Tatarkiewicz, W. (1980). A History of Six Ideas: An Essay in Aesthetics. ISBN 90-247-2233-0.
Tong, Y., Cao, C. C., & Chen, L. (2014). TCS: efficient topic discovery over crowd-oriented servi-
ce data (pp. 861–870). ACM Press. doi:10.1145/2623330.2623647
Wu, B., Zhong, E., Tan, B., Horner, A., & Yang, Q. (2014). Crowdsourced time-sync video tagging
using temporal and personalized topic modeling (pp. 721–730). ACM Press.
Xintong, G., Hongzhi, W., Song, Y., & Hong, G. (2014). Brief survey of crowdsourcing for data
mining. Expert Systems with Applications, 41(17), 7987–7994. doi:10.1016/j.eswa.2014.06.044
Zeinalipour-Yazti, D., Laoudias, C., Costa, C., Vlachos, M., Andreou, M. I., & Gunopulos, D.
(2013). Crowdsourced Trace Similarity with Smartphones. IEEE Transactions on Knowledge
and Data Engineering, 25(6), 1240–1253. doi:10.1109/TKDE.2012.55