How Digital Platforms Organize Immaturity:
A Sociosymbolic Framework of Platform
Power
Martín Harracá
University of Surrey, UK
Itziar Castelló
City University of London, UK
Annabelle Gawer
University of Surrey, UK
The power of the digital platforms and the increasing scope of their control over
individuals and institutions have begun to generate societal concern. However, the
ways in which digital platforms exercise power and organize immaturity, defined
as the erosion of the individual's capacity for public use of reason, have not yet
been theorized sufficiently. Drawing on Bourdieu's concepts of field, capitals, and
habitus, we take a sociosymbolic perspective on platforms' power dynamics,
characterizing the digital habitus and identifying specific forms of platform power
and counterpower accumulation. We make two main contributions. First, we
expand the concept of organized immaturity by adopting a sociological perspective,
from which we develop a novel sociosymbolic view of platforms' power dynamics.
Our framework explains fundamental aspects of immaturity, such as self-infliction
and emergence. Second, we contribute to the platform literature by developing a
three-phase model of platform power dynamics over time.
Key Words: organized immaturity, autonomy erosion, digital platforms, power,
surveillance
Organized immaturity, defined as the erosion of the individual's capacity for the
public use of reason (Scherer and Neesham 2020), differs from other forms of
control in that it is a self-inflicted and "emergent (as opposed to orchestrated) collective
phenomenon in which autonomy-eroding mechanisms mutually reinforce each
other" (Scherer and Neesham 2020, 9).
The phenomenon of autonomy erosion and increasing user control has been
discussed in the context of the dark side of digitalization (Flyverbom, Deibert,
and Matten 2019; Trittin-Ulbrich et al. 2021). Scholars have looked at how the
Business Ethics Quarterly (2023), pp. 1–33. DOI: 10.1017/beq.2022.40
Published by Cambridge University Press on behalf of the Society for Business Ethics.
© The Author(s), 2023.
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
automation of interactions through algorithms can lead to an emergent manipulation
of choice and autonomy erosion (Alaimo and Kallinikos 2017; Beer 2017; Just and
Latzer 2017; Orlikowski and Scott 2015), but there is still little exploration of the
organizing role of platforms in this process.
Digital platforms have been described as organizational forms that orchestrate
activities between independent users through the use of digital interfaces (Gawer
2014, 2021; Constantinides et al. 2018; Cusumano et al. 2019; McIntyre et al. 2021).
Increasingly, scholars denounce the negative effects of power accumulation by
digital platforms and platform owners. For example, studies of the structural con-
stitution of markets criticize gatekeeping positions that impose discriminatory
clauses or limit content access and creation, with consequences for users' choices
(Crémer et al. 2019; Jacobides 2021; Khan 2018). Other researchers, such as Kelkar
(2018), Stark and Pais (2020), and Flyverbom et al. (2019), discuss sociomaterial
perspectives on platforms and show how platform owners design the interfaces,
prescribing what is accessible to users and what choices they may enjoy in the digital
platform; this, again, restricts choice and creates negative psychological effects on
users (Seymour 2019; Wu et al. 2019). Lanier (2018) and Zuboff (2019) present
systems of surveillance promoted by the power of digital platforms that explain how
the datafication of human experience leads to increasing forms of domination.
These studies provide valuable explanations of how the increasing power of
platforms hinders freedom of choice and individual autonomy. However, their
explanations are partial, focusing either on the market mechanisms that limit con-
sumer choice or on the specific role of digital objects, such as algorithms, that
constrain the platform users' autonomy. The fundamental aspects of the organizing
of immaturity, such as the tension between organizing and emergence, and the
relationship between self-infliction and the power accumulation strategies of key
agents, such as platform owners, nevertheless remain unexplored. These tensions are
essential to explaining how organized immaturity is created and reproduced. We claim
that there is a need to explain the power accumulation of the different agents of the
platforms and its relation to the mechanisms that lead to the delegation of autonomous
decision-making. Therefore, in this article, we ask, How do digital platforms organize
immaturity?
To tackle this issue, we build a sociosymbolic perspective of power accumulation
in digital platforms inspired by Bourdieu's writings (Bourdieu and Wacquant 2007;
Bourdieu 1977, 1979, 1984, 1987, 1989, 1990, 1991, 2005, 2011, 2014). A socio-
symbolic perspective supports building a dynamic conceptualization of power
accumulation based on agents' practices, positions, and strategies. The concepts
of field evolution and habitus allow further explanation of the emergence of imma-
turity and the mechanisms of self-infliction. By situating the concepts of fields,
capitals, and habitus in the context of digital platforms, we describe digital platforms
as organizations mediated by a digital infrastructure and a digital habitus in which
agents accumulate capitals by operating in a field. We explain the role of the digital
habitus in organizing immaturity, complementing prior literature on materiality and
affordances. We propose a framework of power accumulation in which the dynam-
ics of platform owner power accumulation and counterpower accumulation coexist.
2B E Q
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
The platform owner accumulates power in five forms: constitutional, juridical,
discursive, distinction, and crowd. There are two forms of counterpower: crowd
and hacking. We also explain the evolution over time of the power dynamics and
propose a three-phase model in which the forms of power operate. These phases are
platform formation, platform domination within the original field, and platform
cross-field expansion.
This framework makes two significant contributions. First, we build a theoretical
apparatus that explains the organizing dynamics of immaturity by explaining the
relations between the structure, the digital objects, and the platform owner's power
accumulation strategies. From these, we can explain the tension of emergence and
self-infliction. With this framework, we draw on sociological perspectives to expand
the understanding of organized immaturity in digital spaces by focusing on describ-
ing the practices that constitute the webs of relations that configure the digital habitus
and the processes of power accumulation. Second, we contribute to the platform
literature by developing a three-phase model of platform power dynamics over time.
This model expands current views on platform power, providing a more holistic
scheme in which power is both accumulated and contested and highlighting how
agents other than the platform owner play a role in producing and exercising forms of
power. This article concludes by providing policy recommendations on how to
understand and tackle organized immaturity and highlighting potential avenues
for further research.
ORGANIZED IMMATURITY
Organized immaturity has been defined as a collective, albeit not necessarily orches-
trated, phenomenon where independent reasoning is delegated to another's guidance
(Scherer and Neesham 2020). It is inspired by the Kantian principle that humans
should have intellectual maturity involving autonomy of judgment, choice, and
decision-making without the guidance of an external authority. It also relates to
the ability to use experience to reason and reflect critically and ethically on complex
or problematic situations and to challenge norms and institutions (Scherer and
Neesham 2020). The concept of organized immaturity differs from other forms of
control in two ways. First, it is a "self-inflicted" (Kant, as cited in Scherer and
Neesham 2020, 8) process, referring to harm done by humans to themselves, often in
a nonconscious manner. From this perspective, immaturity is therefore a condition
of the human being that arises when an individual defers or delegates their
own autonomous reasoning to external authorities (Dewey 1939). The second
way in which organized immaturity differs from other forms of control is that it is
an "emergent (as opposed to orchestrated) collective phenomenon in which
autonomy-eroding mechanisms mutually reinforce each other" (Scherer and
Neesham 2020, 9).
According to Scherer and Neesham (2020), the study of immaturity relates also to
its organizing elements. The perpetuation of modern forms of immaturity has been
associated with organizations and institutions that create the conditions for self-
inflicted immaturity. Organized forms of immaturity have been addressed in the
3H D P O I
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
critical analysis of bureaucratic organizations, where the individual is subject to
various forms of domination and control (Clegg 1989; Hilferding 2005).
The Fourth Industrial Revolution (Schwab 2017; Philbeck and Davis 2018) has
ushered in a consolidation of the globalized information and communication tech-
nologies that are driving the organization of economic life. However, the infrastruc-
tures and mechanisms behind these sociotechnological systems curb individual
liberties and impact people's autonomy (O'Connor and Weatherall 2019; McCoy,
Rahman, and Somer 2018).
The term organized immaturity is not explicitly used in most of the literature
studying forms of control related to digitalization (with the exception of Scherer and
Neesham [2020] and Scherer et al. [2020]), but scholars are increasingly analyzing
the "dark side of digitalization" (Flyverbom et al. 2019; Trittin-Ulbrich et al. 2021).
In particular, attention has been directed to the use of big data and systems based on
artificial intelligence and to how the automation of interactions through algorithms
can lead to an emergent manipulation of choice. Even the basic algorithmic function
of search and match creates power asymmetries, since the inspection or control of its
guiding principles presents technical challenges for both users and regulators (Beer
2017; Just and Latzer 2017). Biases might be found in the criteria for how results
are limited, displayed, and sorted (Faraj, Pachidi, and Sayegh 2018) and may even
amplify properties of the data used as input, as has been observed in the context of
racial biases (Noble 2018). Researchers are increasingly pointing at the importance
of unpacking the consequences of algorithms in conjunction with a socially struc-
tured analysis of the device (e.g., Beer 2017; Introna 2016; Orlikowski and Scott
2015). Through this, they show how the "black box" of algorithmic culture
(Orlikowski and Scott 2015; Pasquale 2015; Striphas 2010) creates a world of
secrecy that eschews questioning and abrogates responsibility (Introna 2016), erod-
ing autonomous decision-making.
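The bias-amplification mechanism described above can be made concrete with a toy model (all data and the ranking rule here are invented for illustration; real platform algorithms are proprietary and far more complex): a ranker that sorts purely by past engagement keeps over-exposing the already-popular item, so an initial skew in the input data compounds over time.

```python
# Toy illustration of feedback amplification in engagement-based ranking.
# The data and the ranking rule are hypothetical, not any platform's
# actual algorithm.

items = {"A": 100, "B": 90}  # initial engagement counts (slightly skewed)

def rank(engagement: dict) -> list:
    # Sort items by engagement, most engaged first.
    return sorted(engagement, key=engagement.get, reverse=True)

# Each round, the top-ranked item captures most of the new exposure,
# so its engagement lead grows.
for _ in range(10):
    order = rank(items)
    items[order[0]] += 10   # top slot captures most clicks
    items[order[1]] += 1    # lower slot captures few

print(items)  # → {'A': 200, 'B': 100}: the 10-point gap has widened to 100
```

The point of the sketch is that no agent orchestrates the widening gap; it emerges from the sorting criterion itself, which is the sense in which such biases "amplify properties of the data used as input."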
However, this emphasis on the artificial intelligence tools, algorithms, and coding
processes that hinder autonomy in decision-making must be complemented by
research into the organizing structures of immaturity, that is, the key organizing
agents. Studying digital platforms can improve understanding about how organized
immaturity happens, as these platforms organize social interactions and transform
the power relations of the different agents who participate in the digital exchanges.
PLATFORMS AND THE ACCUMULATION OF POWER
Platforms as Organizing Agents
In the platform literature, digital platforms have been described as new organiza-
tional forms that orchestrate activities between independent users through the use of
digital interfaces (Gawer 2014; Kretschmer et al. 2022; McIntyre et al. 2021).
Platforms can be considered a particular kind of "technology of organizing"
(Gulati, Puranam, and Tushman 2012, 573) or "hybrid structures between organi-
zations and markets" (Kretschmer et al. 2022, 4), as they use a mixture of market and
hierarchical incentives to coordinate autonomous agents. Platform organizations
are distinct from hierarchies, markets, and networks (Gawer 2014) because, as
4B E Q
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
Kornberger et al. (2017, 81) argued, platform organizations "question not only
extant organization designs but also, quite fundamentally, the [Coasian] idea of
the firm and of value creation processes."
Two fundamental characteristics define the digital platform as an organizing
agent: how its digital architecture is structured and how it coordinates interactions.
From an organizational perspective, platforms can be described by the common set
of design rules that define their technological architecture. This system is charac-
terized by a "core" or center component with low variety and a complementary set of
"peripheral" components with high variety (Tiwana, Konsynski, and Bush 2010).
The rules governing interactions among the parts are the interfaces (Baldwin and
Woodard 2009). Interfaces help reduce a system's complexity by greatly
simplifying the scope of information required to develop each component (Gawer
2014). Together, the center, the periphery, and the interfaces define a platform's
architecture (Baldwin and Woodard 2009). The center-periphery structure therefore
defines an asymmetric framework in which the participants collaborate and compete
(Adner and Kapoor 2010), under conditions set by the platform owners on two
elements: openness and governance rules (Gawer and Henderson 2007; Boudreau
2010).
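The center-periphery architecture and its governance rules can be rendered as a minimal structural sketch (a schematic reading of the design-rules literature cited above; all class and method names are invented for illustration and imply no real platform's API):

```python
# Minimal sketch of a platform's center-periphery architecture
# (after Baldwin and Woodard 2009; Tiwana et al. 2010). Names are
# hypothetical; no real platform is modeled.

class Interface:
    """Design rules governing how peripheral components reach the core."""
    def __init__(self, allowed_operations: set):
        self.allowed_operations = allowed_operations  # set by the owner

class Core:
    """Stable, low-variety component controlled by the platform owner."""
    def __init__(self, interface: Interface):
        self.interface = interface

    def handle(self, operation: str) -> str:
        # Governance in action: the owner decides which operations
        # complementors may perform (openness rules).
        if operation not in self.interface.allowed_operations:
            return "rejected"
        return "executed"

class Complement:
    """High-variety peripheral component built by an independent agent."""
    def __init__(self, name: str, core: Core):
        self.name, self.core = name, core

    def request(self, operation: str) -> str:
        return self.core.handle(operation)

core = Core(Interface(allowed_operations={"read", "post"}))
app = Complement("third-party app", core)
print(app.request("post"))    # → executed
print(app.request("scrape"))  # → rejected: outside the interface's rules
```

The asymmetry discussed in the text shows up in who writes which class: complementors can vary freely, but only the owner edits the interface's allowed operations.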
Platforms coordinate transactions by creating "multisided markets," in which
their owners act as intermediaries to bring together (match) and facilitate exchanges
between different groups of users by aligning market incentives (Rochet and Tirole
2003). Interactions occur in a networked structure, implying that the value derived
from platform usage increases exponentially with each additional user (Katz and
Shapiro 1985). As the value for participants grows with the size of the platform, it is
optimal for them to converge on the same platform, leading to the prediction that
platforms will tend to create concentrated markets organized by increasingly pow-
erful owners (Caillaud and Jullien 2003; Evans 2003).
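The tipping prediction above can be illustrated numerically (a deliberately simplified sketch: the value function and all figures are hypothetical, standing in for the network-effects logic of Katz and Shapiro 1985):

```python
# Illustrative sketch of market "tipping" under network effects.
# Assumptions (hypothetical, not from the article): a platform's value
# to a new user grows with its installed base, and every entrant simply
# joins the platform offering higher value.

def network_value(n: int) -> int:
    # Value of joining a platform with n existing users: here, simply
    # the number of possible pairwise connections.
    return n * (n - 1) // 2

def simulate(initial_a: int, initial_b: int, entrants: int):
    a, b = initial_a, initial_b
    for _ in range(entrants):
        if network_value(a) >= network_value(b):
            a += 1
        else:
            b += 1
    return a, b

# A tiny initial lead is enough for one platform to absorb all entrants.
a, b = simulate(initial_a=11, initial_b=10, entrants=100)
print(a, b)  # → 111 10
```

Under these stylized assumptions the slightly larger platform captures every new user, which is the concentration dynamic the cited literature predicts.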
Platforms' Accumulation of Power and the Consequences for Individuals'
Autonomy Erosion
The characteristics of platforms described in the preceding section have facilitated
the accumulation of power by platform owners, leading to new forms of "domination
and competition" (Fuchs 2007, 7) that are increasingly eroding people's capacity to
make independent decisions. The consequences of the platforms' power accumu-
lation for manipulation of choice and autonomy delegation have been analyzed from
two perspectives: first, in relation to the structural constitution of markets and how
this structure can lead to manipulation of users' choices, and second, from a socio-
material perspective that looks at the interaction of digital objects (e.g., algorithms)
and the platform users.
From the perspective of the structural constitution of markets, the accumulation of
power and manipulation of choice is associated with the growing centrality of large
platforms in the economy. Consumers and business partners can have their choices
manipulated because of the specific intermediary role that platforms play. Once the
market has been tipped, this role provides the platform owner with a position from
which they can charge supramonopoly prices and define the rules of the market,
5H D P O I
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
including who can access it and how the transactions occur (Busch et al. 2021; Khan
2018; Jacobides 2021). In this way, platforms are increasingly operating as gate-
keepers, imposing discriminatory clauses or limiting content access and creation
(Stigler Committee on Digital Platforms [SCDP] 2019; Furman 2019). Choice
making can also be limited due to market concentration driven by platforms, in that
a platform enhances its owner's opportunities to leverage its assets (Khan 2018).
Thus the owner can entrench their (platform's) position in a market and enter an
adjacent one by creating economies of scale and scope (Khan 2018; Jacobides
2021). This brings the possibility of creating a dominant position in apparently
unrelated markets through practices like vertical integrations, killer buys, predatory
pricing, and self-preferencing (Crémer et al. 2019; Furman 2019). In addition, the
capture and control of transactional data may be used to improve platform services,
while also enabling the creation of entry barriers that fend off competition (Khan
2018).
Market-based analyses provide a view of power accumulation based on asset
control and market position. However, they have been criticized for overlooking the
impact of other noneconomic dimensions and for portraying power as relatively
unidirectional (Margetts et al. 2021; Lynskey 2017, 2019). Such critiques recognize
that the deep social impact of platform power cannot be tackled from a market
perspective alone (Margetts et al. 2021; Lianos and Carballa-Smichowski 2022).
Sociomaterial perspectives place affordances and materiality of the digital objects
at the center of the platform interactions (Fayard and Weeks 2014; Kornberger 2017;
Curchod et al. 2019). In this perspective, digital objects, such as code, interfaces, and
algorithms, are described as central objects that can hinder autonomy. For example,
when platform owners design the interfaces, they define the category of user,
prescribing what is accessible to users and what choices they enjoy in the digital
platform (Kelkar 2018). Encoding, which comprises the rules for how offline objects
and actions are translated into a digital language (Alaimo and Kallinikos 2017), is
also defined by platform owners. Once codified, actions must be performed in
accordance with the rules established by the platform. Thus the affordances of
technology shape and mold the interactions of the users with the platforms
(Alaimo and Kallinikos 2017). Furthermore, algorithms and codes have been
denounced for their opacity (Etter and Albu 2021). The inspection and control of
a platforms guiding principles present technical challenges for both users and
regulators (Beer 2017), which enables manipulation. For example, Seymour
(2019) and Wu et al. (2019) describe how the manipulation design techniques
employed by platform firms like Facebook and Twitter are worrying not only
because they affect an individuals freedom of choice but also because they can
cause users to experience harmful psychological effects, such as addiction.
Yet the aforementioned studies of affordances and materiality offer a limited
understanding of how the emergence and self-infliction of organized immaturity are
patterned by the strategic choices of platform owners and other agents. To further
understand the organized immaturity of digital platforms, it is important to look at
how practices are shaped and organized by the relations between the technological
6B E Q
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
objects, the different users' strategies, and the structural elements that constitute the
power accumulation of the platform.
Some scholars have begun to offer holistic models that explain the accumulation
of power by platform firms and its consequences for the autonomy erosion of different
agents. Lanier (2018) and Zuboff (2019) describe digital platforms' datafication of
human experience, which leads to increasing forms of domination in what they term
"surveillance capitalism." Surveillance is enabled by the asymmetric positions of
platform owners and users, defined by technological architecture, and executed
through monetization strategies based on user data. Zuboff (2019) argues that
despite the explicit narrative of platforms as both positive and objectively inevitable,
their strategies and business models, based on voluntary data sharing, are funda-
mentally connected to the extraction of economic rents. Surveillance reduces human
experience to free raw material for translation into behavioral data and prediction
products (Zuboff 2019), eroding individual autonomy and disrupting intellectual
privacy (Richards 2012). Surveillance has become a naturalized practice that we all,
willingly or not, perform (Lyon 2018). Surveillance theories therefore contrib-
ute to this debate by offering an understanding of the instrumental connection
between the business model and technological objects that constitute the platform
and the self-infliction aspects of immaturity processes.
Yet, we argue that further work is needed to understand not only the expansion of
immaturity through a system of economic surveillance but also how the everyday
practices of leading and participating in the platform relate to immaturity emergence.
Moreover, we argue that these views should be enriched with a theory of how agency
is constituted and transformed by platform power dynamics, how these dynamics
have an organizing role in producing and reproducing the delegation of autonomous
decision-making, and how the emergence of immaturity and the strategic power
accumulation by platform owners are connected.
A SOCIOSYMBOLIC PERSPECTIVE OF DIGITAL PLATFORMS
To further explain how platforms organize immaturity, we draw on Bourdieu's
sociosymbolic theory and the concepts of field, capitals, and habitus. A sociosym-
bolic perspective situates the agents in a field and explores the power accumulation
dynamics of each agent. It takes materiality into consideration, but, through
the concept of habitus, it is able to explain how interactions are also mediated by
previous history and the networks of relations in a way that complements the notion
of affordances and its connotations for the perception of physical artifacts and
technology (Fayard and Weeks 2014). Furthermore, a sociosymbolic approach
allows us to build an integrative conceptualization of power accumulation and its
dynamics based on agents' practices, positions, and strategies. It shows how multiple
types of powers can coexist and accounts for how the relative positions of agents
shape their motivations and actions, explaining the practices of immaturity and its
relation to self-infliction. We explain this further, first by providing an overview
of how a sociosymbolic perspective generally explains power and its dynamics
through the concepts of field, capital, and habitus; we thus show how digital
7H D P O I
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
platforms can be understood through these lenses. Second, we describe the dynamics
that lead to specific forms of power accumulation and explain how they can evolve
over time.
Fields, Capitals, and Habitus in Digital Platforms
Bourdieu's sociosymbolic theory was developed to explain social stratification and
dynamics in (offline) societies by focusing on how agents (people, groups, or
institutions) produce, reproduce, and transform social structures through practice
(i.e., what they do in everyday life). Through practice, agents produce particular
social spaces with specific boundaries demarcated by shared interests and power
relations; these social spaces are termed fields of practice (Bourdieu and Wacquant
2007).
Fields
A field (champ) is a key spatial metaphor in Bourdieu's work. It represents "a
network, or a configuration, of objective relations between positions" (Bourdieu
and Wacquant 2007, 97). These positions are objectively defined to field occu-
pants, agents or institutions, by their present and potential position (situs) in "the
structure of the distribution of species of power (or capital)" (Bourdieu and Wac-
quant 2007, 97). Individuals, groups, or organizations can be agents in a given field,
and one individual may have different agencies (or roles), depending on their
situation in the field.
The concept of "field" can be related to digital platforms in the sense that the
organization and production of practices situates the platform in relation to an
existing field. This may be the field of cultural production (e.g., Facebook) or the
field of goods exchange (e.g., Amazon). The fields have specific logics and struc-
tures that define them. Different agents can have multiple roles; for example, an
Instagram user may be both a contributor and a consumer of content. The relational
aspects of the fields are also very compatible with network-based perspectives
(Portes 1998) because the field in which the platform is embedded functions on
the basis of relations created during the practice of exchanges that constitute the
field. The technological infrastructure creates a center-periphery structure, which
provides the foundation on which the practices occur, both enabling and regulating
them. This approach to platforms highlights the practice of the agent and its position
but also simultaneously shows how the platform's constitutive elements are deeply
interconnected. Taking Twitter as an example, the extent to which a specific content
generated by a user is reproduced depends on the user's social position in the
network but also on the priorities defined by the platform's algorithms, which create
the structure in which the content is shared.
Multiple nested and overlapping fields can be found on any platform, just as they
are in any (offline) social context. For example, YouTube constitutes a huge field of
people broadly interested in sharing and viewing online video content. However,
YouTube also hosts a variety of other, more focused subfields, for instance, a field
centered on cryptocurrency videos. At the same time, platforms do not necessarily
constitute a field in its entirety, for while some online fields exist mostly in a single
8B E Q
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
platform, like the field of video content sharing on YouTube, competing platforms
have entered some subfields, such as gaming videos on Twitch. Meanwhile,
other online fields are embedded in larger fields of practice. For example, job seekers
may look at job opportunities on LinkedIn while engaging offline with the job-
offering companies.
Yet the creation of a digital platform can also be conceptualized as an attempt to
"enclose" part of a field: an agent (the platform creator) designs a value creation
model for users (the specific practices to be performed by them within the field) and
develops the digital infrastructure that makes interactions possible. Digital platforms
enclose the field because they attempt to create "exclusive control rights" (Boyle
2003) over dimensions of practices that were previously in the public domain.
Consider Google's Street View, launched in 2007, which permits users to view
the fronts of buildings from a pedestrian's viewpoint. The service utilizes photo-
graphs taken by Google of objects that are not covered by intellectual property
rights, even though the photographs were taken without the authorization or agreement
of the communities, and their use is monetized (Zuboff 2019). In this case, Google
Street View becomes not only a new service for users but also a new way of
exploiting value through dispossession of public goods and private data (Zuboff
2019).
A field enclosure by a platform also includes encoding social interactions defined
by more or less variable practices (e.g., hailing a taxi on the street) into a precisely
defined process in a controlled space (using a ride-hailing app). This appropriation is
produced through the codification of social interactions, control over the digital
space, and the data generated by these interactions. Moreover, by enclosing a field,
digital platforms modify both the practices and the agents' relative positions. For
example, drivers and passengers are inscribed into a database owned by the platform
owner and are organized into groups from which they are picked and matched.
Furthermore, the creation of the platform can transform the scope of the field.
Digitalized practices often involve connecting with deeply intimate aspects of users'
lives (Lupton 2016), such as private data exemplified in photos, comments, or
information about consumption habits. While typically regarded as private, the
encoding of these portions of experience puts them into the potential reach of a field
and exposes them to its specific field logic. Furthermore, because of the new ways of
performing certain practices, platforms collide with the established scopes of the
field, changing the agents and institutions involved in it. This is the so-called
disruptive nature (SCDP 2019) of the platform. Examples can be found in conflicts
around regulatory frameworks triggered by the introduction of platforms to some
industries, such as Uber's entry into the field of transportation and Airbnb's into
hospitality.
Capitals
Fields are dynamic spaces defined by the relations of power between players that
constitute the structure of the field (Bourdieu and Wacquant 2007). These relations
result from the possession and activation of resources that are both materially and
9H D P O I
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
symbolically produced and perceived (Bourdieu 1989, 16). These resources are the
capitals.
The accumulation of capitals gives access to "the specific profits that are at stake in
the field" and defines agents' "objective relation to other positions (domination,
subordination, homology, etc.)" (Bourdieu and Wacquant 2007, 97). In each of
the specific fields, the spaces of objective relations are the sites of a logic specific to
those who regulate the fields. This logic does not need to follow purely economic
rationalities to be described (Sandberg and Alvesson 2011). For example, TikTok
users who copy their nearest higher-status digital neighbors in a particular contest or
"dance" might not be guided by economic rationality, but they do follow the logic of
the platform.
Capitals are therefore the resourcesscarce and socially valued stocks of inter-
nalized abilities and externalized resourcesthat each agent has. Bourdieu defines
three fundamental forms of capital through which power is accumulated: economic
capital (money and other assets), cultural capital (knowledge and familiarity with
accepted norms), and social capital (reflected in the actors creation of connections
and social networks) (Bourdieu 2011). To these, Bourdieu adds symbolic capital,
which is "the form that one or another of these species takes when it is grasped
through categories of perception that recognize its specific logic, [that]
misrecognize the arbitrariness of its possession and accumulation" (Bourdieu and
Wacquant 2007, 118), that is, the reflection in the relations of the field of accumulated
prestige, consecration, or honor (Bourdieu 1993). For Bourdieu, power struggles are
mainly symbolic, and agents who are willing to increase their power will ultimately
exercise the symbolic capital that will help them to be perceived and recognized as
"legitimate" (Bourdieu 1989, 17) in what Bourdieu (1984) also calls distinction.
Social dynamics in fields are centered on the generation of distinction(s) by
agents, who "constantly work to differentiate themselves from their closest rivals"
(Bourdieu and Wacquant 2007, 100), although the actors' participations in these
games are typically no more than "unconscious or semi-conscious strategies"
(Bourdieu 1969, 118). Distinction operates through the accumulation of capital that
matters to the field. Thus fields are spaces of conflict and competition in which the
hierarchy is continually contested. However, agents can attempt to convert one form
of capital into another or transfer it to a different space, depending on the specific
logic of the field (Levina and Arriaga 2014).
The concept of distinction can be assimilated to the concept of "status" as it is used
to explain the means of interaction on digital platforms (Levina and Arriaga 2014).
For example, on digital platforms like YouTube, a user's social network position and
cultural skills (e.g., their offline knowledge about a particular topic) combine with
their taste and the time and money they invest into the field. Together, these shape
which content gets noticed and which is ignored (Levina and Arriaga 2014) and
therefore which agents become "influencers," or agents with high status in the
network.
10 B E Q
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
Habitus
Besides the description of how agents, through their collective actions, shape
emergent field structures and the understanding of which capital matters and how,
Bourdieu also looks at how structure shapes agency. Bourdieu uses the notion of
habitus to describe the socially learned schemata of perception and inclinations to
action (Bourdieu and Wacquant 2007). Habitus is the internalization of the logic of
the field. It is a set of historical relations incorporated within individual bodies in the
form of mental and corporeal schemata (Ignatow and Robinson 2017). These
relations, or the "system of schemes of perception and appreciation of practices,
cognitive and evaluative structures," are acquired "through the lasting experience of
a social position" (Bourdieu 1989, 19); that is, they are acquired through interaction
with other social agents. The habitus includes related comportment (posture and
gait), aesthetic likes and dislikes, habitual linguistic practices, and ways of evalu-
ating oneself and others via categories. It forges not only actions but also desires and
aspirations (Ignatow and Robinson 2017). While cognitively embedded, it is also
embodied in gestures, postures, movements, and accents (Ignatow and Robinson
2017). Its reproduction depends mainly on institutions like family and school.
Mastery of the habitus tends to guarantee distinction and constancy of practice over
time (Bourdieu 1990).
Crucially, the constitution of the habitus is recursive: while agents can reshape
social distance and the ways it may be perceived, their own perception is likewise
framed by their own position in the social structure. This recursive cycle is the
process of constitution of the sociosymbolic space, where changes in position can be
understood as the outcome of symbolic struggle. Habitus is therefore a way of
conceptualizing how social structures influence practice without reifying those
structures (Costa 2006).
In his studies of class, taste, and lifestyles, Bourdieu (1984) illustrates how habitus
shapes taste in ways that make a virtue out of necessity. For example, working-class
people develop a taste for sensible, plain food, furnishings, and clothes, and they
shun fancy extravagances (Bourdieu 1984). Hence habitus leads to the "choice of the
necessary," and in so doing, it tends to generate practices that ultimately reproduce
the original objective conditions, through which it functions as structure (Costa
2006). Thus, given a set of conditions, habitus "affords an actor some thoughts and
behaviors and not others, making those thoughts and behaviors seem more
appropriate, attractive, and authentic than others" (Fayard and Weeks 2014, 245).
Ultimately, however, it is the actor who decides what to do. Often the decision occupies
no conscious thought, but, as Bourdieu (1990, 53) argues, "it is never ruled out that
the responses of the habitus may be accompanied by strategic calculation tending to
perform in a conscious mode."
The concept of digital habitus has been used in the analysis of digital spaces (e.g.,
Levina and Arriaga 2014; Julien 2015; Ignatow and Robinson 2017; Romele and
Rodighiero 2020) to explain the ways of acting, namely, the social and technolog-
ically ingrained habits, skills, and dispositions that define the practices in the digital
field. Ignatow and Robinson (2017) argue that digital machines are not only the
11H D P O I
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
crystallized parts of habitus but also habitus producers and reproducers. This is
because practices performed in digital platforms have technological and symbolic
mediations: they are digitized (coded), and they are performed through a constant
interaction with algorithms and the data that feed the learning of the algorithms. For
algorithms to constitute the habitus, they need the platform to be able to extract
increasingly large amounts of data and transform them into capital. In this context,
the data work as the culture that informs the knowledge about the social space. The
norms of the platform are constantly shaped by the interaction between the data, the
algorithm, and the agents. The capital created by this interaction can be appropriated
by certain agents who know how to use these results to their advantage.
The mechanism of the digital habitus has two consequences. First, as socialization is
increasingly done through digital platforms, the algorithmic logic becomes a norm
that everyone needs to learn to play by or with (Beer 2017), and thus it becomes part
of the habitus. It becomes the representation of the current taste of a social class or
group so that their decisions resemble each other. However, unlike the offline
habitus, it derives from code as well as from action; thus it is somehow defined
behind closed doors by the platform owners. Second, as Ignatow and Robinson
(2017) argued, the digital habitus becomes a (re)generator of the social group
because it is mediated by the property of the algorithmic practice that relates to
aggregation for prediction. The singularities of social agents are reduced to
aggregates of decisions, actions, desires, and tastes. This phenomenon has been called
"personalization without personality" (Ignatow and Robinson 2017, 100), personality
being the principle that gives unique style to each human process of
individualization.
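This aggregation-for-prediction logic can be illustrated with a stylized sketch (the users, items, and scoring rule are hypothetical, not drawn from any actual platform's code): a minimal neighbor-based recommender in which a user's "personalized" feed is derived entirely from the aggregated choices of the most similar users.

```python
# Stylized illustration of "personalization without personality":
# recommendations come not from any unique personal style but from the
# aggregated choices of the nearest neighbors. All data are hypothetical.

from collections import Counter

# Each user is reduced to a set of consumed items (decisions, tastes).
interactions = {
    "ana":   {"video1", "video2", "video3"},
    "ben":   {"video1", "video2", "video4"},
    "carla": {"video2", "video3", "video5"},
    "dan":   {"video6"},
}

def jaccard(a, b):
    """Similarity between two users = overlap of their aggregated choices."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(user, k=2):
    """Predict for `user` whatever the k most similar users already chose."""
    history = interactions[user]
    neighbors = sorted(
        (u for u in interactions if u != user),
        key=lambda u: jaccard(history, interactions[u]),
        reverse=True,
    )[:k]
    votes = Counter(
        item for u in neighbors for item in interactions[u] - history
    )
    return [item for item, _ in votes.most_common()]

print(recommend("ana"))  # → ['video4', 'video5']
```

With these hypothetical users, ana's recommendations are simply what her two most similar neighbors consumed and she did not; the output reflects the aggregate of the group, not anything individual about ana.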
Having set out the theoretical apparatus to explain how digital platforms can be
understood from a sociosymbolic perspective, we now turn to defining how digital
platforms accumulate power and how power accumulation increases the problem of
organized immaturity.
A Sociosymbolic Perspective of Power Accumulation and Its Consequences for
Organized Immaturity
Building on Bourdieu's later writings on the State and its forms of power (Bourdieu
1989, 2014) and in light of the latest developments of digital platforms and their
accumulation of power, we direct our analytic attention to the platform owner and its
relations with the other platform agents and sociodigital objects. Thus, we go beyond
the extant analysis of distinction in digital platforms done by scholars of digital
sociology (e.g., Julien 2015; Ignatow and Robinson 2017) which focuses on users, to
capture the mechanisms of field transformation led by platform owners in their
relationship with the other platform agents. We follow Bourdieu (2014) in terming
these mechanisms "forms of power" and showing how these contribute to explaining
organized immaturity.
Drawing on Bourdieu's writings (Bourdieu 1984, 1989, 1991), we define the
forms of power, distinguishing between two general dynamics. We first define five
forms of power (constitutional, juridical, discursive, distinction, and crowd) that
drive the accumulation of power within the platform. Second, inspired by recent
12 B E Q
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
literature on platforms (Ziccardi 2012; Eaton et al. 2015; Krona 2015; Bucher et al.
2021), we show how counterpower can also be performed by end users and other
peripheral agents through crowd and hacking power. Crowd and hacking power are
not concepts derived directly from Bourdieu's theory, but they provide a more
comprehensive view of power accumulation dynamics.
We then articulate the platform power dynamics through three phases of platform
evolution, which are derived from an interpretation of platform innovation research
(Cutolo and Kenney 2020; Kolagar, Parida, and Sjödin 2022; Rodon, Modol, and
Eaton 2021; Teece 2017): formation, where the platform is launched and starts to be
used by agents; domination, where the platform has been widely adopted and
operates under a relatively stable design within the original field; and cross-field
expansion, where the platform expands to other fields, leveraging its accumulation
of power. Although we describe for each stage the dominant forms of power and
counterpower accumulation that enable the transformation of the field, we acknowl-
edge that several forms of power coexist in these phases, that the evolution of
platforms is often nonlinear, and that not all platforms will become dominant.
Forms of Platform Power
Constitutional Power
Constitutional power is the ability to transform the objective principles of union and
separation, "the power to conserve or to transform current classifications"
(Bourdieu 1989, 23). Within the platform, this power comprises both the architec-
tural design (platform layers and modularity, design of user interfaces and experi-
ences) and the capacity to define the rules, norms, categories, and languages that
make up the digital interactions. Constitutional power shapes the digital medium for
interactions and defines what may and may not be accessed by each type of agent
within the platform.
Constitutional power is exercised mainly by the platform owner. As the provider
of the digital infrastructure upon which other agents collaborate, the owner defines
the symbolic space through code. Code symbolically creates the objects that con-
stitute the relations, being a neat, unified, and unambiguous language with no
openings for interpretation (Lessig 2009). In the digital realm, the actor who man-
ages the code can increase its symbolic imposition and therefore its legitimization.
As the legitimation process is unified, creation and transformation are delegated.
This legitimation is world-making(Bourdieu 1989), as it explicitly prescribes the
possible realities and actions. The platform owner is therefore able to hold a monopoly
over legitimate symbolic violence (Bourdieu 1989), having a differential capacity to
influence and settle symbolic struggle. The possibility of obtaining and activating this
symbolic capital is associated with complex technological competences, which are
scarce and highly concentrated (Srnicek 2016; Zuboff 2019).
The coherent body of code adopted by the symbolic space through constitutional
power is not a neutral technical medium (Beer 2017; Gillespie 2010), and it can
trigger the erosion of autonomy. Code is created and transformed in accordance with the
objectives of the platform owner and correspondingly managed toward these goals.
13H D P O I
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
For example, Kitchens et al. (2020) show how the differences in platform design for
Facebook, Twitter, and Reddit create a differentiated impact on the diversity of news
and the type of content their users consume. Calo and Rosenblat (2017) and Walker
et al. (2021) find that the algorithmic design in Uber reduces drivers' insights about
their working conditions and the competition they face, hindering their autonomy.
Even without assuming strategic manipulation, the limited symbolic and repetitive
action of users implies a delegation of users' own independent reasoning and the
emergent coordination of their actions by the platform.
Juridical Power
Along with the definition of the architecture, a second feature that is critical to the
thriving of the platform is its governance. While constitutional power has to do with the
design of governance, juridical power is the capacity to sanction via the created rules
and the authority to arbitrate in disputes (Bourdieu 1987, 2005). Typically, it can
take a variety of forms, such as sanctioning rule infringement, reporting abuses, or
managing access to the platform (Adner and Kapoor 2010).
Digital technologies can enable increased participation and distribution of roles
among agents, which is why studies of governance in these contexts have favored the
idea that digitalization processes are highly democratizing (von Hippel 2006; Zit-
train 2009). However, the hierarchical structure of digital platforms facilitates the
creation of governance layers, meaning that the importance of those decisions can be
easily packaged, resulting in a limited distribution of power in the field. For example,
transaction-oriented platforms like Amazon, eBay, and Uber rely on user-based
rating systems to ensure good quality and sanction inadequate behavior; however,
the platform owner designs the rankings and retains control of other actions, such as
account activation and suspension (Gawer and Srnicek 2021).
This role division effectively creates and redistributes power and therefore
restricts the capacity of some agents to interact without the intervention of the digital
platform owner. Hence the definition and distribution of roles will interact with (and
eventually transform) the authority structure and the conflict management mecha-
nisms that preexist in the field, including regulation. For example, Valdez (2023)
explores how Uber uses what she calls "infrastructural" power to deploy a strategy of
"contentious compliance," both adapting to and challenging existing regulation.
This strategy allows the company to exploit differences in regulation and regulatory
scrutiny to reduce users' access to information and acquired rights.
Discursive Power
A third distinctive form of power that characterizes agents' strategic interplay is
discursive power. Discursive power is the power exercised in linguistic exchanges,
which are embodied and learned but also generative of the habitus (Bourdieu 1991).
The way agents talk about platforms and the words they use to explain them
constitute discourses that configure the collective narrative of what is possible on
and valuable in a platform.
Platforms are narrated as part of a broader, already-institutionalized rational-
technological narrative in which customer-centrism, effectiveness, and rationality
14 B E Q
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
of the exchanges are dominant values (Gillespie 2010; Garud et al. 2022).
Technological determinism discourses promoted by platform owners reinforce the idea
that platforms' algorithms are inscrutable and of a complexity unfathomable to the public
or the regulator (Martin 2022; Pasquale 2015). These discourses have led to a
broader narrative of a "Manifest Destiny" (Maddox and Malson 2020) of digital
platforms, where the user is explicitly asked to delegate their own reasoning to the
platform. This, alongside user dispersion, is a fundamental element that enables
prescribing actions. Critical to maintaining user dispersion is the narrative that users
are directly connected through the platform, which is presented as an agora of
exchanges. In actuality, platforms mediate that interaction, formatting it, regulating
it, or even suspending it.
Distinction Power
Distinction power is the creation of categories and the mechanisms of categorization
that drive choice in the platform. It builds on the concept of distinction proposed by
Bourdieu (1984). It defines the rules and practices that inhabit the habitus and
designates which of them are legitimated and considered by society to be natural.
The purpose of this type of power is to produce a behavioral response that serves
some agentsspecific accumulation of capital. The platform owner can influence
user behavior by modifying the interfaces, the encoding, and the algorithms, thereby
manipulating the users' decision-making. At the same time, users can access and
activate this power through their digital habitus, allowing them to influence and
drive other userschoices.
On platforms, distinction power is often exercised through what Kornberger,
Pflueger, and Mouritsen (2017) call evaluative infrastructures. Evaluative infra-
structures are the different interactive devices, such as rankings, ratings, or reviews,
that establish an order of worth among the users of the platform, driving the
"attention" (Goldhaber 1997) of other users. They relate agents and their contribu-
tions with each other, but they are also instruments of power. They define not only
how agents are perceived and ranked in the community but also how the hierarchy is
monetized by the platform's owners (Kornberger et al. 2017). Status markers are
examples of how distinction power is exercised. As they define how user activity and
loyalty to the platform are rewarded, they become a fundamental element in guiding
agents' accumulation strategies. For example, YouTube and Wikipedia changed
their strategy for recognizing content to stimulate newcomers (Kornberger et al.
2017). Ignatow and Robinson (2017) refer to this process as "übercapital."
Übercapital emphasizes the position and trajectory of users according to the scoring,
gradings, and rankings and is mobilized as an index of superiority that can have
strong reactive or performative effects on behavior (Ignatow and Robinson 2017).
A key feature of distinction power is that it is exercised heterogeneously over
different users through differences created by constitutional and juridical power.
Different types of users are granted different forms of agency, not only by the
platform designers but also by their own intervention on the platform (Levina and
Arriaga 2014). For instance, passive users may be granted agency through techno-
logical features. For example, YouTube gives agency to passive users by displaying
15H D P O I
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
the number of views. Merely by viewing a piece of content, individuals cast a vote on
its value, which has significant consequences for the content producers. Other users
become judges or ratersand producers of information at the same time. For
example, retweeting on Twitter is both a contribution to the platform and an act
of evaluation. As well as users who are raters, there are often users who also act as
"expert evaluators" (users who have accumulated significant cultural capital). One
such example is the "superdonor" on crowdfunding platforms like Kickstarter,
whose expert evaluations influence which projects are funded. Expert evaluators
tend to form a tight-knit group within a field (Vaast, Davidson, and Mattson 2013;
Aral and Walker 2012). Other users might have what Bourdieu called
"institutionalized consecration" (Levina and Arriaga 2014), which is the formal authority to
evaluate content given by the platform designers. These are typically site moderators
and community managers, who have more power than others to judge contributions
(Levina and Arriaga 2014). In sum, these different types of agencies are designed by
the platform owners to orient users' actions and to promote and demote content
(Ghosh and Hummel 2014). They are typically linked to how the platform owner
designs revenue models (Zuboff 2019).
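How an evaluative infrastructure converts these heterogeneous forms of agency (passive views, peer ratings, consecrated moderators) into a single hierarchy can be sketched in stylized form. The content items, weights, and scoring rule below are hypothetical assumptions for illustration, not the mechanism of any real platform.

```python
# Stylized sketch of an "evaluative infrastructure": the platform owner
# chooses how passive views, peer ratings, and moderator consecration are
# weighted, and that choice alone decides which content rises.
# All names, numbers, and weights are hypothetical.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Content:
    title: str
    views: int = 0                # passive users "vote" merely by watching
    ratings: List[int] = field(default_factory=list)  # rater users, 1-5 scale
    moderator_boost: float = 0.0  # "institutionalized consecration"

def visibility_score(c: Content, w_views=0.001, w_rating=1.0, w_mod=5.0):
    """Owner-defined weights convert heterogeneous agency into one hierarchy."""
    avg_rating = sum(c.ratings) / len(c.ratings) if c.ratings else 0.0
    return w_views * c.views + w_rating * avg_rating + w_mod * c.moderator_boost

feed = [
    Content("grassroots hit", views=50_000, ratings=[5, 5, 4]),
    Content("owner-promoted", views=1_000, ratings=[3], moderator_boost=12.0),
]
ranked = sorted(feed, key=visibility_score, reverse=True)
print([c.title for c in ranked])  # → ['owner-promoted', 'grassroots hit']
```

Because the owner sets the weights, a moderator-endorsed item can outrank content favored by thousands of passive viewers; setting `w_mod` to zero in this sketch reverses the ranking, which is precisely the sense in which the hierarchy is an instrument of the owner's power rather than a neutral measurement.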
The forms of power presented so far tend to reinforce the power position of the
platform owner, but there are other forms of power that create the opposite tension,
that is, counterpower accumulation. These are crowd and hacking power.
Crowd Power
In the accumulation process, users are in a unique position in that they are the agents
who produce the platform's activity. Crowd power results from the influence that
users can exert on the platform by the sheer mass of their actions, which may or may
not be coordinated (Bennett, Segerberg, and Walker 2014; Culpepper and Thelen
2020). These practices are, in essence, the exercise of the digital habitus. The
exercise of the habitus can have a long-lasting effect on the platform's structure.
Practices can both inspire new functionalities and generate unexpected transforma-
tions to the value proposition, which the platform owner can recapture through
redesigning the code. For example, this has been observed in the sharing and creator
economies, in which, because the provider side of the platform is the main value
creator (for example, graphic designers and programmers), the platform owner
periodically changes the design to facilitate delivery of that value (Bucher et al. 2018;
Bhargava 2022).
As Bourdieu (1990) argued, the agents ultimately decide what they do, and the
digital habitus may be accompanied by strategic calculation, even if most of the
practices are bound by parameters defined by the platform owners and managed
automatically by algorithms. This creates the opportunity for practices not aligned
with the value proposition to go viral, eventually posing challenges to the balance
envisioned in the platform design. For example, Krona (2015) uses the notion of
"sousveillance" (an inverted surveillance "from the bottom" or "from many to a
few") to describe the novel use of an audiovisual sharing platform by social
movements during the Arab Spring uprising. This emergent use emphasizes the
emancipatory potential of users to create collective capabilities and decision-making
16 B E Q
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
(Ziccardi 2012), which we designate as crowd platform power-challenging forms of
power.
Yet, platform owners can attempt to use crowd power in their favor, in what we
call the crowd platform power-enhancing forms of power, through constitutional
power (architecture design, limiting the possibility of contact between users), juridical
power (policing and sanctioning users), and distinction power (by shaping the eval-
uative infrastructure). For example, Thelen (2018) shows how Uber weaponized
the volume of its users in a regulatory dispute by introducing a button on its
interface that would send a templated complaint email to local government on the
user's behalf.
Hacking Power
Hacking power is the ability to identify the features and categories of digital spaces,
such as overlooked programming errors and ungoverned areas, that may be used for
a different purpose than the one originally intended (Jordan 2009; Hunsinger and
Schrock 2016). There are numerous examples in the literature of expressions of this
type of power in digital platforms. Eaton et al. (2015) have described the continuous
cycles of resistance and accommodation performed by groups of hackers and Apple
that surround the jailbreaking of each new release of iOS. Bucher et al. (2021)
and Calo and Rosenblat (2017) have shown how workers learn to anticipate patterns
in algorithms that control their work processes and use this knowledge to defend
themselves from abuses.
Hacking power is the antithesis of individual immaturity, as it requires not only
the exercise of independent reasoning but also a degree of understanding of the
specific system in which the power is exercised. It is deliberate and purposeful,
unlike crowd power, which is independent of users' understanding because it stems
from the combined volume of their actions. At the same time, hacking power
necessarily operates in the margins or interstices of the platform. Furthermore,
hacking power can be thought of as opposed to the constitutional and juridical
powers; as such, it will be dispersed, under the radar, and is often considered illegal
(Castells 2011). This makes it difficult to create and accumulate this power in the
field and consequently to use it to challenge other forms of power. Table 1
summarizes the different forms of power and provides further examples.
Platform Power Dynamics
By discussing platforms in the context of fields, we have shown how the relations
between the different key agents can be understood through dynamics of power
accumulation. On one hand, users activate their capitals through the production of
the practices that configure the digital habitus, which enhances their understanding
of the ways of participating on the platform. However, it is mainly the platform
owner who captures most of the value creation process through constitutional,
juridical, discursive, and distinction power. This uneven distribution facilitates the
creation of a leveled field upon which the relative positions can be consolidated
while, at the same time, enlarging the distance between agents and therefore their
capacity to decide in an autonomous way.
17H D P O I
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
Table 1: Forms of Platform Power and the Organization of Immaturity

Constitutional
Definition: Design and control of the platform's architecture (modules, interfaces,
layers, and algorithms) and the capacity to define the rules, norms, categories, and
languages that make up the digital interactions.
Organization of immaturity: Explicitly prescribes realities and actions, granting the
platform owner a monopoly over legitimate symbolic violence, leading to the
delegation of users' own independent reasoning and the emergent coordination of
their actions by the platform.
Examples of the use of platform power: Definition of user requirements to join
(e.g., require proof of identity to join Airbnb); definition of possible interactions
(e.g., the introduction of the "Like" button on Facebook); definition of allowed and
forbidden user actions (e.g., the impossibility of editing a tweet on Twitter).

Juridical
Definition: Capacity to sanction via the created rules and the authority to arbitrate
in disputes.
Organization of immaturity: Disciplines users' voice and participation and aligns
them to the platform's interests and values.
Examples of the use of platform power: Sanction rule infringement (e.g.,
suspension of a Lyft driver's account for using an alternative route); report abuses
(e.g., users flagging inappropriate content on Instagram); management of access to
the platform (e.g., restrict blacklisted users from using Tinder).

Discursive
Definition: Power exercised in linguistic exchanges, which are embodied and
learned but are also generative of the habitus.
Organization of immaturity: Shapes collective discourse; asks users to delegate
their own reasoning to the platform.
Examples of the use of platform power: Discourses of efficiency, technological
determinism, or complexity promoted by the platform owners (e.g., accuracy and
neutrality of Google Search results); narratives created within users' communities
(e.g., the superiority of PC/Windows gamers over Mac users).

Distinction
Definition: Creation of categories and the mechanisms of categorization that drive
a user's choice in the platform.
Organization of immaturity: Enacts the platform owner's capability to shape
behavior and the digital habitus.
Examples of the use of platform power: Definition of users' performance, status, or
visibility metrics (e.g., definition of a property's valuation metrics in Booking);
creation of differentiated tools to define hierarchies among users (e.g., ability to
view profiles while remaining anonymous for LinkedIn premium users).

Crowd
Definition: Users' influence on platforms by the sheer mass of their actions,
coordinated or not; can become manipulated by the platform owner.
Organization of immaturity: Implicitly contests immaturity when used against
platform power accumulation.
Examples of the use of platform power: User viralization of a message or practice
(e.g., coordination of a protest through a Telegram channel); forcing a change in the
platform's functionalities (e.g., introduction of feedback options for service
providers in UpWork); manipulation of users by covertly coordinating their actions
(e.g., mobilization of Uber's users to settle a regulatory dispute).

Hacking
Definition: Exploitation of a platform's features and categories for a different
purpose than the one originally intended.
Organization of immaturity: Explicitly contests individual immaturity, as it
requires the exercise of independent reasoning and the understanding of the
platform's rules.
Examples of the use of platform power: Evading the platform's restrictions on
functionality (e.g., jailbreaking of Apple's iOS); performance of forbidden practices
(e.g., getting away with selling counterfeit products on Amazon); abuse of the logic
of the algorithm in users' favor (e.g., disguising a user's IP address to access
additional content on Netflix).
We articulate these power dynamics through the phases of platform evolution to
explain how platforms transform the agents' relative positions over time and how this
impacts the organization of immaturity. Figure 1 depicts the evolution in three phases.
Phase 1: Platform Formation and Field Enclosure
In platform formation, the primary objective for the platform owner is to get users to
adopt the platform and regularly perform their practices on it. Constitutional,
juridical, and discursive power are three of the forms of power through which platform
owners attempt to enclose the field, for example, by designing a value creation model
to create exclusive control rights (Boyle 2003) over dimensions of practices that were
previously in the public domain. At the same time, these forms of power organize
immaturity. First, constitutional power (in the form of rules, norms, categories, and
languages) defines how
and under what conditions interactions are performed and how the different agents
can express their preferences. Through juridical power, platform owners have the
capacity to define the sanctions that will promote or restrict an agent's capacity to
operate on the platform, for example, who can exercise their voice on the platform
and who cannot, and what sanctions are going to be applied to misbehavior. Finally,
discursive power creates a common narrative about the value of the platform,
restricting the capacity of agents to think beyond discourses that are presented as
truths.
Phase 2: Platform Domination within Original Field
Platform adoption and sustained use create the conditions for the platform to increasingly
occupy the field. The increasing participation of agents on the platform can change
the predominant accumulation logics of the different agents in the field, shaping the
digital habitus.

Figure 1: Platform Power Dynamics over Time

The process of capital accumulation of different agents, leveraged by
distinction power, further defines how immaturity is organized by promoting the
processes of self-infliction of immaturity. Capital accumulation on platforms is
expressed as more data and the levying of fees, and the influx of users is repurposed
as capital to develop the platform. In turn, users invest different combinations of
capitals (data about their practices, social networks, money, and other assets) with
logics of both consumption (purchasing, sharing digital content) and profit and
accumulation (influencer, merchant, driver). To thrive in the accumulation dynam-
ics, agents must increasingly invest themselves in the platform, adapting their
strategies so they align with those that are relevant to the platform and embedded
in the digital habitus. Users adapt their practices to learn the specific logics. This
brings user practices closer to their archetypical category, and because they are better
rewarded, it further legitimizes the practices of the digital habitus. When users grasp
the critical elements of the digital habitus that correspond to their type, their practices
experience and enjoy a viral thrust that characterizes the platform logic (e.g., they
may become social media superusers or influencers). This success in increasing
capital, leveraged by mechanisms of distinction power such as platform ratings,
calls for higher investment, increasing the users' dependence on the platform and
thus contributing to the self-inflictive process of immaturity.
At the same time, the processes that reinforce platform power accumulation
coexist with other processes that create tensions that call for change and adjustment.
Misalignments between users' practices and their expected behavior can quickly
accumulate, destabilizing the platform's operation or posing challenges for its
governance. In addition, platforms with massive user bases and innumerable
interactions can become problematic for the platform owner to police, creating the
space for agents to exercise their hacking power. These counterpower accumulation
forces can therefore create an emergent "enlightenment" (as opposed to immaturity)
for the agents.
Phase 3: Platform Cross-Field Expansion
In a third phase, the platform's domination over the field leads to the possibility of
integrating new fields, further contributing to the accumulation of power and the
organizing of immaturity. Once a platform has become the dominant agent, a
position in which the structure itself acts on the owner's behalf, it can expand the
scope of the field to new geographies and users and even enter and integrate
previously separated fields. For example, Uber's launch of Uber Eats was deployed
using the platform's extant base of users (drivers and passengers, viewed now as
commensals).
From the domination position, the owner can operate in the various fields with
great freedom, changing the exchange rate of the capitals at play and accumulating
power. Highly dominant expressions of constitutional power include interoperabil-
ity lock-ins, the use of dark patterns and biased information that impede sovereignty
of choice, and digital workplace design and control. Juridical power can be
commanded from a position of gatekeeping, permitting arbitrary suspension of users'
accounts, biased arbitration in a dispute, the imposition of discriminatory clauses,
restriction of access to the platform, or limits on freedom of speech. Abuses of power
21H D P O I
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
are typically supported by the discursive power that enacts the discourse of "Manifest
Destiny" and uses opaque arguments to justify the increased accumulation of power
and the need to enforce the juridical power measures of the platform owner. Also in
this phase, a full deployment of distinction power relates to the platform owner's
ability to monopolize the capture and processing of data through control of the
technological architecture. This can be used to drive user choice in multiple ways,
such as through information asymmetries about the activities of a market or
participant and political influence on social media platforms.
The activation of powers in the cross-field expansion phase depicts the dynamics
within a field in a given moment, but it does not mean that the dominion of the
platform owner is absolute or that the platform becomes a "total institution"
(Bourdieu and Wacquant 2007). What we highlight is how the field's structural
homology with the platform eases a fast concentration of powers and creates
remarkable obstacles to modifying this situation, whether from within (due to users'
habituation) or outside of the platform (because of network and lock-in effects,
barriers to entry, and technical complexity). In addition, the form of dominion
that the platform's specific logic enables is very effective because, by creating
multisided businesses, it invisibilizes the specific accumulation and struggle
dynamics with respect to the core practices users perform. For example, Amazon is
"a place to buy and sell online," and the fact that the company accumulates capital
from the capture of user data and the use of its sellers' capitals is not evident to the
platform's users. Thus the platform's "rules of the game" may appear to be somewhat
objective and relatively neutral, but they are in fact part of the organization of
immaturity.
DISCUSSION
In this article, we present a sociosymbolic approach to power dynamics in digital
platforms and how they relate to organizing immaturity. A sociosymbolic approach
explains the structural and agentic dynamics of power accumulation leading to
organized immaturity. We contrast the power asymmetries between the platform
owner, as the central coordinating agent, and the rest of the agents directly partic-
ipating in the platform to present five main forms of power enacted by the platform
owner: constitutional, juridical, discursive, distinction, and crowd. We also present
two forms of power that explain how users counteract the platform owner's power
accumulation: crowd and hacking. We explain how these forms of power are
fundamental for understanding the different ways in which immaturity is organized.
We show that constitutional power limits the symbolic world of the users and
therefore their capacity to influence new rules and vocabularies that orchestrate
participation. We explain how through juridical power, the platform owners have the
capacity to define the sanctions that restrict the voice and participation of users. We
show how through the digital habitus, the logic of the field is constituted, explaining
the emergence of immaturity and its self-infliction. We also argue that
distinction power enacts the platform owner's capability to shape behavior by
creating evaluative infrastructures that mediate the emergence of immaturity.
Furthermore, we argue that the construction of a narrative of omniscience, through
22 B E Q
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
discursive power, explicitly asks users to delegate their own reasoning to the
platform. We also highlight the existence of forms of power (hacking and crowd)
that help users to accumulate power and resist the central authority of the platform
owners.
Finally, we describe power dynamics and their relation to organized immaturity
through three phases: first, platform formation, where forms of power (mainly
constitutional, juridical, and discursive) operate to promote the enclosure of the field
and set the basis for immaturity to occur; second, platform domination in the field,
where distinction power promotes field reproduction and processes of self-infliction
of immaturity, while hacking and crowd power create resistance to the central
authority; and third, platform cross-field expansion, in which power accumulation
dynamics lead to the integration of new fields and increasing dynamics of
immaturity. In defining the power accumulation dynamics, we explain the emergent
character of immaturity and its relation to agents' strategies.
By focusing on the digital platform and its power dynamics, we contribute in two
ways to the current literature. First, we build a framework that explains the
organizing dynamics of immaturity, based on the relations between the platform
structure, the digital objects, and the agents' strategies. Through this, we expand the
understanding of organized immaturity in the light of sociological perspectives. Our
framework analyzes how immaturity is constituted in practice and explains and
nuances the emergence and self-infliction dimensions of immaturity. Second, we
provide a dynamic framework of platform power accumulation, contributing to the
platform literature. Finally, we also provide policy recommendations on how to
tackle immaturity and highlight potential avenues for further
research.
Rethinking Organized Immaturity from a Sociosymbolic Perspective
A sociosymbolic perspective on digital platforms and their power dynamics can push
the boundaries of current concepts of organized immaturity toward a post-Kantian
and more sociologically grounded view (Scherer and Neesham 2020). This contrib-
utes to the understanding of organized immaturity in three ways. First, it explains
the different components of the emergence of immaturity through power
struggles. We show how struggles are the result of agents' different strategies,
heterogeneously shaped by their positions on the platform and their practices, but
also by their discourses and the history of experiences of each individual that shape
the digital habitus. By showing the dynamics in these struggles, we contribute to
explaining the process through which immaturity emerges as a nonorchestrated
phenomenon.
Second, we explain self-infliction by moving away from the more political
understandings of autonomy erosion. Political perspectives of immaturity look at
the individual and its (in)capacity for "public use of reason" (Scherer and Neesham
2020, 1) and consider the delegation of decision making to "impersonal authorities
they cannot comprehend or control" (Scherer and Neesham 2020, 4) as a condition
of the individual. We, however, adopt a sociological view that focuses on the
generation of practices and places the individual in a space of sociosymbolic power
23H D P O I
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
struggles. We complement previous literature exploring the symbolic aspects of
technology and its impacts on society and, more concretely, on autonomy erosion
(Zuboff 2019; Stark and Pais 2020; Fayard and Weeks 2014) by providing a set of
forms of power that articulate how self-infliction is embedded in the digital habitus
and thus how immaturity is organized. Our sociosymbolic perspective explains how
the conditions of agency are shaped by the specific structure of the platform and its
power dynamics.
Last, looking at fields through the power dynamics between the different agents
can shed explanatory light on the formation process of organized immaturity. The
relationship between habitus and field operates in two ways: while the field struc-
tures the habitus as the embodiment of the immanent necessity of a field, the habitus
makes the field a meaningful space in which the agents may invest their capitals and
themselves (Bourdieu and Wacquant 2007). By defining the stages through which
this relationship unfolds, we contribute to showing the emergent, dynamic, and
accumulative nature of organized immaturity.
Contribution to the Understanding of Platform Power Accumulation
We have approached organized immaturity by analyzing platforms as spaces of
coordination and production of practices, shaped by relations ingrained in a digital
habitus and the logic of the field. By better understanding the forms of power and the
role they play in field transformation, we have identified more clearly the different
forms of power accumulation through which digital platforms can become vehicles
for organized immaturity and its dynamics. This contributes to the platform
literature in the following ways. First, our description of the structural process of
power accumulation on the platform expands market and network approaches
(Jacobides 2021; Khan 2018; Eaton et al. 2015) by showing the importance of the
social, cultural, and symbolic dimensions of capital. This lays the foundations for
fundamentally reconceptualizing platform power and further explaining how power
is exercised by the platform owner (Cutolo and Kenney 2020; Kenney, Bearson, and
Zysman 2019).
Second, we enrich structural approaches to platforms by presenting how fields can
be transformed through dynamics of power accumulation that extend beyond the
consequences of an asymmetric structure (Curchod et al. 2019; Hurni, Huber, and
Dibbern 2022). Furthermore, our framework shows how platforms can be reshaped
by the interaction of agents' strategies and the reconfiguration of the fields. By
introducing a field view, we provide a more holistic scheme in which power is both
accumulated and contested. We also highlight how agents other than the platform
owner play a role in producing and exercising forms of power. This nuances our
understanding of field dynamics and agent interaction in the context of platform
power dynamics.
Third, our model complements sociomaterial studies on platform power (e.g.,
Beer 2017; Kornberger 2017; Stark and Pais 2020) with the notion of the digital
habitus and its relation to organized immaturity. Other authors have analyzed
technological affordances as social designations of a space and the social and
cultural factors that signify a space and set a generative principle of governance
24 B E Q
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
(Jung and Lyytinen 2014). Although these authors do not talk explicitly about
habitus or social capital, they reflect on the generative reproduction of norms by
individuals in contact with their social spaces; this is very similar to Bourdieu's
definition of habitus in social spaces. We complement the sociomaterial view of
platforms by showing how the digital habitus works and by emphasizing the role of
the platform as an organizing agent with a privileged capacity of capital accumula-
tion. We present the platform as a space of symbolically mediated power relation-
ships in which the digital objects and structural elements interplay to form the
logic of the field. We provide an understanding of the multifaceted nature of power
as a process resulting from agents' practices and strategies, the habitus, and capital
accumulation in a field. We argue that this conceptualization defines power in
platforms not only as an "instrument" (Zuboff 2019) at the service of the platform
owners but as a web of relations utilized by agents who can better exploit the
different forms of capital. We also contribute to the debate about the coordinating
role of platforms and how they create generative forms of distributed control while
power remains centralized, in an interplay between hierarchical and heterarchical
power relations (Kornberger et al. 2017).
Bourdieu's (1977, 1990) concepts of capitals, habitus, and distinction have been
used before in the study of the social consequences of digitalization and platforms'
increasing power (e.g., Levina and Arriaga 2014; Fayard and Weeks 2014; Romele
and Rodighiero 2020). We complement that research with a view of platforms'
accumulation of power and its role in the organizing of immaturity. We go beyond
the explanation of distinction power to define constitutional, juridical, discursive,
crowd, and hacking forms of power, thereby offering a more complete view of how
platforms accumulate power and organize immaturity.
Contributions to Practice and Avenues for Future Research
Our article provides practitioners with a conceptual framework that can enable
platform owners, users, and policy makers to fundamentally rethink how they
might address platforms' negative consequences for society. First, it highlights
immaturity as a relevant concept to address social issues in platforms. Our detailed
understanding of the mechanisms leading to immaturity and the manipulation of
individuals' decisions can help policy makers to identify and set limits on these types
of power, especially in the light of platform domination. By explaining the
organizing dynamics of immaturity, we direct attention to more holistic assessments
of the social consequences of platforms. Concretely, we emphasize how these are not
just concerned with the concentration in specific industries (such as retailing or
advertising) but also involve constraints on human rights (such as freedom of
speech). Furthermore, we show how the consequences of organizing our practices
through platforms are embedded in social structures and expressed in the transfor-
mation of fields. We believe that this line of thought is fundamental if we are to
collectively rethink the social role of platforms.
Our article also has limitations that open up avenues for further research. We have
identified not only forms of platform power accumulation but also forms of platform
counterpower accumulation. As our focus in this article has been on how platforms
organize immaturity, we have devoted more attention to the forms of power
25H D P O I
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
accumulation. However, future work is needed to deepen our understanding of how
platforms lose power. For example, in recent years, we have witnessed an increasing
backlash against big tech platforms, fueled by reputational scandals and vigorous
societal complaints (Joyce and Stuart 2021; Gawer and Srnicek 2021). We have also
observed a new wave of regulatory initiatives that intend to curb platforms' power by
forcing interoperability and limiting self-preferencing and acquisitions (Cusumano
et al. 2021; Jacobides, Bruncko, and Langen 2020), even when the effectiveness of
these policies is being debated (Rikap and Lundvall 2021). For example, in Europe,
the new Digital Markets Act (European Commission 2022a) and the Digital Services
Act (European Commission 2022b) are intended to create more contestability in
digital platform markets. In the United States, there has
been intense debate around the possible revocation of Section 230, which has so far
provided a shield for platforms' activities in social networks (SCDP 2019), leading
to abuses of power and increasing immaturity. In parallel to regulatory or external
counterpower mechanisms, research into power dynamics could also analyze the
flows of affects and affective intensification (Just 2019) that accompany the abuse
of the digital habitus. Incipient research (e.g., Just 2019; Castelló and Lopez-Berzosa
2023) has shown how these flows of affects not only shape collective meanings but
can also lead to increasing forms of hate speech and the renaissance of populist
politics. More should be researched about what forms of counterpower may emerge
in society to reduce populism and hate speech. We believe that our framework sets
the grounds for studying not only the more concrete practices of immaturity in
platforms but also new forms of resistance.
CONCLUSION
Building on the concepts of fields, capitals, and habitus, we propose a sociosymbolic
framework to explain organized immaturity in digital platforms. We articulate six
forms of power that characterize the different ways in which platforms organize
immaturity. We suggest that a more precise understanding of digital platforms'
role in driving organized immaturity can become the basis for fundamentally
rethinking the role of the digital platform in society. Can the processes that
lead to organized immaturity be reoriented toward organized enlightenment? We
argue that a first step in this direction is to better understand how power is performed
in digital platforms, which is what our framework contributes to explaining.
Acknowledgments
We express our gratitude to this special issue's editorial team, and particularly to Dennis
Schoeneborn, for their extraordinary work in helping us develop this article into its current
form. We also express our appreciation to the three anonymous reviewers who provided
valuable guidance during the editorial process. We are thankful to the Research Council of
Norway, project "Algorithmic Accountability: Designing Governance for Responsible Digital
Transformations" (grant 299178), and the British Academy, project "Fighting Fake News
in Italy, France and Ireland: COVID-19" (grant COVG7210059), for supporting this
research.
26 B E Q
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
REFERENCES
Adner, Ron, and Rahul Kapoor. 2010. "Value Creation in Innovation Ecosystems: How the Structure of Technological Interdependence Affects Firm Performance in New Technology Generations." Strategic Management Journal 31 (3): 306–33.
Alaimo, Cristina, and Jannis Kallinikos. 2017. "Computing the Everyday: Social Media as Data Platforms." Information Society 33 (4): 175–91.
Aral, Sinan, and Dylan Walker. 2012. "Identifying Influential and Susceptible Members of Social Networks." Science 337 (6092): 337–41.
Baldwin, Carliss Y., and C. Jason Woodard. 2009. "The Architecture of Platforms: A Unified View." In Platforms, Markets and Innovation, edited by Annabelle Gawer, 32. Cheltenham, UK: Edward Elgar.
Beer, David. 2017. "The Social Power of Algorithms." Information, Communication, and Society 20 (1): 1–13.
Bennett, W. Lance, Alexandra Segerberg, and Shawn Walker. 2014. "Organization in the Crowd: Peer Production in Large-Scale Networked Protests." Information, Communication, and Society 17 (2): 232–60.
Bhargava, Hemant K. 2022. "The Creator Economy: Managing Ecosystem Supply, Revenue-Sharing, and Platform Design." Management Science 68 (7): 5233–51.
Boudreau, Kevin. 2010. "Open Platform Strategies and Innovation: Granting Access vs. Devolving Control." Management Science 56 (10): 1849–72.
Bourdieu, Pierre. 1969. "Intellectual Field and Creative Project." Social Science Information 8 (2): 89–119.
Bourdieu, Pierre. 1977. Outline of a Theory of Practice. Cambridge: Cambridge University Press.
Bourdieu, Pierre. 1979. "Symbolic Power." Critique of Anthropology 4 (13–14): 77–85.
Bourdieu, Pierre. 1984. Distinction: A Social Critique of the Judgement of Taste. Cambridge, MA: Harvard University Press.
Bourdieu, Pierre. 1987. "The Force of Law: Toward a Sociology of the Juridical Field." Hastings Law Journal 38 (5): 814–53.
Bourdieu, Pierre. 1989. "Social Space and Symbolic Power." Sociological Theory 7 (1): 14–25.
Bourdieu, Pierre. 1990. The Logic of Practice. Redwood City, CA: Stanford University Press.
Bourdieu, Pierre. 1991. Language and Symbolic Power. Cambridge, MA: Harvard University Press.
Bourdieu, Pierre. 1993. "Génesis y estructura del campo burocrático" [Genesis and structure of the bureaucratic field]. Actes de la Recherche en Sciences Sociales 96–97: 49–62.
Bourdieu, Pierre. 2005. "Principles of an Economic Anthropology." In The Handbook of Economic Sociology, 2nd ed., 75–89. Princeton, NJ: Princeton University Press.
Bourdieu, Pierre. 2011. "The Forms of Capital." In The Sociology of Economic Life, 3rd ed., edited by Mark Granovetter and Richard Swedberg, 241–58. New York: Routledge.
Bourdieu, Pierre. 2014. On the State: Lectures at the Collège de France, 1989–1992. Edited by Patrick Champagne. Cambridge: Polity Press.
Bourdieu, Pierre, and Loïc Wacquant, eds. 2007. An Invitation to Reflexive Sociology. Malden, MA: Polity Press.
Boyle, James. 2003. "The Second Enclosure Movement and the Construction of the Public Domain." Law and Contemporary Problems 66 (1): 42.
27H D P O I
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
Bucher, Eliane, Christian Fieseler, Matthes Fleck, and Christoph Lutz. 2018. "Authenticity and the Sharing Economy." Academy of Management Discoveries 4 (3): 294–313.
Bucher, Eliane Léontine, Peter Kalum Schou, and Matthias Waldkirch. 2021. "Pacifying the Algorithm: Anticipatory Compliance in the Face of Algorithmic Management in the Gig Economy." Organization 28 (1): 44–67.
Busch, Christoph, Inge Graef, Jeanette Hofmann, and Annabelle Gawer. 2021. Uncovering Blindspots in the Policy Debate on Platform Power: Final Report. Luxembourg: Publications Office of the European Union. https://platformobservatory.eu/app/uploads/2021/03/05Platformpower.pdf.
Caillaud, Bernard, and Bruno Jullien. 2003. "Chicken and Egg: Competition among Intermediation Service Providers." RAND Journal of Economics 34 (2): 309–28.
Calo, Ryan, and Alex Rosenblat. 2017. "The Taking Economy: Uber, Information, and Power." SSRN Electronic Journal. https://doi.org/10/gfvmg3.
Castelló, Itziar, and David Lopez-Berzosa. 2023. "Affects in Online Stakeholder Engagement: A Dissensus Perspective." Business Ethics Quarterly 33 (1): 180–215.
Castells, Manuel. 2011. "Network Theory: A Network Theory of Power." International Journal of Communication 5: 773–87.
Clegg, Stewart R. 1989. Organization Theory and Class Analysis: New Approaches and New Issues. New York: De Gruyter.
Costa, Ricardo L. 2006. "The Logic of Practices in Pierre Bourdieu." Current Sociology 54 (6): 873–95.
Constantinides, Panos, Ola Henfridsson, and Geoffrey G. Parker. 2018. "Introduction: Platforms and Infrastructures in the Digital Age." Information Systems Research 29 (2): 381–400.
Crémer, Jacques, Yves-Alexandre de Montjoye, and Heike Schweitzer. 2019. Competition Policy for the Digital Era. Luxembourg: Publications Office of the European Union. https://ec.europa.eu/competition/publications/reports/kd0419345enn.pdf.
Culpepper, Pepper D., and Kathleen Thelen. 2020. "Are We All Amazon Primed? Consumers and the Politics of Platform Power." Comparative Political Studies 53 (2): 288–318.
Curchod, Corentin, Gerardo Patriotta, Laurie Cohen, and Nicolas Neysen. 2019. "Working for an Algorithm: Power Asymmetries and Agency in Online Work Settings." Administrative Science Quarterly 65 (3): 644–76.
Cusumano, Michael A., Annabelle Gawer, and David B. Yoffie. 2019. The Business of Platforms: Strategy in the Age of Digital Competition, Innovation, and Power. New York: Harper Business.
Cusumano, Michael, Annabelle Gawer, and David Yoffie. 2021. "Can Self-Regulation Save Digital Platforms?" Industrial and Corporate Change 30 (5): 1259–85.
Cutolo, Donato, and Martin Kenney. 2020. "Platform-Dependent Entrepreneurs: Power Asymmetries, Risks, and Strategies in the Platform Economy." Academy of Management Perspectives 35 (4): 584–685.
Dewey, John. 1939. Freedom and Culture. New York: Putnam.
Eaton, Ben, Silvia Elaluf-Calderwood, Carsten Sørensen, and Youngjin Yoo. 2015. "Distributed Tuning of Boundary Resources: The Case of Apple's iOS Service System." MIS Quarterly 39 (1): 217–43.
Etter, Michael, and Oana Brindusa Albu. 2021. "Activists in the Dark: Social Media Algorithms and Collective Action in Two Social Movement Organizations." Organization 28 (1): 68–91.
28 B E Q
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
European Commission. 2022a. "Deal on Digital Markets Act." March 24. https://www.europarl.europa.eu/news/es/press-room/20220315IPR25504/deal-on-digital-markets-act-ensuring-fair-competition-and-more-choice-for-users.
European Commission. 2022b. "The Digital Services Act Package." https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package.
Evans, David. 2003. "Some Empirical Aspects of Multi-sided Platform Industries." Review of Network Economics 2 (3): 191–209.
Faraj, Samer, Stella Pachidi, and Karla Sayegh. 2018. "Working and Organizing in the Age of the Learning Algorithm." Information and Organization 28 (1): 62–70.
Fayard, Anne-Laure, and John Weeks. 2014. "Affordances for Practice." Information and Organization 24 (4): 236–49.
Flyverbom, Mikkel, Ronald Deibert, and Dirk Matten. 2019. "The Governance of Digital Technology, Big Data, and the Internet: New Roles and Responsibilities for Business." Business and Society 58 (1): 3–19.
Fuchs, Christian. 2007. Internet and Society: Social Theory in the Information Age. New York: Routledge.
Furman, Jason. 2019. Unlocking Digital Competition: Report of the Digital Competition Expert Panel. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/785547/unlocking_digital_competition_furman_review_web.pdf.
Garud, Raghu, Arun Kumaraswamy, Anna Roberts, and Le Xu. 2022. "Liminal Movement by Digital Platform-Based Sharing Economy Ventures: The Case of Uber Technologies." Strategic Management Journal 43 (3): 447–75.
Gawer, Annabelle. 2014. "Bridging Differing Perspectives on Technological Platforms: Toward an Integrative Framework." Research Policy 43 (7): 1239–49.
Gawer, Annabelle. 2021. "Digital Platforms' Boundaries: The Interplay of Firm Scope, Platform Sides, and Digital Interfaces." Long Range Planning 54 (5): 102045.
Gawer, Annabelle, and Rebecca Henderson. 2007. "Platform Owner Entry and Innovation in Complementary Markets: Evidence from Intel." Journal of Economics and Management Strategy 16 (1): 1–34.
Gawer, Annabelle, and Nick Srnicek. 2021. Online Platforms: Economic and Societal Effects. Brussels: Panel for the Future of Science and Technology, European Parliament. https://www.europarl.europa.eu/stoa/en/document/EPRS_STU(2021)656336.
Ghosh, Arpita, and Patrick Hummel. 2014. "A Game-Theoretic Analysis of Rank-Order Mechanisms for User-Generated Content." Journal of Economic Theory 154 (November): 349–74.
Gillespie, Tarleton. 2010. "The Politics of 'Platforms.'" New Media and Society 12 (3): 347–64.
Goldhaber, Michael H. 1997. "The Attention Economy and the Net." First Monday 2 (4).
Gulati, Ranjay, Phanish Puranam, and Michael Tushman. 2012. "Meta-Organization Design: Rethinking Design in Interorganizational and Community Contexts." Strategic Management Journal 33 (6): 571–86.
Hilferding, Rudolph. 2005. Finance Capital: A Study of the Latest Phase of Capitalist Development. London: Taylor and Francis.
Hunsinger, Jeremy, and Andrew Schrock. 2016. "The Democratization of Hacking and Making." New Media and Society 18 (4): 535–38.
Hurni, Thomas, Thomas L. Huber, and Jens Dibbern. 2022. "Power Dynamics in Software Platform Ecosystems." Information Systems Journal 32 (2): 310–43.
29H D P O I
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
Ignatow, Gabe, and Laura Robinson. 2017. "Pierre Bourdieu: Theorizing the Digital." Information, Communication, and Society 20 (7): 950–66.
Introna, Lucas D. 2016. "Algorithms, Governance, and Governmentality: On Governing Academic Writing." Science, Technology, and Human Values 41 (1): 17–49.
Jacobides, Michael G. 2021. "What Drives and Defines Digital Platform Power?" White paper, Evolution Ltd., April 18. https://www.evolutionltd.net/post/what-drives-and-defines-digital-platform-power.
Jacobides, Michael G., Martin Bruncko, and Rene Langen. 2020. "Regulating Big Tech in Europe: Why, So What, and How Understanding Their Business Models and Ecosystems Can Make a Difference." White paper, Evolution Ltd., December 20. https://www.evolutionltd.net/post/regulating-big-tech-in-europe.
Jordan, Tim. 2009. "Hacking and Power: Social and Technological Determinism in the Digital Age." First Monday 14 (7).
Joyce, S., and M. Stuart. 2021. "Trade Union Responses to Platform Work: An Evolving Tension between Mainstream and Grassroots Approaches." In A Modern Guide to Labour and the Platform Economy, edited by Jan Drahokoupil and Kurt Vandaele, 177–92. Cheltenham, UK: Edward Elgar.
Julien, Chris. 2015. "Bourdieu, Social Capital and Online Interaction." Sociology 49 (2): 356–73.
Jung, Yusun, and Kalle Lyytinen. 2014. "Towards an Ecological Account of Media Choice: A Case Study on Pluralistic Reasoning While Choosing Email." Information Systems Journal 24 (3): 271–93.
Just, Natascha, and Michael Latzer. 2017. "Governance by Algorithms: Reality Construction by Algorithmic Selection on the Internet." Media, Culture, and Society 39 (2): 238–58.
Just, Sine N. 2019. "An Assemblage of Avatars: Digital Organization as Affective Intensification in the GamerGate Controversy." Organization 26 (5): 716–38.
Katz, Michael L., and Carl Shapiro. 1985. "Network Externalities, Competition, and Compatibility." American Economic Review 75 (3): 424–40.
Kelkar, Shreeharsh. 2018. "Engineering a Platform: The Construction of Interfaces, Users, Organizational Roles, and the Division of Labor." New Media and Society 20 (7): 2629–46.
Kenney, Martin, Dafna Bearson, and John Zysman. 2019. "The Platform Economy Matures: Pervasive Power, Private Regulation, and Dependent Entrepreneurs." SSRN Electronic Journal. DOI: 10.2139/ssrn.3497974.
Khan, Lina M. 2018. "Sources of Tech Platform Power." Georgetown Law Technology Review 2 (2): 325–34.
Kitchens, Brent, Steve L. Johnson, and Peter Gray. 2020. "Understanding Echo Chambers and Filter Bubbles: The Impact of Social Media on Diversification and Partisan Shifts in News Consumption." MIS Quarterly 44 (4): 1619–49.
Kolagar, Milad, Vinit Parida, and David Sjödin. 2022. "Ecosystem Transformation for Digital Servitization: A Systematic Review, Integrative Framework, and Future Research Agenda." Journal of Business Research 146 (July): 176–200.
Kornberger, Martin. 2017. "The Visible Hand and the Crowd: Analyzing Organization Design in Distributed Innovation Systems." Strategic Organization 15 (2): 174–93.
Kornberger, Martin, Dane Pflueger, and Jan Mouritsen. 2017. "Evaluative Infrastructures: Accounting for Platform Organization." Accounting, Organizations, and Society 60 (July): 79–95.
30 B E Q
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
Kretschmer, Tobias, Aija Leiponen, Melissa Schilling, and Gurneeta Vasudeva. 2022. "Platform Ecosystems as Metaorganizations: Implications for Platform Strategies." Strategic Management Journal 43 (3): 405–24.
Krona, Michael. 2015. "Contravigilancia y videoactivismo desde la plaza Tahrir: Sobre las paradojas de la sociedad contravigilante" [Counter-surveillance and video activism from Tahrir Square: On the paradoxes of the counter-surveillance society]. In Videoactivismo y movimientos sociales, edited by David Montero and Francisco Sierra, 17. Barcelona: Gedisa.
Lanier, Jaron. 2018. Ten Arguments for Deleting Your Social Media Accounts Right Now. New York: Random House.
Lessig, Lawrence. 2009. El Código 2.0 [Spanish translation of Code: Version 2.0]. Madrid: Traficantes de Sueños.
Levina, Natalia, and Manuel Arriaga. 2014. "Distinction and Status Production on User-Generated Content Platforms: Using Bourdieu's Theory of Cultural Production to Understand Social Dynamics in Online Fields." Information Systems Research 25 (3): 443–66.
Lianos, Ioannis, and Bruno Carballa-Smichowski. 2022. "A Coat of Many Colours: New Concepts and Metrics of Economic Power in Competition Law and Economics." Journal of Competition Law and Economics 18 (4): 795–831.
Lupton, Deborah. 2016. The Quantified Self. Hoboken, NJ: John Wiley.
Lynskey, Orla. 2017. "Regulating 'Platform Power.'" LSE Legal Studies Working Paper 1/2017, London School of Economics.
Lynskey, Orla. 2019. "Grappling with Data Power: Normative Nudges from Data Protection and Privacy." Theoretical Inquiries in Law 20 (1): 189–220.
Lyon, David. 2018. The Culture of Surveillance: Watching as a Way of Life. 1st ed. Medford, MA: Polity Press.
Maddox, Jessica, and Jennifer Malson. 2020. "Guidelines without Lines, Communities without Borders: The Marketplace of Ideas and Digital Manifest Destiny in Social Media Platform Policies." Social Media + Society 6 (2).
Margetts, Helen, Vili Lehdonvirta, Sandra González-Bailón, Jonathon Hutchinson, Jonathan Bright, Vicki Nash, and David Sutcliffe. 2021. "The Internet and Public Policy: Future Directions." Policy and Internet. DOI: 10.1002/poi3.263.
Martin, Kirsten E. 2022. "Algorithmic Bias and Corporate Responsibility: How Companies Hide behind the False Veil of the Technological Imperative." In The Ethics of Data and Analytics: Concepts and Cases, 36–50. New York: Auerbach.
McCoy, Jennifer, Tahmina Rahman, and Murat Somer. 2018. "Polarization and the Global Crisis of Democracy: Common Patterns, Dynamics, and Pernicious Consequences for Democratic Polities." American Behavioral Scientist 62 (1): 16–42.
McIntyre, David, Arati Srinivasan, Allan Afuah, Annabelle Gawer, and Tobias Kretschmer. 2021. "Multi-sided Platforms as New Organizational Forms." Academy of Management Perspectives 35 (4): 566–83.
Noble, Safiya Umoja. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. Illustrated ed. New York: NYU Press.
O'Connor, Cailin, and James Owen Weatherall. 2019. The Misinformation Age: How False Beliefs Spread. New Haven, CT: Yale University Press.
Orlikowski, Wanda J., and Susan V. Scott. 2015. "The Algorithm and the Crowd: Considering the Materiality of Service Innovation." MIS Quarterly 39 (1): 201–16.
Pasquale, Frank. 2015. The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge, MA: Harvard University Press.
Philbeck, Thomas, and Nicholas Davis. 2018. "The Fourth Industrial Revolution: Shaping a New Era." Journal of International Affairs 72 (1): 17–22.
31H D P O I
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
Portes, Alejandro. 1998. "Social Capital: Its Origins and Applications in Modern Sociology." Annual Review of Sociology 24 (1): 1–24.
Richards, Neil M. 2012. "The Dangers of Surveillance" (Symposium: Privacy and Technology). Harvard Law Review 126 (7): 1934–65.
Rikap, Cecilia, and Bengt-Åke Lundvall. 2021. The Digital Innovation Race: Conceptualizing the Emerging New World Order. Cham, Switzerland: Springer.
Rochet, Jean-Charles, and Jean Tirole. 2003. "Platform Competition in Two-Sided Markets." Journal of the European Economic Association 1 (4): 990–1029.
Rodon Modol, Joan, and Ben Eaton. 2021. "Digital Infrastructure Evolution as Generative Entrenchment: The Formation of a Core–Periphery Structure." Journal of Information Technology 36 (4): 342–64.
Romele, Alberto, and Dario Rodighiero. 2020. "Digital Habitus or Personalization without Personality." HUMANA.MENTE Journal of Philosophical Studies 13 (37): 98–126.
Sandberg, Jörgen, and Mats Alvesson. 2011. "Ways of Constructing Research Questions: Gap-Spotting or Problematization?" Organization 18 (1): 23–44.
Scherer, Andreas Georg, and Cristina Neesham. 2020. "New Challenges to Enlightenment: Why Socio-technological Conditions Lead to Organized Immaturity and What to Do about It." SSRN Electronic Journal. DOI: https://doi.org/10/gj8mhq.
Scherer, Andreas Georg, Cristina Neesham, Dennis Schoeneborn, and Markus Scholz. 2020. "Call for Submissions: Business Ethics Quarterly Special Issue on 'Socio-technological Conditions of Organized Immaturity in the Twenty-First Century.'" Business Ethics Quarterly 30 (3): 440–44.
Schwab, Klaus. 2017. The Fourth Industrial Revolution. New York: Crown Business.
Seymour, Richard. 2019. "The Machine Always Wins: What Drives Our Addiction to Social Media." Guardian, August 23, sec. Technology.
Srnicek, Nick. 2016. Platform Capitalism. Malden, MA: Polity Press.
Stark, David, and Ivana Pais. 2020. "Algorithmic Management in the Platform Economy." Sociologica 14 (3): 47–72.
Stigler Committee on Digital Platforms. 2019. "Final Report." Stigler Center for the Study of the Economy and the State. https://www.chicagobooth.edu/-/media/research/stigler/pdfs/digital-platformscommittee-reportstigler-center.pdf.
Striphas, Ted. 2010. "How to Have Culture in an Algorithmic Age." https://www.thelateageofprint.org/2010/06/14/how-to-have-culture-in-an-algorithmic-age/.
Teece, David J. 2017. "Dynamic Capabilities and (Digital) Platform Lifecycles." Advances in Strategic Management 37: 211–25.
Thelen, Kathleen. 2018. "Regulating Uber: The Politics of the Platform Economy in Europe and the United States." Perspectives on Politics 16 (4): 938–53.
Tiwana, Amrit, Benn Konsynski, and Ashley A. Bush. 2010. "Platform Evolution: Coevolution of Platform Architecture, Governance, and Environmental Dynamics." Information Systems Research 21 (4): 675–87.
Trittin-Ulbrich, Hannah, Andreas Georg Scherer, Iain Munro, and Glen Whelan. 2021. "Exploring the Dark and Unexpected Sides of Digitalization: Toward a Critical Agenda." Organization 28 (1): 8–25.
Vaast, Emmanuelle, Elizabeth J. Davidson, and Thomas Mattson. 2013. "Talking about Technology: The Emergence of a New Actor Category through New Media." MIS Quarterly 37 (4): 1069–92.
Valdez, Jimena. 2023. "The Politics of Uber: Infrastructural Power in the United States and Europe." Regulation and Governance 17 (1): 177–94.
32 B E Q
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
von Hippel, Eric. 2006. Democratizing Innovation. Cambridge, MA: MIT Press.
Walker, Michael, Peter Fleming, and Marco Berti. 2021. "'You Can't Pick Up a Phone and Talk to Someone': How Algorithms Function as Biopower in the Gig Economy." Organization 28 (1): 26–43.
Wu, Liang, Fred Morstatter, Kathleen M. Carley, and Huan Liu. 2019. "Misinformation in Social Media: Definition, Manipulation, and Detection." ACM SIGKDD Explorations Newsletter 21 (2): 80–90.
Ziccardi, G. 2012. Resistance, Liberation Technology and Human Rights in the Digital Age. Dordrecht, Netherlands: Springer.
Zittrain, Jonathan. 2009. "Law and Technology: The End of the Generative Internet." Communications of the ACM 52 (1): 18–20.
Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. London: Profile Books.
MíHá(m.harraca@surrey.ac.uk, corresponding author) is a postgraduate rese-
archer and PhD candidate at Surrey Business School, University of Surrey. He holds a MA
(analyse politique et économique; Hons) from Paris 13Sorbonne Paris Cité and a Licentiate
degree (Hons) in economics from Universidad de Buenos Aires. He is interested in societys
transformation through digitalization, with a focus on strategy and competition in digital
platforms.
I Có is a reader at Bayes Business School (formerly Cass) at City University of
London. She holds an executive MBA and a PhD from ESADE, Ramon Llull University and
a MSc from the College of Europe in Belgium. She is interested in social change in digital
contexts. Her research uses corporate social responsibility, deliberation, and social move-
ment theories to understand social and environmental challenges like climate change, plastic
pollution, and social polarization.
A G is chaired professor in digital economy and director of the Centre of
Digital Economy at the University of Surrey and a visiting professor of strategy and
innovation at Oxford University Saïd Business School. A pioneering scholar of digital
platforms and innovation ecosystems, she is a highly cited author or coauthor of more than
forty articles and four books, including The Business of Platforms: Strategy in the Age of
Digital Competition, Innovation, and Power (2019). Gawer is a digital expert for the UK
Competition and Markets Authorities, and she has advised the European Parliament and the
European Commission on digital platforms regulation as an expert in the EU Observatory of
the Online Platform Economy.
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence
(https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and
reproduction in any medium, provided the original work is properly cited.
33H D P O I
https://doi.org/10.1017/beq.2022.40 Published online by Cambridge University Press
... Platform-dependence, however, can also be seen as a result of "organizational immaturity": following Harracá et al. (2023) and drawing on the Kantian principle of intellectual maturity, enterprises in a DP ecosystem are individual actors who willingly give away autonomy to an organizing actor, a DP owner. Thus, platforms are more than just technical intermediaries; they "organize social interactions and transform the power relations of the different agents who participate in the digital exchanges" (p. ...
... On top of that, the authors find that these power relations are dynamically changing over time. From platform formation to original-field domination and to cross-field expansion, DPs accumulate constitutional, juridical and discursive power, as well as distinction power and platform-enhancing crowd power (Harracá et al. 2023). Exemplary cases for such power asymmetry dynamics are documented by Asadullah et al. (2023): their study indicates that SMEs are aware of their high dependence on DPs, partly due to limited substitutability of platforms. ...
Article
Digital platform (DP) enterprises have risen to the top of the global economy by inverting traditional business models. They earn money through matchmaking, transaction facilitation, and efficient orchestration of other stakeholders' resources. Small‐ and medium‐sized enterprises (SMEs) play a decisive role in this success story: they offer products and services on many leading DPs and thereby feed platform networks as sellers and suppliers. On the other hand, while DPs enable SMEs to reach new customers and outsource costly managerial tasks, many SMEs struggle with the negative consequences of asymmetric power dynamics and dependency risks in platform business. This paper analyzes 14 cases of SMEs that face power asymmetry as DP value providers, unraveling challenges encountered and coping practices employed in this situation. The findings reveal that power asymmetry between dominant DPs and SMEs creates significant challenges, including exploitative relationships, loss of autonomy, surveillance pressures, and devaluation of SME contributions, while SMEs cope through disintermediation, multi‐homing, individual resistance tactics, and striving for more equitable DP models. Moreover, three relationality types are identified: DPs being SMEs' “partners,” “tools,” and “necessary evil.” This article offers a new perspective on boundaries of SME strategies in the DP economy and contributes with empirical insights to the advancing field of entrepreneurial platform research.
... Platforms succeed by detaching themselves from on-site investments and responsibilities (Kirchner 2022; Kirchner and Beyer 2016): they position themselves as neutral intermediaries, facilitating transactions between independent parties and making profits by charging for the use of their infrastructure (Kirchner 2023; Langley and Leyshon 2017; Srnicek 2017, 2021). At the same time, platforms have a strong interest in governing their users' activities to ensure profitability while maintaining enough flexibility to avoid liability (Ametowobla and Kirchner 2023; Harracá et al. 2023; Kirchner 2022; Pohl 2023; Schüßler et al. 2021). ...
... Airbnb, Inc. is a publicly traded company backed by venture capitalist firms, subject to economic pressures. Analyses of the platform's interfaces reveal techniques that promote professionalized hosting and revenue maximization (Bosma and van Doorn 2022; Harracá et al. 2023; Pohl 2023). Despite public campaigns emphasizing intercultural exchange and sharing, these findings highlight the platform's commercializing influences. ...
Article
Full-text available
In the early days of its existence, Airbnb was heralded as a champion of the “sharing economy”, enabling amateur hosts to offer short-term lodging to a global community of guests. Since then, public debates and research have highlighted the rise of professional hosts as an indication for the commercialization of Airbnb. However, it is neither well understood how Airbnb listings developed across cities and time nor how they are distributed across local hotspots. The paper conceptualizes the Airbnb marketplace in its spatial constitution, as a network space brokering accommodation at specific places within designated city territories. It asks who dominates the digital marketplace across cities and time, distinguishing between professional and amateur listings. Based on an extensive dataset of 45 cities all over the world over a period of eight years, the paper investigates the distribution of amateur and professional listings. The article then focuses on the spatial diffusion of Airbnb listings in four selected cities (Amsterdam, Berlin, London, San Francisco) which represent diverse regulatory approaches, to identify local hotspots and establish who dominates these hotspots. The results show a worldwide trend of professional listings rising to dominate Airbnb, alongside few pockets of amateur marketplaces. Within cities, amateurs prevail only in local hotspots in specific cities—which, moreover, tend to be found only in peripheral locations. The results provide rich empirical insights into the commercialization and diversity of Airbnb as a network space around the world.
... Furthermore, the societal disembeddedness of platforms is raised in analyses of their governance. While they decentralize the ability to participate in the economy, digital platforms like crowdworking ones paradoxically centralize power as more businesses become dependent on them (Harracá et al., 2023; Kenney & Zysman, 2020), typically excluding users and workers from decision-making. Hence, recent studies have described digital platforms as 'undemocratic' and 'distant' toward their stakeholders, often lacking clear ethical values that would motivate their governance (Hardaker, 2021: pp. 5–8; Scharlach et al., 2023). ...
... Similarly, certain platforms assert that they propose an open governance model (P3), adopting a cooperative status. This approach might help to address criticisms of platforms' centralized power (Harracá et al., 2023; Kenney & Zysman, 2020) and poor social and employment relations (Gawer & Srnicek, 2021; Gillespie, 2010; Katta et al., 2020). ...
Article
Full-text available
Digital platforms are increasingly criticized for being disembedded, raising ethical concerns about their minimal links with the economic, political, and cultural environments in which they operate. Many ‘local digital platforms’ argue that their connection with and responsibility to their territory sets them apart from traditional digital platforms. However, more research is needed to better understand how local platforms claim different forms of territorial embeddedness to address the ethical challenges of the platform economy. In this article, we analyze these claims and abductively develop a typology of digital platforms’ links to their local environments based on eleven sub-dimensions, drawing on the concept of territorial embeddedness. According to our framework, territorial embeddedness is multifaceted, and platforms can be characterized by a continuum from weak to strong embeddedness. This renewed conceptualization offers a deeper understanding of local platforms’ territorial embeddedness. In addition, our framework allows for a critical examination of how local platforms respond to ethical challenges of the platform economy. Our research thus brings a fresh perspective to the polarized debate between platform capitalism and cooperativism.
... The outsourcing of informational curation to algorithmic tools may act to the detriment of our sense of responsibility for our epistemic environment, our trust in our epistemic capacities, and our habit of exercising them on a regular basis. Harracá et al. (2023) describe this effect of platforms' operations as "organizing immaturity."
Article
Full-text available
The digital public forum has challenged many of our normative intuitions and assumptions. Many scholars have argued against the idea of free speech as a suitable guide for digital platforms’ content policies. This paper has two goals. Firstly, it suggests that there is a version of the free speech principle which is suitable for platforms that have adopted a commitment to free speech to guide their content curation strategies. I call it the Principle of Epistemic Resilience. Secondly, it aims to analyze some of the practical implications of the principle. It argues that upholding this principle in the digital public forum requires a comprehensive strategy, including (1) the automated removal and demotion of contents that threaten to cause serious harm; (2) changes to engagement optimization algorithms; and (3) changes to affordances inside the platform. These changes are necessary to create a fertile environment for deliberation, which is crucial to epistemic resilience. If such a comprehensive strategy is absent, platforms may actively undermine the societal value of speech.
... In other studies, the theory of affordances is less focal. Holford (2022) draws on affordance theory to denote the relationship between airline pilots and automated cockpit technology, while Harracá et al. (2023) situate their sociosymbolic perspective on immaturity in digital platforms as an extension of affordance theory. Despite these, our review of the literature found that the use of the theory of affordances in business ethics has been relatively scant. ...
Article
Full-text available
Biometric technologies are at the forefront of organizational innovation, surveillance, and control. In many instances, the use of physiological and behavioral biometrics enhances individual and organizational performance. However, they also have the potential to hinder human wellbeing. In particular, recent generations of biometrics are capable of extracting deeper insights into human behavior, enabling organizational surveillance practices, but may also constrain individual rights and freedoms. While biometric technologies have been evidenced to infringe upon privacy and lead to discriminatory practices, little research has examined the impact of biometrics on dignity, an important ethical construct related to human wellbeing. In this conceptual paper, we draw from the theory of affordances to identify and delineate six affordances of biometric technologies, categorized into inhibiting and augmenting biometric affordances. We propose a framework in which inhibiting and augmenting biometric affordances may simultaneously support and humiliate dignity. This separation offers a theoretical base for future empirical research to explore the increasingly pervasive relationship between biometric adoption and human dignity. Moreover, we explain six paradoxical tensions across three forms of dignity—inherent, behavioral, and meritocratic—in the proposed framework. Finally, we discuss why firms should be responsible for addressing the tensions across dignity forms when they adopt biometric technologies to balance the trade-off between wealth creation and human wellbeing. This offers guidance for practitioners on how to integrate biometric technologies without hindering human dignity.
... Society, through social interaction and the dynamics of daily life, has the ability to control and shape the narrative of information that is scattered. In the digital era of increasing connectivity, social biopower is crucial in shaping collective perceptions of access to information and public policies (Harracá et al., 2023; Karakayali & Alpertan, 2021). ...
Article
Full-text available
This study aims to provide an in-depth understanding of the role of local politics and socio-cultural forces in public information disclosure and offer both theoretical and practical contributions to corruption prevention efforts in Indonesia. This study employs Michel Foucault's concept of biopower as a framework to analyze and assess the political and sociocultural forces at play in Sendang village's relationship with public information transparency. The results show that the village government can serve as an important model in the implementation of transparency and accountability, as well as increasing community participation in the decision-making process. In addition, the results of this analysis show that, when properly implemented, public disclosure can be an important mechanism to facilitate more responsive and participatory governance. The contribution of this study lies in providing new insights into how local governments can be integrated into village government structures and how communities can be empowered through access to information.
Conference Paper
This article addresses two important gaps in the digital platform literature. First, it addresses the lack of a shared conceptualization of digital platforms. Following a literature review, we identified three perspectives explicitly discussed in the literature to define digital platforms. In addition, we discovered an implicit perspective that, although not directly addressed, emerged in the literature. With the help of these perspectives, their scopes, and definition examples, researchers and practitioners can enhance their communication and improve the comparability of future studies. The second gap this paper addresses is the lack of criteria that distinguish e-commerce entities. This paper proposes an e-commerce taxonomy based on both the literature and an empirical sample. This taxonomy offers an overview of the distinctive characteristics expressed by these entities such as "network effects", "type of network effects", "information availability for the seller" and "information availability for buyers". Our taxonomy contributes to the theoretical and empirical understanding of e-commerce entities by providing a structured framework that can be used to study and compare different e-commerce entities systematically.
Conference Paper
The concept of digital platforms has attracted a great deal of attention in the fields of economics and information systems, shaping scholarly discourse for the past few years. However, despite the frequent usage of the term "digital platform" in the literature, its definition covers a wide range of meanings. Scholars often use the term to mean very distinct concepts which creates a sort of semantic confusion within the scientific community. To clarify the different elements and perspectives shaping the definition of a digital platform, a systematic literature review was carried out. A total of 64 definitions from 58 scholarly resources were collected and analyzed through qualitative content analysis. This analytical process aimed to isolate and identify the fundamental elements used in defining digital platforms, along with discerning the multiple perspectives from which this term is approached. Furthermore, this study traces the evolution of the definition of the concept in parallel with the dynamic development of digital platforms in recent years.
Article
Full-text available
This study examines how self-taught online graphic designers in Kaliabu Village, Salaman District, Magelang Regency, Central Java, navigate the challenges and dependencies arising from digital platforms, particularly 99designs. Using a case study and in-depth interviews, the research reveals that, while digital technology appears to open new opportunities and empower individuals, it often reinforces control and exploitation within the digital economy. Most Kaliabu designers, lacking formal technical or creative education, leverage 99designs to access the global market, relying on self-developed skills to compete. Despite utilizing their informal capital, they are constrained by the platform's stringent rules. Although 99designs offers broad economic access, it imposes terms that limit designers' autonomy and innovation. Dependence on the platform forces designers to constantly adapt to its policies and face intense competition. The crowdsourcing system of 99designs, where projects are awarded through open competitions, exacerbates these dynamics, pushing designers to compete for every job. In the digital capitalist context, their informal capital often fails to overcome the platform's control and exploitation. This study highlights that while Kaliabu's self-taught designers can compete globally, they must navigate the balance between the freedom and control offered by digital technology. Digital technology enables participation in the digital industry but also presents challenges of control and exploitation that need addressing.
Article
Drawing upon case study research investigating the Irish Health Service Executive’s (HSE) response to a fake news attack on their human papillomavirus (HPV) vaccination campaign, we argue that responses to fake news should be analyzed from a legitimacy perspective. A model for emotional legitimacy management is proposed in which the HSE and a third-party collaborate to (a) connect with the emotional aspects of the issue; (b) leverage emotions to build vicarious legitimacy; (c) transfer the third-party’s legitimacy to the HSE; and (d) emotionally activate the public. This study contributes to fake news and legitimacy management by moving beyond fact-checking and debunking strategies. We suggest a framework centered on legitimacy in which emotions are used to counteract fake news. Finally, we emphasize the importance of third-party vicarious legitimacy building and the transfer of this legitimacy to the organization.
Article
Full-text available
The digital economy has brought new business models that rely on zero-price markets and multi-sided platforms nested in business ecosystems. The traditional concept of market power used by competition authorities cannot engage with this new reality in which (economic) power manifests beyond price and output within a relevant market. These developments have culminated in multiple recent calls for a more multidimensional concept of power. Consequently, suggestions over new concepts of power triggering antitrust/regulatory intervention, such as ‘strategic market status’, ‘conglomerate market power’, ‘intermediation power’, ‘structuring digital platforms’, or ‘gatekeepers’ have proliferated to complete, or even substitute, the archetypical concept of market or monopoly power in competition law. However, a theoretical framework for this multidimensional concept of power that can set the basis for new metrics is missing. This article makes three contributions in that direction. First, we conceptualize different forms of (economic) power that go beyond competition within a single relevant market in terms of competition law and economics. Second, we propose new metrics to measure two forms of power: panopticon power and power based on differential dependency between value co-creators. Third, we test the latter and show how they could reduce false positives and false negatives when assessing dominance.
Article
Full-text available
Manufacturing firms are increasingly seeking to capture the potential of digitalization by transforming towards digital servitization. Yet, most manufacturers struggle to realize the value through digital servitization because it requires a sustained focus on forming ecosystem partnerships. Digital servitization research has long recognized the importance of ecosystem transformation but much of the existing discussion on this interlink is fragmented and understudied. Therefore, this study's purpose is to investigate how manufacturing firms engaged in digital servitization transform their ecosystems. To this end, we have examined the triggers, firm-level enablers, ecosystem phases and activities, and effects of ecosystem transformation in digital servitization. We provide a comprehensive review of the phases of ecosystem transformation including ecosystem formation, orchestration, and expansion as well as their associated activities. These findings have been consolidated into an integrative framework for ecosystem transformation and, based on this analysis, suggestions for future research are provided for digital servitization scholars.
Article
A predominant assumption in studies of deliberative democracy is that stakeholder engagements will lead to rational consensus and to a common discourse on corporate social and environmental responsibilities. Challenging this assumption, we show that conflict is ineradicable and important and that affects constitute the dynamics of change of the discourses of responsibilities. On the basis of an analysis of social media engagements in the context of the grand challenge of plastic pollution, we argue that civil society actors use mobilization strategies with their peers and inclusive-dissensus strategies with corporations to convert them to a new discourse. These strategies use moral affects to blame and shame corporations and solidarity affects to create feelings of identification with the group and to avoid disengagement and polarization. Our research contributes to the literature on deliberative democracy and stakeholder engagement in social media in the collective constructions of discourses on grand challenges.
Article
Platform firms have been depicted as having structural and instrumental power and being able to prevail in regulatory battles. This article, in contrast, documents how they have often adapted to regulations and provided different services across locales. I show that platform firms have a specific type of power, infrastructural power, that stems from their position as mediators across a variety of actors. This power, I argue, is shaped by pre-existing regulations and the firms' strategic response, which I call "contentious compliance": a double movement of adapting to existing regulations while continuing to challenge them. I apply this framework to the expansion and regulation of Uber in New York City (US), Madrid (Spain), and Berlin (Germany).
Article
Many digital platforms give users a bundle of goods sourced from numerous creators, generate revenue through consumption of these goods, and motivate creators by sharing of revenue. This paper studies the platform’s design choices and creators’ participation and supply decisions when users’ (viewers’) consumption of goods (content) is financed by third-party advertisers. The model specifies the platform’s scale: number of creators and content supplied and magnitudes of viewers, advertisers, and revenues. I examine how the distribution of creator capabilities affects market concentration among creators and how it can be influenced by platform design. Tools for ad management and analytics are more impactful when the platform has sufficient content and viewers but has low ad demand. Conversely, reducing viewers’ distaste for ads through better matching and timing—which can create win–win–win effects throughout the ecosystem—is important when the platform has strong demand from advertisers. Platform infrastructure improvements that motivate creators to supply more content (e.g., development toolkits) must be chosen carefully to avoid creating higher concentration among a few powerful creators. Investments in first-party content are most consequential when the platform scale is small and when it has greater urgency to attract more viewers. I show that revenue sharing is (only partly) a tug of war between the platform and creators because a moderate sharing formula strengthens the overall ecosystem and profits of all participants. However, revenue-sharing tensions indicate a need to extend the one-rate-for-all creators approach with richer revenue-sharing arrangements that can better accommodate heterogeneity among creators. This paper was accepted by David Simchi-Levi, information systems.
Book
This book develops new theoretical perspectives on the economics and politics of innovation and knowledge in order to capture new trends in modern capitalism. It shows how giant corporations establish themselves as intellectual monopolies and how each of them builds and controls its own corporate innovation system. It presents an analysis of a new form of production where Google, Amazon, Facebook, Apple and Microsoft, and their counterparts in China, extract value and appropriate intellectual rents through privileged access to AI algorithms trained by data from organizations and individuals all around the world. These companies’ specific form of production and rent-seeking takes place at the global level and challenges national governments trying to regulate intellectual monopolies and attempting to build stronger national innovation systems. It is within this context that the authors provide new insights on the complex interplay between corporate and national innovation systems by looking at the US-China conflict, understood as a struggle for global technological supremacy. The book ends with alternative scenarios of global governance and advances policy recommendations as well as calls for social activism. This book will be of interest to students, academics and practitioners (both from national states and international organizations) and professionals working on innovation, digital capitalism and related topics. Bengt-Åke Lundvall is Professor emeritus in economics at the Department of Business Studies at Aalborg University and Professor emeritus at the Department of Economic History at Lund University. His research is organized around a broad set of issues related to innovation systems and learning economies. Cecilia Rikap is Lecturer in International Political Economy at City, University of London, CONICET researcher, and associate researcher at COSTECH, Université de Technologie de Compiègne. She has a PhD in Economics from the University of Buenos Aires, Argentina. Her research deals with the global political economy of science, technology and innovation.
Article
This article explores some of the critical challenges facing self-regulation and the regulatory environment for digital platforms. We examine several historical examples of firms and industries that attempted self-regulation before the Internet. All dealt with similar challenges involving multiple market actors and potentially harmful content or bias in search results: movies and video games, radio and television advertising, and computerized airline reservation systems. We follow this historical discussion with examples of digital platforms in the Internet era that have proven problematic in similar ways, with growing calls for government intervention through sectoral regulation and content controls. We end with some general guidelines for when and how specific types of platform businesses might self-regulate more effectively. Although our sample is small and exploratory, the research suggests that a combination of self-regulation and credible threats of government regulation may yield the best results. We also note that effective self-regulation need not happen exclusively at the level of the firm. When it is in their collective self-interest, as occurred before the Internet era, coalitions of firms within the same market and with similar business models may agree to abide by a jointly accepted set of rules or codes of conduct.