1 What are algorithmic cultures?
Jonathan Roberge and Robert Seyfert
The current, widespread dissemination of algorithms represents a double chal-
lenge for both our society and the social sciences tasked with studying and
making sense of them. Algorithms have expanded and woven their logic into the
very fabric of all social processes, interactions and experiences that increasingly
hinge on computation to unfold; they now populate our everyday life, from the
sorting of information in search engines and news feeds, to the prediction of per-
sonal preferences and desires for online retailers, to the encryption of personal
information in credit cards, and the calculation of the shortest paths in our
navigational devices. In fact, the list of things they can accomplish is rapidly
growing, to the point where no area of human experience is untouched by
them—whether the way we conduct war through ballistic missile algorithms
and drones, or the manner in which we navigate our love lives via dating apps,
or the way we choose how to dress by looking at weather forecasts. Algorithms
make all of this possible in a way that initially appears disarmingly simple.
Onewaytoapproach algorithms isthroughKowalski’snowclassicdenition:
“Algorithm = Logic + Control” (1979). Using both simple and complex sorting
mechanisms at the same time, they combine high- level description, an embedded
command structure, and mathematical formulae that can be written in various
programming languages. A wide variety of problems can be broken down into a
set of steps and then reassembled and executed or processed by different algo-
rithms. Hence, it is their versatility that constitutes their core capability and
power, which extends far beyond the mathematical and computer sciences.
According to Scott Lash, for instance, “a society of ubiquitous media means a
society in which power is increasingly in the algorithms” (2007, 71), an idea
echoed by Galloway when he states that "the point of power today resides in networks, computers, algorithms, information and data" (2012, 92). Yet, it is imperative to remain cautious with such formulations and their tendency to be too critical, too quickly. While they may capture important challenges that society faces with 'the rise of the algorithm,' they can also provide something of a teleological or deterministic "seductive drama," as Ziewitz has recently warned us (2016, 5).
Algorithms can actually be considered less sovereign than mundane in this
regard—that is, again, deeply rooted in the fabric of society. Rather than being
omnipotent, they are oftentimes ambiguous and quite messy. What is crucial,
then, is to bring into question how, and especially why, the apparent simplicity
of algorithms is in fact inseparable from their complexity, in terms of their
deploymentand multiple, interrelated ramications. These areepistemological
as well as ontological interrogations, confronting not only the social sciences but
society at large. As both a known unknown and an unknown known, the sorting
mechanism that is the algorithm still needs some sorting out.
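Kowalski's formula can be made concrete with a small sketch (invented data, illustrative only): the logic component is a declarative specification of what counts as a correct result, while the control component is the strategy that realizes it, and one logic admits several controls.

```python
# A sketch of Kowalski's "Algorithm = Logic + Control" (1979).
# The logic states *what* a correct result is; the control states *how*
# to reach it. One logic, two controls: two different algorithms.

def is_sorted(xs):
    """Logic: a declarative specification of the desired result."""
    return all(a <= b for a, b in zip(xs, xs[1:]))

def insertion_sort(xs):
    """Control strategy 1: insert each item into its place."""
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] <= x:
            i += 1
        out.insert(i, x)
    return out

def selection_sort(xs):
    """Control strategy 2: repeatedly extract the minimum."""
    xs, out = list(xs), []
    while xs:
        out.append(xs.pop(xs.index(min(xs))))
    return out

data = [3, 1, 2]
assert is_sorted(insertion_sort(data)) and is_sorted(selection_sort(data))
```

Both routines satisfy the same logic while differing entirely in control, which is precisely why a problem "broken down into a set of steps" can be "reassembled and executed" by different algorithms.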
 Thisintroductioniscertainlynotthe rst to stress the inherentdifcultyof
shedding light on algorithms. Seaver, for instance, observes how they “are tricky
objects to know” (2014, 2), while Sandvig insists on “the complexity of repre-
senting algorithms” (2015, 1; see also Introna 2016; Barocas et al. 2013). Con-
ceptually perspicacious as they are, these arguments do not, however, foreclose
the need to understand the extent of such invisibility and inscrutability. On the
surface,itis oftenthe‘blackbox’ natureofthealgorithms thatisrstevoked,
namely that they are incredibly valuable patented trade secrets for companies
such as Amazon, Google, Facebook, and the like. If they were revealed to non-
insiders, they would eo ipso be ruined. Or at least so we are told by numerous
technical, economic, legal, and political experts (Pasquale 2015). This is where
things noticeably start to get more serious and profound. There is not one box,
but multiple boxes. The opacity of algorithms is more precisely expressed in different forms of in-betweenness among a plethora of actors, both human and non-human. While a few
commentators have remarked upon the plural character of such opacity (Burrell
2016; Morris 2015), the fact remains that each and every algorithm can only
exist in rich and dense, if not tense, environments.
This is the inherently messy, vivid, and dynamic nature of algorithms, which
explains why they are ultimately so challenging to study. As Kitchin puts it, “cre-
ating an algorithm unfolds in context through processes such as trial and error,
play, collaboration and negotiation" (2014, 10). The latter term is of particular importance for algorithms. On the most fundamental level, they are what one can call anthropologically entrenched in us, their creators and users. In other words, there is a "constitutive entanglement" where "it is not only us that make them, they also make
us” (Introna and Hayes 2011, 108). Indeed, the problem with such mutual imbrica-
tion is that algorithms cannot be fully ‘revealed,’ but only unpacked to a certain
extent.Whatis more, they always nd themselves temporally entrenched, so to
speak. They come to life with their own rhythm, or, to use Shintaro Miyazaki’s
description in this volume, “they need unfolding, and thus they embody time” (p.
129). Another metaphor that proves useful in this regard is Latour's idea of circulation, whose paths may well be hard to follow or may even be imperceptible from time to time. The most
important point to make here is how practical and mundane they are. Again, they
unfold in a state of incessant negotiation and in- betweenness; for all algorithms, as
Seaver has noticed, there are “hundreds of hands reaching into them, tweaking and
tuning, swapping out parts and experimenting with new arrangements" (2014, 10).
The multiple ways in which algorithms unfold today thus give new meaning
to the familiar description, “the most profound technologies are those that dis-
appear” (Weiser 1991, 95). But there is more. We would like to take this oppor-
tunity to argue that such concrete unfoldings also give a new yet complex
meaning to what it is that algorithms actually do, i.e., the kind of agency and
performativity they embody. Of course, there is now a substantial tradition of
academicsworkingwithin this broadlydenedpraxiologicalparadigm,includ-
ing Lucas Introna (this volume, 2016, 2011), Adrian Mackenzie (2005), David
Beer (2013), and Solon Barocas et al. (2013). Somewhat aligning ourselves with
them, we invoke Andrew Goffey’s persuasive insight that “algorithms do things,
and their syntax embodies a command structure to enable this to happen” (2008,
17)—an insight almost as persuasive as Donald MacKenzie’s description of the
algorithm as “an engine, not a camera” (2006). Many things could be said about
such a position, and it will be important to come back to them in due time. It suffices for the moment to say that the agency of algorithms is a far cry from the
category of ‘action,’ if we understand by the latter something purposive and
straightforward. On the contrary, the type of agency involved here can be best
described as ‘fractal,’ that is, producing numerous outputs from multiple inputs
(Introna 2016, 24). What counts as ‘control’ in the algorithmic sense is in fact
relatively limited; there is so much more implied before, during, and after the
operation of algorithms. For instance, to both the anthropological and temporal
entrenchment discussed above, it appears necessary to add the concept of self-
entrenchment, whereby one algorithm is intertwined with many others in
extremely intricate networks. Non- human as much as human contributions are
thus key here, and could rather easily result in mismatches, unpredictable results,
or even dramatic failure—as will be seen later. It is as if algorithms themselves
are constituted by the very possibility of ‘being lost in translation,’ not only in
their relations to machines, code, or even some more discursive dimensions, but also in their relations to one another. The algorithm is performative by definition, and to be performative is to be heterogeneous in all circumstances (Kitchin 2014, 14–15; Seaver 2014). To be able to
carefully read such messy unfoldings constitutes a pressing challenge for the
social sciences in general, and for cultural sociology in particular. What does it
mean, indeed, if these unfoldings themselves become a particular object of
investigation? How is it that we could or should adapt in turn, with what kind of
precision, changes in focus, and so forth?
Now is an appropriate moment to assess the state of research on algorithms. This research has already reached a certain maturity, even if it was not until very recently that it started to migrate to the
humanities, social sciences, and cultural studies. Currently, there are several
promising cross- currents that more or less co- exist, but that do not yet properly
engage with one another. First, there are those authors developing almost stand-alone concepts: "the algorithmic turn" (Uricchio 2011), "algorithmic ideology"
(Mager 2012), “algorithmic identity” (Cheney- Lippold 2011), “algorithmic life”
(Amoore and Piotukh 2016), and the like. There are also signicant attempts
Technologies Studies (STS) and the Social Studies of Finance (MacKenzie
2015; Wansleben 2012), as well as embryonic efforts to develop Critical Algorithm Studies (The Social Media Collective 2015). In addition, there have been dedicated workshops and conferences in both North America and Europe, including 'Governing Algorithms' (Barocas et al. 2013) and the one
that gave rise to this book project (Ruhe 2014). Together, these different per-
spectives have raised crucial epistemological questions as to what would consti-
tute the most appropriate scope for studying algorithms. For instance, what
would be too narrow or too broad? And what constitutes the ideal distance to
study algorithmic culture, allowing for a critical reexivity without being
too detached or removed from the actual practice and operation of algorithms?
To this can be added the problems often associated with so- called ‘hot topics,’
that is, the pursuit of the ‘new’ for its own sake, and how to avoid falling into
the “trap of newness” (Beer 2013, 6–7; Savage 2009).
Conceptual innovation, in light of such questions and problems, might very
well mean returning to, and relying and building on older but more solid founda-
tions, which do in fact exist. What we propose in this introduction is thus to
revisit and modify Alexander R. Galloway’s classic intervention, which con-
strues ours as an age of algorithmic culture (2006). This idea of culture as
marked by the algorithmic resonates strongly with the encompassing yet established discipline of cultural sociology and its efforts 'to take meaning seriously,' that is, not as mere abstractions, but as something deeply rooted in reality, agency, and performativity.
Indeed, a cultural sociology of the algorithm is possible only insofar as algorithms are considered as both meaningful and performative, that is to say, performative for the very reason that they are meaningful, and vice versa. It is our
contention here that while the aforementioned perspectives are all signicant
contributions, they generate rather than obviate the need for thicker, deeper, and
more complex analyses of the kind of culture that algorithms are currently
shaping. As the title of this volume suggests, we want to engage with this pos-
sibility of an algorithmic culture by supplementing or contaminating it with
observations on pluralization.
The plurality of cultures in algorithmic cultures
Despite its theoretical potency, Galloway’s innovation was never fully
developed, and appears more inspirational than analytical. Of late, it is mostly
TedStriphas whohasled whathecalls“historico-denitional”effortsin deter-
mining what could more fully constitute such an algorithmic culture (2015,
2009; Hallinan and Striphas 2014; see also Roberge and Melançon forthcoming;
and to a lesser extent, Kushner 2013). And the way he puts things in perspective
hasa rather humanistic tone: “What doesculture mean, and what might it be
coming to mean, given the growing presence of algorithmic [recommendation]
systems [. . .]?” (Hallinan and Striphas 2014, 119). His attempt, in other words, is
gearedtowardsndingessential,ifnotontological,categories under the terms
“work of culture” or “world’s cultural heritage,” and their fundamental trans-
formation through automation. For Striphas, it is all of the circulation, sorting,
and classifying processes that are now dictated by “a court of algorithmic
appeal.” This too is a powerful notion; Striphas’s argument is worth mentioning
as it is epistemologically sound and captures the important stakes in this debate.
On the one hand, he never fails to acknowledge the dual nature of algorithmic
culture, or the way its semantic dimensions are inseparable from its more techni-
cal ones. On the other hand, he fully appreciates how the very ‘publicness’ of
culture is currently being black- boxed through processes of privatization, to
which we return below. The problem, small as it is, is elsewhere. If Striphas’s
arguments can be criticized at all, then it will be for their tendency to be relat-
ively abstract and broad. To say that we are witnessing a shift towards algorith-
mic culture does not necessarily have to be an all- encompassing theoretical
move. His idea of algorithmic culture remains one concept of one culture. In the
end, as much as it is meaningful and consistent, it struggles to recognize the
variety of algorithms today, and the ways they are fractal and heteronomous by
denition.Sohow do we proceed from here? How can we develop anunder-
standing of algorithmic culture that takes meaning seriously by being especially
attentive to its inherent performativity and messiness? One possible way is to go
even further back in time, to another seminal author who preceded Striphas and
Galloway. In the 1970s Michel de Certeau wrote La culture au pluriel, in which
heinsists thatanydenitionofculturewouldhavetoconceiveofitas un mul-
tiple (1974; translated by Conley as Culture in the Plural, 1998). While he could
nonetheless vital, and inspirational in this context. Indeed we are currently living
in the age of algorithmic cultures.
 Althoughdifculttorepresentinsimplelogicalterms,onethingcanbemany,
and multiple disparate things can be very commensurable. Such is an archipel-
ago—for instance, the Bahamas and the Philippines—to give a metaphorical
example. In the case of algorithmic cultures, it is necessary to make sense of
how a certain enclosure is nonetheless part of a larger whole. There are of course
many ways to explain such an enclosure; one that has become almost main-
stream in cultural sociology comes from the Yale School, which insists on giving
cultural realities a ‘relative autonomy’ in the way their terms are often dependent
on one another (see Alexander 2004, 1990; Alexander and Smith 2002, 1998). Algorithms possess such a routinized 'inside,' an internal or auto-referential logic of interrelated
meanings. They are a textual reality even before they are mathematical calcula-
tions; they crystallize imaginaries, hopes, expectations, etc. As Valentin Rauer
puts it later in this volume, “Algorithms are part of a broader array of performa-
tivities that includes, for example, rituals, narratives, and symbolic experiences”
(p. 142). As contingent normalizers and stabilizers, they have a symbolic life of
their own which, like texts, only makes sense in a particular context. Cultural
sociology rests here on what may constitute an original, yet very solid theoret-
ical ground. Jeffrey Alexander’s notion of “relative autonomy” resonates with
Lorraine Daston’s more recent narratological perspective, for instance, which
inquiresintothe specic“historyandmythology[...]ofthe algorithm”(2004,
362). To give a concrete example of how an algorithm, or a set of algorithms—a
networkoraspecicfamily,sotospeak—developsby, of, and for its own, our
contributor Lucas Introna has shown elsewhere how algorithms used to detect
an ‘original’ text. As algorithms can identify matching copies by fastening upon
suspicious chains of words, writers have adapted their style of writing. Plagi-
arism algorithms are thus only able to detect “the difference between skillful
copiers and unskillful copiers,” and thereby performatively and somehow para-
doxically produce the skillful copier as an ‘original’ author, resulting in an entire
culture surrounding the sale of ‘original’ essays and ghost- writing services
(Introna 2016, 36). Hence, instead of treating algorithms as mere utilitarian
devices,thestudyofalgorithmicculturesratheridentiesthemeaningfully per-
they do, culturally speaking? How do they make sense of their surroundings and
the different categories people use to interpret them?
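Introna's plagiarism example lends itself to a toy illustration. Detectors of the kind he describes fasten upon matching chains of words (n-grams); the sketch below, with invented sentences and a hypothetical trigram measure rather than any vendor's actual system, shows why the 'skillful copier' who rephrases every few words escapes detection while the verbatim copier does not.

```python
# Toy word-chain (n-gram) matching of the kind plagiarism detectors use.
# Sentences and the trigram measure are invented for illustration.

def ngrams(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(candidate, source, n=3):
    """Share of the candidate's word chains found verbatim in the source."""
    gc, gs = ngrams(candidate, n), ngrams(source, n)
    return len(gc & gs) / len(gc) if gc else 0.0

source   = "algorithms are deeply rooted in the fabric of society"
verbatim = "algorithms are deeply rooted in the fabric of society"
skillful = "algorithms are thoroughly embedded in the social fabric"

print(overlap(verbatim, source))   # 1.0: the unskillful copier is caught
print(overlap(skillful, source))   # 0.0: the skillful copier passes as "original"
```

The threshold effect is the point: the measure does not detect copying as such, only unbroken chains of words, which is what performatively produces the skillful copier as an 'original' author.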
As it turns out, one of the most salient points to be made in this introduction
revolves around algorithmic cultures as being un multiple. Nick Seaver offers a
similar argument when he notes that "rather than thinking of algorithms-in-the-wild as singular objects, [. . .] perhaps we should start thinking of them as a population to be sampled" (2014, 6). Algorithms are dynamic entities that mesh with their environments. Yet another appealing way to make sense of their relative autonomy and enclosure is
to borrow from the language of cybernetics (Totaro and Ninno 2014; Becker
2009). Feedback loops, decision-making by classification, continual adaptation,
and the exchange of information are all characteristics of recursive quasi- circular
routines that typify the non- linear unfolding of algorithms, as seen above. Göran
Bolin and Jonas Andersson Schwartz have recently given this idea a practical
spin, noting that
(a.) in their daily operation, professionals have to anticipate what the end-
user will think and feel; [. . . and that] (b.) many everyday users try to antici-
pate what the [. . .] media design will do to them, [. . .] which involves a
recourse back to (a.)
(2015, 8)
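One canonical instance of such a recursive, quasi-circular routine is PageRank: a page's score depends on the scores of the pages that link to it, so the computation must be iterated until the loop stabilizes. A simplified sketch over a tiny, invented three-page link graph (the damping factor of 0.85 follows the original formulation; everything else is illustrative):

```python
# Simplified PageRank by power iteration over a tiny, invented link graph.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
pages = list(links)
d = 0.85                           # damping factor, as in the original formulation
rank = {p: 1 / len(pages) for p in pages}

for _ in range(50):                # iterate until the circular scores stabilize
    new = {}
    for p in pages:
        # a page's score is fed by the scores of the pages linking to it
        inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new[p] = (1 - d) / len(pages) + d * inbound
    rank = new

# "c" collects links from both "a" and "b", so it ends up ranked highest.
```

The recursion is not incidental but constitutive: each score only makes sense in relation to all the others, a small formal echo of the mutual anticipation described above.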
Google could serve as a prime example here. Complex and multivalent, there
exists, as our collaborator Dominique Cardon calls it, something like a unique
“PageRank spirit” (2013; see also in this volume), in which symbolic as well as
performative aspects are constantly interacting. Such a spirit is easy to spot in
the cyclical anticipation of needs, the satisfaction of experience, and the person-
alization of navigation, all typical of the contemporary search engine. It is also
evident in the implementation of sophisticated algorithms over the years—such
as Panda, Penguin, Hummingbird, and Pigeon—and how they have helped in the
on- going struggle against the polluting power of search engine optimization (see
Röhle 2009). Lastly, this particular spirit is present in how Google has tried to
nda balance between its sense ofnatural, meritocratic indexing and its own
commercial needs, which then serve to subsidize its more futuristic technolo-
gical endeavors. Not only are these three examples recursive in themselves, but
theyalsoendup swirling togetherandinuencingoneanotherto create adis-
tinctive, powerful, and meaningful algorithmic culture. This is precisely Goog-
le’s own “culture of search” (Hillis et al. 2013) or, to put it more bluntly, the
“Googleplex” (Levy 2011). Is this to say that the company has no sense of what
is going on outside? Certainly not. Rather, this particular culture can co- operate
with others, and may even coincide with others in many respects, but it does not merge with them. The study of algorithmic cultures, in other words, should be able to zoom in and zoom out, to see the particularities of each algorithmic culture as much as what they also have in common. Examples of this abound: individuality and reaching, particularity and
sharing, distinctiveness and commensurability, small and big picture. For algo-
rithmic cultures can of course cut across various social, economic, and political
spheres; for instance, when a particular usage of predictive algorithms in the
stock market borrows its probabilistic methods from games of chance, transporting them to its own needs. Or when developments in artificial intelligence are derived from computer algorithms in the game of chess, thereby shaping the very future of artificial intelligence for years to come (Ensmenger 2012). Thus, algorithmic cultures rely on mobile methods that are adapted, transformed and made to measure for each particular use. In fact, this entire volume serves as proof of this argument. Each
chapter develops a unique take on what it means for algorithms to be culturally
entrenched and performative; each of them explores the density extending from
aparticular assemblage or ecology by proposinga specic interpretation. The
exactdescriptionofthechapters’contentswillcomeinamoment,but sufce
now to say that it also falls on the reader to navigate between them, to ask the
questionss/hejudgesappropriate,andto wrestle with the different intellectual
possibilities that are opened up.
To argue that algorithmic cultures are un multiple still opens, rather than forecloses, the question of what constitutes their variable yet common nature. There must be something; indeed,
algorithms revolve around a question or an issue that is each and every time par-
ticular but nonetheless always similar. We want to suggest here, as others have,
that such important stakes constantly bring about and thus recycle “the power to
enable and assign meaningfulness” (Langlois quoted in this volume in Gillespie
2014; see also Roberge and Melançon forthcoming). This is a question as old as
the idea of culture itself, and the social sciences have been aware of it for their
entire existence too, from the moment of their founding until today (Johnson et
al. 2006). Culture needs legitimacy, just as algorithms and algorithmic cultures
need legitimacy. It is about authority and trust; it is about the constant intertwin-
ing of symbolic representation and more prosaic performance, the production as
well as the reception of discursive work. In our current day and age, we are wit-
nessing the elaboration of a kind of ‘new normal’ in which algorithms have
come to make sense in the broader imaginary; they are ‘accepted’ not because
they refer to something transcendent in the classical sense, but because they have
developed such acceptability in a newer, more immanent way. Scott Lash’s
insight regarding algorithms’ principle of “legitimation through performance” is
fundamental in this regard (2007, 67). In their actual real- time unfolding, algo-
rithms implicitly or explicitly claim not only that they are cost- effective, but
moreover objective, in both an epistemological and a moral sense. Again, this claim unfolds in an enclosed routine that in fact says very little: algorithms work straightforwardly, they provide solutions, etc. Neutrality and impartiality are whispered and
tacitly assumed. Tarleton Gillespie notes something similar when he underscores
that “more than mere tools, algorithms are also stabilizers of trust, practical and
symbolical assurances that their evaluations are fair and accurate, free from subjectivity, error, or attempts at influence" (Gillespie 2014, 179; see also Mager
2012). That is the magic of something non- magical. Objectivity as an informa-
tion process, a result, and a belief is the equivalent of legitimacy as the result of
a form of belief. The strength of algorithms is their ability to project such objec-
tivity to the outside world (to what is in their rankings, for instance), while accu-
mulating it ‘inside’ the algorithms themselves as well. This is because any
provider of value ought to be constructed in a way that is itself valued. Gillespie
is astute on this point, noting that "the legitimacy of these functioning mechanisms must be performed alongside the provision of information itself" (2014,
179). Here legitimacy acquires an ontological dimension.
This is not to say that the quest for legitimacy is an easy endeavor—quite the
contrary. Performance and justication exist only insofar as they can nd an
audience, to the point in fact where the ‘reception’ part of the equation is just as
important. The problem, of course, is that such reception is inherently cultural
and constituted by interpretation, expectation, affect, speculation, and the like
(Galloway 2013; Seyfert 2012; Kinsley 2010). Reception, in other words, is never guaranteed. "Legitimation through performance" is for this reason nothing less than a steady negotiation, in terms close to those discussed above. Performance and reception interweave in such a way that algorithms cannot foreclose the possibility of contestation. The hopes and desires they raise can always be disappointed; if legitimation is performative, so too is criticism. The controversy that erupted around
Google Glass is a case in point. Research into the Glass Explorer program offers a glimpse into the corporate planning for wearable computing (Roberge and Melançon
forthcoming). For example, to give Google Glass a broader appeal, the company
hired a Swedish designer to help design the device, including its color palette
and minimalistic contours (Miller 2013; Wasik 2013). Regardless, the critical
response was negative, noting that Glass is “so goddam weird- looking,” “ugly
and awkward,” and makes interaction “screamingly uncomfortable” (Honan
2013; Pogue 2013). Social and cultural discomfort with this new form of inter-
action helps explain the algorithmic device’s critical reception. In the end, it was
the pejorative term "glasshole," symptomatically blending aesthetic and moral judgments, that forced Google to withdraw the device. What this example thus shows is how ambiguous various meanings and interpretive conflicts, as well as the algorithmic cultures
they shape, end up being. Messiness is not an option; it is an ongoing and trans-
formative characteristic.
Algorithmic trafc: calculative recommendation, visibility
and circulation
The key idea behind this volume on algorithmic cultures is that such cultures are
plural, commensurable, and meaningfully performative. The purpose here is to
offer a “thick description” à la Geertz (1973), i.e., an analysis of different routi-
nized unfoldings that revolve around rich and complex stakes and issues. Legiti-
macy is certainly one of these. Everyday life is full of occasions where this
question is not raised, but here the stakes are tremendous, as they encroach on
some sort of cultural core. Algorithms are sorters; they are now key players in
the gatekeeping mechanisms of our time (Hargittai 2000). To be sure, gatekeep-
ing has been around for a long time, from the arts patrons of the classical age to
modern-daynewspapercritics. Butthisonlystrengthenstheargument:therole
played today by algorithms still adheres to a prescriptive selection of ascribing
value, for a particular audience, with all of the attendant moral and political
valences. Gatekeeping is about making editorial choices that others will have to
deal with. It is about taste and preference- making, which explains, at least in
part, why many recommendation algorithms are so inuential today, from
It is about the visibility of culture, and of particular forms of culture that
algorithmically nds its audience. These systems shape cultural encounters
and cultural landscapes. They also often act and make taste visible. The
question this creates is about the power of algorithms in culture and, more
specically, the power of algorithms in the formation of tastes and
(Beer 2013, 97, emphasis added)
Two recent articles in particular have captured this trend and how it has evolved
inspecicsettings,onein terms of lm(HallinanandStriphas2014),andthe
other in music (Morris 2015). Netix, and specically the Netix Prize, is
emblematic in many regards; launched in 2006, the contest offered US$1 million
to whoever could rst boost the accuracy of their recommendation algorithm
over the benchmark of 10 percent. As the challenge was a huge success among
computer scientists in the U.S. and abroad, it represents for Blake Hallinan and
Striphas a prime example of how "questions of cultural authority are being displaced significantly into the realm of technique and engineering" (2014, 122).
Yet this is only one half of the equation. The other half deals with the logic or
the economic purpose enabling such a quest for personalized recommendation,
something the authors call a “closed commercial loop,” in which “the production
of sophisticated recommendation produces greater customer satisfaction which
produces more customer data which in turn produce more sophisticated recom-
mendations, and so on” (122). Where information processing becomes key, the
meaning of culture drifts toward simpler views on data, data-mining, and the like. Jeremy Wade Morris makes a parallel argument about music streaming, taking as his example The Echo Nest, the data analytics firm acquired by Spotify in 2014. The management of massive databases and new behavioral tracking techniques, by those that Morris calls "infomediaries," now relies on the mining of "tastes" (2015, 456). This is the case because it essentially opens the door to
“highly segmented and targeted advertising opportunities” (455). This logic or
trend is indeed very strong, though it is not the only one at play. Morris’s argu-
ment is subtle enough to recognize the pervasiveness of human- maintained play-
lists as a mode of alternative curation that most of today’s platforms are unable
to let go of. These human- to-human taste dialogues, so to speak, still exist in
most music streaming services as a way to cope with the abundance of content.
Both automated and so- called ‘manual’ gatekeeping mechanisms thus co- exist
more or less side by side in a sort of complex, if tacit and very delicate, tension.
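The recommendation techniques the Netflix Prize rewarded belong broadly to the family of collaborative filtering. The following is a minimal, hypothetical sketch with invented ratings, not Netflix's actual system: a missing rating is predicted as a similarity-weighted average of other users' ratings.

```python
# Toy user-based collaborative filtering over invented ratings:
# predict a missing rating from the ratings of similar users.
from math import sqrt

ratings = {
    "ana":  {"film1": 5, "film2": 4, "film3": 1},
    "ben":  {"film1": 4, "film2": 5, "film3": 2, "film4": 5},
    "cara": {"film1": 1, "film2": 2, "film3": 5, "film4": 1},
}

def similarity(u, v):
    """Cosine similarity over the films both users rated."""
    shared = ratings[u].keys() & ratings[v].keys()
    if not shared:
        return 0.0
    dot = sum(ratings[u][f] * ratings[v][f] for f in shared)
    nu = sqrt(sum(ratings[u][f] ** 2 for f in shared))
    nv = sqrt(sum(ratings[v][f] ** 2 for f in shared))
    return dot / (nu * nv)

def predict(user, film):
    """Similarity-weighted average of other users' ratings for film."""
    others = [u for u in ratings if u != user and film in ratings[u]]
    weights = [similarity(user, u) for u in others]
    return sum(w * ratings[u][film] for w, u in zip(weights, others)) / sum(weights)

# ana's taste tracks ben's, so the prediction for the unseen film4
# leans toward ben's rating of 5 rather than cara's 1 (roughly 3.6).
```

The 'closed commercial loop' lives in exactly this arithmetic: every new rating sharpens the similarities, which sharpen the predictions, which elicit more ratings.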
The data- intensive economy and culture that is currently taking shape is also
of interest to Lucas Introna in his contribution to our volume. By tracing the
genealogy of online advertising, he analyzes recent forms of what he calls “algo-
rithmic choreography.” While traditional online advertisements indiscriminately
place ads on sites that all users will encounter—a banner on the top of a
webpage, for instance—more innovative brokers such as Dstillery adapt to what
they perceive as the needs of the individual. Data- mining, behavioral targeting,
contextual advertising, machine-learning algorithms, and the like are thus all part of a choreography in which subjects are addressed through personalized advertisements. Time and again, it
is about addressing “the right person at the right time with the right creative
content” (p. 41). Such a choreography requires and enacts particular forms of
subjectivity, which Introna calls “impressionable subjects,” i.e., subjects that are
willing to be impressed by the information the algorithm has prepared for them at
any given moment. In one way of reaching customers in an online advertisement
called “prospecting,” data are collected from user activities on the spot (through
clicks, queries, etc.). From such data, correlations can be derived and users can be identified as likely to be interested in the same products as another user who visited similar sites. On the one hand,
in algorithmic cultures the subject is treated as a mere statistical entity, a branded
subject. On the other, subjects are not entirely passive, but rather are actively
engaged in the selection of information they see and how they are shaped by it;
they partially curate what they are going to see (and perhaps buy) through their
own behavior. Thus, user behavior and online advertising become deeply cul-
tural and social affairs because they either enact subjects or fail to connect with
them. Introna shows how in their own way algorithmic cultures are un multiple,
that is, very generic but at the same time very personal. A failed advertisement
is not only a missed opportunity, but can also question and insult the
subject (‘Why am I seeing this?’).
In his contribution, Tarleton Gillespie investigates the complexity and hetero-
geneity of automated gatekeeping by addressing the rich yet understudied sub-
category of trending algorithms. Indeed, these are everywhere today, from
Buzzfeed to Facebook and Twitter; they are an icon of a new genre. Gillespie's fine-
grained analysis thus starts by asking not what algorithms do to cultural artifacts,
but instead “what happens when algorithms get taken up as culture, when their
kinds of claims become legible, meaningful and contested” (p. 69). Such algo-
rithms appear as a measurement ritual, but of exactly what is less clear. Is it a
glimpse into the popularity of different content, as was American Top 40 or Bill-
board?Is it a small window into‘us,’ with the attendant problem ofdening
exactly who this ‘us’ is—a public, a nation, etc.? Or is it simply about capturing
some sort of pulse, velocity and movement in between undisclosed and thus
incalculable points? Surprisingly, all these difculties are fueling, rather than
extinguishing, the urge to measure and position measurement as a meaningful
accomplishment. In other words, trending algorithms are popular because they
are inherently ambiguous. In addition, real and practical biases are numerous, as
if they were inscribed in the very DNA of these algorithms. According to
Gillespie, this has to do with the black box character of most social media plat-
forms. More important, however, is the fact that biases are above all interpreta-
tions of biases, in the way that they depend on the expectations, hopes, and
desires of those who care enough. Validity is a cultural question in this regard.
For instance, many have criticized Twitter and Facebook for the triviality of
their trends, while at the same time often underscoring that their own favorite
‘hot topic’ was not appearing. Controversies related to trending algorithms are
simply not about to vanish. They emerge from time to time, depending on dif-
ferent places, people and issues, as symptoms of something deeper.
Gatekeeping, as has become clear, represents an issue of both representation
and authority: as a mediation of pretty much everything cultural, it has been
fundamentally transformed by the dissemination of algorithms. The challenge to
the authority-trust nexus
of all gatekeeping mechanisms is thus as signicant as those mechanisms are
constant. For the social sciences, too, this represents a substantial challenge, one
that forces us to develop new holistic understandings as well as new and more
empirical analyses (Kitchin 2014; see also Ruppert et al. 2013). In their contri-
bution to this volume, Jean-Samuel Beuscart and Kevin Mellet offer an excellent
example of the latter. They study consumer rating and review sites as a now
more-or-less standardized, if not ubiquitous, tool on the Web. What their
findings show, however, is that the massive presence of such
platforms is not antithetical to a sense of agency among users, and that the latter
has given rise to a rich and interesting negotiation among actors, both human
and non- human alike. Frequent writers of reviews, for instance, are indeed
moved by a non-negligible dose of reexivity. According to Beuscart and
Mellet, “at least part of the effectiveness of this phenomenon is the ability of
users to build a coherent pattern of use that regulates their evaluation behavior to
work towards a collective aim” (p. 90). Self-esteem thus derives from a sense
that somehow there exists a form of readership that also forms a rational and
socialized judgment. This might create a distant image of what constitutes a col-
lective intelligence, and such an image is active enough to be considered real.
Not to be forgotten is the question of whether the actual fragmented nature of
recommendation algorithms constitutes un multiple. Different calculation rou-
tines clearly produce different outcomes, and from there it becomes important to
assess what this could mean, both ontologically and epistemologically. Putting
things in such a perspective is the task Dominique Cardon sets for himself in his
contributiontoourvolume.Heproposes,inessence,aclassicationof classi-
catory principles, focusing on the ways that they are not simply and straight-
forwardly dependent on economic forces, but also on one another, by way of
relation, opposition, comparison, etc.—a conceptual move closely linked with
Alexander’s “relative autonomy of culture,” as seen above. Cardon discusses
four types of calculation and the ways they inform the “competition over the best
waytorankinformation”: beside the Web, as a calculation of views and audi-
ence measurement; above the Web, as a meritocratic evaluation of links; within
the Web,asameasureoflikesandpopularity;andnally,below the Web, as the
recording of behavioral traces that allows for more tailored advertising. These
four types reveal very different metrics, principles, and populations to be
sampled, and yet they are commensurable in that together they inform a “sys-
temic shift” in how society represents itself. “Digital algorithms,” writes Cardon,
“prefer to capture events (clicks, purchases, interactions, etc.), which they
record” (p. 106). Statistics as we used to know them, such as those relying on
large variables like sex and race, are being replaced with more precise and indi-
vidualized measurements. In turn, society appears as an increasingly hetero-
geneous ex- post reality, the best explanation of which might be that there is no
real, fundamental, or comprehensive explanation—with all the consequences
that this entails for the social sciences.
From algorithmic performances to algorithmic failures
Instability, fragility and messiness all gesture at the praxiological character of
algorithmic cultures. In contrast to the dominant paradigm of computer science,
which describes algorithms as procedural and abstract methods, we conceptual-
ize algorithms as practical unfoldings (Reckwitz 2002). Galloway, in his seminal
essay, already points to the pragmatic aspect of algorithmic cultures: “to live
today is to know how to use menus” (Galloway 2006, 17). As users, when we
operate in algorithmic cultures, we operate algorithms. For instance, the hand-
ling of software menus is a practice (interactions and operations with others,
human and non-human alike) in which we manage algorithmic devices: we
schedule meetings on our online calendar, set up notications on emails,
program our navigational devices to lead us home, etc. We activate and deacti-
vate algorithms to govern our daily life. Thus, algorithms are not so much codes
as they are realizations of social relations between various actors and actants.
As practices, algorithms are distinguished by recursive and very entrenched
routines. Algorithms are supposed to help in the performance of repetitious
tasks; they implement activities for reduced cognitive and affective investment,
and thereby make it possible to focus on more important and perhaps more inter-
esting tasks. The analysis of algorithms as routines (or routine practices)
accounts for deviations from the mathematical and technical scripts, deviations
that emerge from various sources, such as a failure in design, incomplete imple-
mentation, and the messiness of operations or interactive effects between dif-
ferent algorithmic and non-algorithmic actants. This is something computer
science can barely do, as it is in its DNA, so to speak, to dene algorithms
through precision and correctness. Computer scientists accept deviations only in
human routines, and thus foreclose the possibility that not every repetition is
identical; rather, each iteration of the routine introduces little deviations in each
step (Deleuze 1994). We would even go so far as to say the discourse of the dis-
cipline of computer science conceptually excludes algorithmic practices, and
hence the possibility of their deviations from the script. For cultural sociology,
the assignation of deviations exclusively to humans seems problematic. The
notion of an algorithmic precision and correctness seems to be rather part of the
tale of an algorithmic objectivity discussed above, a quest for a higher ration-
ality, where algorithms act autonomously and supersede human routines. In this
tale, algorithms promise an identical repetition that allows for easy modeling and
precise predictions. However, such imaginaries of algorithmic cultures, their
promises and dreams, have to be distinguished from algorithms in practice.
In algorithmic cultures, we witness changes of social relations, for instance
the emergence of highly customized relations. In his contribution to this volume,
Joseph Klett gives an example of the transition from digital stereo to "immersive
audio." Stereo (as in traditional stereo speaker systems) operates with generic
relations: each audio speaker establishes a fixed relation to a 'user,' which really
is an invariant sensory apparatus located at a fixed point in space (the so-called
'sweet spot').
In contrast, relations in algorithmically realized soundscapes are highly person-
alized. Klett shows how audio engineering, as with many other technological
apparatuses, is moving from the use of algorithms as general mediators to
algorithms tailored to singular individuals. Such personalization allows for a
much richer audio experience, in which the sound is continuously adapting to
our individual perspective. Inevitably, the
transition from generic relations to dynamical adaptive relations through algo-
rithms has consequences for social life. By adapting to individual bodies and
subjects, personalization algorithms also change the very nature of social rela-
tions, disentangling and cutting off some relations and creating new ones. Per-
sonalization algorithms in noise-cancelling headphones are an example of such
disconnections; they deprive social relations of acoustic communication. Thus,
personalization algorithms create enclosures around the subjects where “the
body becomes a part of the audio system” (p. 116). Together, body and device
create a closed algorithmic culture.
In this day and age, algorithmic relations are not only enacted by and with
humans, but also by and with algorithms themselves. There are indeed endless
chains of algorithms governing one another. Understanding such relations will cast
doubt upon the purported antagonism between humans and computer algorithms,
between humans and algorithmic routines—antagonisms endemic to the propos-
als of computer science, approaches that generate notions like algorithmic objec-
tivity and pure rationality. The crafted imaginary that reiterates and relies on
this antagonism (in events such as Kasparov vs. Deep Blue) ignores human
immersion in algorithms
(such as the programmers’ immersion in Deep Blue—their tweaking of the pro-
gramming between matches to adjust to Kasparov’s play). It bears repeating that
the denition of algorithms as formal procedures focuses only on precise and
identically repeatable processes, while the examination of practices and perform-
ances takes into account deviations and divergences. Unstable negotiations, slip-
page, fragility, and a proneness to failure are in fact important features of
algorithmic cultures. In ‘real life,’ algorithms very often fail, their interactions
and operations are messy. This is particularly true when they tumble in a sort of
in-betweenness among other actors (algorithmic or not), where they tend to
deviate from their initial aim as much as any other actant.
The emergence of failures has to do with the complexity of interactions. Inter-
actions that are not only face-to-face or face-to-screen, but that also take place
within complex assemblages, contribute to the production of errors and bugs.
Countless examples of such failures can be found, from the (mis)pricing of “Ama-
zon’s $23,698,655.93 book about ies” (Eisen 2011), to the demise of Knight
Capital, an algorithmic trading company that lost about US$400 million in a
matter of 45 minutes due to a malfunctioning trading algorithm (SEC 2013, 6).
Consequently, the everyday use of algorithms results in a mixture of surprise and
disappointment. The astonishment often expressed when Amazon’s recommenda-
tion algorithms correctly predict (or produce) our taste, and directly result in a
purchase, goes hand in hand with complaints of how wildly off the mark they are.
We have come to expect failing algorithmic systems and we have indeed become
accustomed to dealing with them. Making fun of such failures has become a genre
of its own: “… #PrimeDay deals and am not interested in any of them” (Davis 2015).
In his contribution to our volume, Shintaro Miyazaki explains the avalanch-
ing effect of “micro-failures” in algorithmic cultures. He shows how something
that might seem minuscule, irrelevant, a small divergence in code, an almost
indiscernible misalignment, can be leveraged to catastrophic results in algorith-
mic feedback processes. Miyazaki’s historical case study of the AT&T Crash
from 1990 shows that such failures have been part of algorithmic cultures from
very early on. In this case, a software update in AT&T’s telephone network
created a feedback loop in which the entire system created an unstable condition
from which it was not able to recover. While separate subsystems contained
emergency routines that enabled each to automatically recover from cases of
malfunction, the algorithmic feedback loops across subsystems caused interact-
ing algorithms to turn one another off. This resulted in an algorithmic network
with unproductive operations, which stem from what Miyazaki calls “distributed
dysfunctionalities” (p. 130).
If we were to take seriously the fact that failure is an inevitable part of algo-
rithmic life, then Miyazaki’s analysis of “distributed dysfunctionality” has a
further implication—namely, that distributed dysfunctionality may in fact be a
process where a network of algorithms inadvertently creates a higher form of an
ultimate machine. The prototypical ultimate machine was created by Claude E.
Shannon; Arthur C. Clarke described it thus:
Nothing could look simpler. It is merely a small wooden casket the size and
shape of a cigar- box, with a single switch on one face. When you throw the
switch, there is an angry, purposeful buzzing. The lid slowly rises, and from
beneath it emerges a hand. The hand reaches down, turns the switch off, and
the buzzing ceases, and peace reigns once more.
(Clarke 1959, 159)
Because of its particular functionality, the ultimate machine was also named the
useless machine or leave me alone box. The case described by Miyazaki may be
understood as a more complex version of such a machine. In fact, it was not a
single machine that turned itself off, but rather a chain of machines performing
algorithmic interactions, so that each machine turned its neighbor off, right at the
moment when the neighbor’s recovery operation had been completed. While the
original ultimate machine turns only itself off, a distributed dysfunctionality
spreads this function across the network, creating a stable instability that
requires non-algorithmic actors to end those dysfunctional and non-
productive routines. This is a case of an algorithmic practice where algorithms
start to act and interact according to a pattern that had not been inscribed into
them, making them essentially unproductive. One might describe such a machine
as an algorithmic Bartleby, where the demand to initiate routines is countered by
the algorithmic expression I would prefer not to. Such a description has perplexing
implications for the understanding of algorithms as routinized unfolding. As
much as Bartleby’s refusal affects the
daily routines at work, algorithmic dysfunctionality also addresses those rou-
tines, undermining them and making them unproductive.
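The mutual switching-off that Miyazaki reconstructs can be caricatured in a few lines of code. The sketch below is a toy ring of switches, not AT&T's actual 4ESS software; every name and number in it is invented for illustration. Each node's local recovery routine works perfectly, yet the recovery signal is exactly what trips the fault handler of the next node along the ring.

```python
# Toy model of "distributed dysfunctionality": a ring of switches in
# which a down node always recovers, but its recovery signal crashes
# its right-hand neighbor. Illustrative only; not the 1990 AT&T code.

def simulate(n_nodes: int, steps: int) -> list[tuple[bool, ...]]:
    """Return the up/down state of every node after each step."""
    up = [True] * n_nodes
    up[0] = False                     # a single software fault starts the cascade
    history = []
    for _ in range(steps):
        nxt = []
        for i in range(n_nodes):
            left = up[(i - 1) % n_nodes]
            if not up[i]:
                nxt.append(True)      # local emergency routine: recover
            elif not left:
                nxt.append(False)     # neighbor's recovery signal crashes this node
            else:
                nxt.append(True)      # otherwise, keep running
        up = nxt
        history.append(tuple(up))
    return history

history = simulate(4, 12)
# At every step at least one switch is down: a "stable instability".
```

The fault never heals; it simply rotates around the ring. This is the sense in which the dysfunctionality is distributed rather than located in any single node, and why only an intervention from outside the loop can end it.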
Cases of unstable algorithms are not unusual. In algorithmic trading, it is not
uncommon for traders to have to force algorithms out of unstable conditions. For
instance, software bugs or feedback loops might cause an algorithm to icker
around thresholds, where it continuously places and cancels orders, etc. (Seyfert
have also argued that many unusual market events can be explained by such non-
productive routines (Johnson et al. 2012; Cliff et al. 2011; Cliff and Nothrop 2011).
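Such flickering can be sketched in a few lines. The prices and trigger below are hypothetical, and no real trading system or bug is reproduced; the point is only that a rule without hysteresis converts market noise at the threshold into an endless place/cancel churn.

```python
# Toy illustration of an algorithm "flickering" around a threshold:
# place an order whenever the price ticks above TRIGGER, cancel it
# the moment the price ticks back below. Hypothetical numbers only.

TRIGGER = 100.0

def churn(prices: list[float]) -> int:
    """Count order placements and cancellations for the naive rule."""
    actions = 0
    order_live = False
    for p in prices:
        if p > TRIGGER and not order_live:
            order_live = True         # place an order
            actions += 1
        elif p <= TRIGGER and order_live:
            order_live = False        # cancel it again
            actions += 1
    return actions

# Noise of one tick around the trigger produces one action per tick ...
noisy = [100.01 if i % 2 == 0 else 99.99 for i in range(1000)]
# ... while a clear move above the trigger produces a single placement.
calm = [101.0] * 1000
```

The standard repair is hysteresis, i.e., separate entry and exit thresholds; it is precisely this kind of deviation-handling that the formal definition of the algorithm does not anticipate.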
To give an example, an initial analysis of the Flash Crash of 2010 suggested that
such non-productive algorithmic interactions might have been the culprit. The Flash
Crash describes a very rapid fall and consecutive recovery in security prices. The
Joint Report by the Commodity Futures Trading Commission and the Securities
and Exchange Commission describes the event:
At about 2:40 in the afternoon of May 6, prices for both the E-Mini S&P
500 futures contract, and the SPY S&P 500 exchange traded fund, suddenly
plunged 5% in just 5 minutes. More so, during the next 10 minutes they
recovered from these losses. And it was during this recovery period that the
prices of hundreds of individual equities and exchange traded funds plum-
meted to ridiculous levels of a penny or less before they too rebounded. By
the end of the day everything was back to ‘normal,’ and thus the event was
dubbed the May 6 Flash Crash.
(CFTC and SEC 2010a, 3)
According to this Joint Report, high-frequency traders (relying on algorithms)
began to quickly buy and then resell contracts to each other—generating a
‘hot potato’ volume effect as the same positions were rapidly passed back
and forth. [. . .] HFTs traded over 27,000 contracts, which accounted for
about 49 percent of the total trading volume, while buying only about 200
additional contracts net.
(CFTC and SEC 2010a, 3)
This hot potato effect is another iteration of distributed dysfunctionality, an
unproductive routine that inadvertently subverts the productivity paradigm of
financial markets.
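The arithmetic of the 'hot potato' can be made vivid with a deliberately simplified sketch: a single lot bounced between two traders. The 100-contract lot and the 135 round trips are invented so that the toy volume matches the 27,000 contracts of the Joint Report; nothing else about the actual trading is modeled.

```python
# Toy "hot potato" loop: two traders pass the same lot back and forth.
# Traded volume explodes while net inventories barely change: activity
# without productivity. All quantities are illustrative assumptions.

def hot_potato(round_trips: int, lot: int = 100) -> tuple[int, int, int]:
    """Return (total volume, trader A's inventory, trader B's inventory)."""
    inv_a, inv_b = lot, 0             # trader A starts holding the lot
    volume = 0
    for _ in range(round_trips):
        inv_a -= lot; inv_b += lot; volume += lot   # A sells to B
        inv_b -= lot; inv_a += lot; volume += lot   # B sells back to A
    return volume, inv_a, inv_b

volume, inv_a, inv_b = hot_potato(135)
# 27,000 contracts of traded volume, yet both inventories end exactly
# where they began.
```

The ratio in the report, roughly 27,000 contracts traded against about 200 contracts of net buying, is the signature of this kind of loop: enormous gross activity with almost no net result.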
One reason for the emergence of failures in algorithmic practices has to do
with the fact that interactions with and among algorithms often tend to be misun-
derstood. In his contribution, Valentin Rauer shows in two case studies the
problems in assessing algorithmic agency. In algorithmic cultures, traditional
interactions through deictic gestures have been replaced by what Rauer calls
“mobilizing algorithms.” While face-to-face interactions allow for deictic ges-
tures such as this or you, interactions over distance require intermediaries.
Mobilizing algorithms have become such intermediaries, operating to a certain
extent autonomously. Examples are automated emergency calls that serve as
functional equivalents to deictic gestures (Mayday! Mayday!). Rauer shows that
the introduction of such algorithmic intermediaries leads to varying scales and
ranges in capacities to act. Such scaling processes make the notion of a purely
autonomous (or purely dependent) actor untenable: autonomy and de-
pendence are thresholds, or rather each constitutes a limit that is never fully
reached in either humans or algorithms. But in public discourse, such scales of
agency are ignored and obfuscated by strong imaginaries. The problems with
these imaginaries become especially visible at the moment of algorithmic break-
downs. Rauer illustrates this with the case of a “missing algorithm” that ulti-
mately led to the failure of the Euro Hawk drone project. In this particular
circumstance, a missing algorithm caused the drone to y on its rst ight
“unguided and completely blind, posing a real threat to anything in its vicinity”
(p. 146). That particular algorithm was ‘missing,’ not as a result of an uninten-
tional error, but rather, because the drone was supposed to be guided—that is,
governed—by an acting human. Thus, the prototype of Euro Hawk operated
with a strong notion of human agency—an agency that always masters its crea-
tions—while the agency of the drone was underestimated. The missing algorithm
shows that failures and messiness are crucial to algorithmic practices.
Paradoxical as it seems, a missing algorithm is part of the messiness in algo-
rithmic practices, a messiness that is also the reason for the promises and dreams
inherentinalgorithmiccultures. That istosay,thefulllmentof this dreamis
always one step away from its completion. There is always only one more algo-
rithm yet to be implemented. In other words, it is only such constant algorithmic
misalignments that explain the existence of promises and hopes of a smooth
algorithmic functionality. If everything were functioning smoothly, these prom-
ises would be superuous and would simply disappear. Strictly speaking, the
nomy and the hope of a higher rationality, makes sense especially in contrast to
constant failures.
Furthermore, misalignments and failures in algorithmic cultures are not only
due to missing algorithms and bugs, but may precisely be attributable to the mis-
match between the expectations of algorithmic rationality, agency, and objectiv-
ity inscribed in the codes on the one hand, and actual algorithmic practices on
the other. When algorithms enter into socio- technical assemblages they become
more than just “Logic + Control.” Thus, a cultural analysis of algorithms cannot
just include the technical niceties of codes and technical devices, i.e., their tech-
nical functionalities; it will also need to focus on the complex of material cul-
tures, technological devices and practices. Hence, it is problematic when
contemporary studies of algorithms primarily focus on the creepiness and
suspicious nature of algorithms, which are hinted at in conference titles such as
“The Tyranny of Algorithms” (Washington, December 2015). Such perspectives
not only ignore the very mundane nature of the disappointments caused by algo-
rithms but also the logical dynamics between promise and disappointment oper-
ating in algorithmic cultures. These studies tend to conate the industries’
imaginaries of rationality, autonomy, and objectivity with actual practices. They
(mis)take the promises of those who construct and, most importantly, sell these
systems for the realities of algorithmic cultures. Where they should be analyzing
the ‘legitimation through performance’ of algorithmic cultures, they end up criti-
cizing imaginaries and their effects, irrespective of the praxiological processes
of actualization (or non-realization) of these imaginaries. In their preferred mode
of criticism they fall prey to what Mark Nunes has called “a cybernetic ideology”
of predictability (2011, 3). Consequently, by overestimating the effectiveness and
by ignoring the messiness and dysfunctionality of algorithmic practices, these
cultural and social analyses take on the character of conspiracy theories in which
“secret algorithms control money and information” (Pasquale 2015).
The rather conspiratorial attitudes towards algorithms might also be explained
by the sheer magnitude of the ambiguity that is involved in algorithmic cultures.
Algorithmic practices, where we use and where we are being used by algorithms,
involve tacit knowledge. Most of us use algorithms every day, we govern them
every day, and we are governed by them every day. Yet most of us do not know
much about the algorithmic codes of which these algorithmic assemblages are
made. This non-knowledge makes us suspect something uncanny behind the
screen, something that is fundamentally different from the intentions of our
human companions. It is the lack of information that leads some human actors to
ascribe intentions to all algorithmic activities, a general attitude of suspicion that
Nathalie Heinich has called the “intentionalist hypothesis,” that is, a “systematic
reduction of all actions to a conscious (but preferably hidden and thus mean)
intention” (Heinich 2009, 35). It is this ambiguity that makes the analysis of
algorithmic cultures in social and cultural studies particularly germane. The pro-
duction, usage, and failure of algorithmic systems are stabilized by cultural nar-
ratives that resort to powerful imaginary expectations. Thus, in order to see this
tension between practices and imaginaries, to grasp algorithmic cultures in their
constitutive tension, it is not enough to focus on the cultural narratives of those
who explain and promote algorithmic systems and on those who express con-
spiratorialfears:focusonthe algorithmicpracticesthemselves isalsorequired,
for it is here where failures are most visible.
Cultivating algorithmic ambiguity
Because algorithmic circuits are interactions between very different human and
non-humanactors,theyareambiguous,andit becomes particularly difcult to
locate agency and responsibility. Consequently, algorithmic circuits and interac-
tions present a challenge, not only to the scholars in social sciences and cultural
studies. Interpretations vary widely, and the distribution of agency and the attri-
bution of responsibility shifts, depending on the epistemic formations of the
interpreters of particular events. While some authors like Miyazaki focus on pure
algorithmic interactions (Miyazaki [in this volume]; MacKenzie 2015; Knorr
Cetina 2013), others conceive of them as distributed functionality between
humans and algorithms, as “blended automation” (Beunza and Millo 2015),
while some even go so far as to see in algorithms nothing but instruments of
human agency (Reichertz 2013). Political systems especially tend to resort to the
last view, in particular when things go wrong and accountable actors need to be
named. Here, the Flash Crash of 2010 and its interpretation by the Securities and
Exchange Commission in the United States is a particularly apt example. The
rapidity of the fall in stock market prices and their subsequent recovery led to
competing explanations. Early interpretations especially took this event as a new
phenomenon, an
event resulting from the interaction of complex technological systems (‘hot
potato effects’). However, as time went by, human rather than algorithmic
agency was increasingly deemed accountable. A comparison between the rst
report of the Flash Crash by the CFTC and SEC from May 18 (CFTC and SEC
2010a) and the second report from September 30 (CFTC and SEC 2010b) shows
an increasing focus on the inclusion of individual actors and their intentions.
Whilethe rstreportalsoincludesthepossibilityofinter-algorithmicfeedback
loops (the aforementioned ‘hot potato effects’), the most recent report from 2015
does not mention algorithmic interactions or any type of complex feedback
loops. Instead, it points to a human trader, London-based Navinder Singh Sarao,
who was the single individual actor named as being connected to the event
(CFTC 2015a and b). Such reductionist explanations are highly contested; some
doubt that a single trader could really create such an impact on a trillion-dollar
market (Pirrong 2015). If his activities did indeed contribute to the Flash Crash,
then, it has been argued, it was only as one factor among many others
(Foresight 2012, 71–72).
However, as this example of the slow transition from blaming algorithmic
interactions to blaming human intentions shows, the interpretation of algorithmic
failures greatly depends on the epistemic paradigm used by the interpreter. That
is to say, each interpretation stems from a particular way of sense- making, which
includes the devices used to access an event. While information science, media
studies, and STS have no problems ascribing agency, responsibility, and
accountability to emergent phenomena stemming from inter- algorithmic events,
the same is not true for political systems (or market authorities for that matter)
that (still) tie responsibility to human actors. It is safe to say that the political
system itself created the pressure on the SEC and CFTC to present an account-
able actor with which traditional juridical systems can operate. Algorithms are
certainly not (yet) among those. As we have seen, the emergence of algorithmic
interactions creates an atmosphere of uncertainty about the identity of
interactional partners.
Thus, one of the most important questions within algorithmic cultures is
always “who we are speaking to” (Gillespie 2014, 192). In all types of social
mediaplatforms,theuserneedsto trustthats/he isinteractingwith an‘actual’
user. That is especially important for economic interests, which rely on an unam-
biguousidenticationofsendersand receivers of nancial transmissions.Eco-
are speaking, for it is only then that we know the identities of those from whom
we are buying or to whom we are selling.
In his contribution to this volume, Oliver Leistert shows how social media plat-
forms work to ensure that our crucial communications are with ‘real’ users and
real users alone.
In turn, users need to believe that their counterparts are real, ergo, they need to
trust the social media platform they are using. Thus, the “algorithmic production
of trust” (p. 159) is one of the most important mechanisms of social media plat-
forms. This is what such platforms actually do: rely heavily on trust to solve the
problem of uncertainty. Leistert further describes the doubling mechanisms in con-
ditions of uncertainty, where certain social bots are designed to exploit the trust
that social media platforms painstakingly try to establish. He sees such social bots
as machines that parasitically feed on our desires to be followed, to be ranked, and
to be trending. As ‘algorithmic pirates’ they feed in various ways on ‘pure’ inter-
actions. These desires can be exploited, for instance by the offer to ‘automatically’
feed an account with fake followers, with bots that pretend to be ‘real’ followers. In addi-
tion, it is not uncommon for some—often commercial—users to buy followers on
social media platforms. Another example is harvesters that attempt to friend as
many users as possible in order to extract user data. Not only do they feed on
users’ desire for growing numbers of followers, they also feed on the data flows
that constitute the core
business of social media platforms. Leistert hence describes real performative
effects in algorithmic cultures. Not only is the general uncertainty regarding whom
we are addressing exploited, the exploitation in fact increases uncertainty, even for
bots. For instance, when ‘social bots’ mimic human users they increase uncertainty
to the extent that they themselves become unsure whether or not they are still
dealing with ‘normal’ users. Thus, bots themselves have to identify fake counter-
parts. On the one hand, algorithmic parasites pollute the pure interactions between
‘normal’ users that social media platforms try so hard to establish. But on the other
hand, they too need to purify the pollutions their own actions have caused. In turn,
these dynamics intensify and escalate the process of producing and reducing uncertainty.
The interpretations of algorithmic cultures are not just epistemic problems.
While computer science conceives of algorithms as procedures or recipes for
solving problems, approaches such as cultural soci-
ology emphasize their performative effects, their recursive functions by which
algorithmic practices not only create new problems, but also create the problems
for which they are ultimately the answer. The performativity of algorithms is
also (recursively) related to reections in social and cultural studies itself.
Barocas and Nissenbaum (2014) have shown that the use of new technologies
caninitiateareexiveprocessthathelpsusclarifyalreadyexistingideas. For
instance, algorithmic practices do not simply, as is often suggested, challenge
traditional notions of privacy, for instance in the context of Edward Snowden’s
revelations. Algorithmic practices such as Big Data do not simply threaten
classic notions of individual privacy and anonymity, since they do not operate
with classical features such as name, address, and birth place. Rather, they
changethevery denitions ofwhatitmeans to beprivateandanonymous. By
assembling algorithmic portfolios of the users they are tracing, they operate with
entirely different features of their users, and thereby create new identities. Con-
sequently, Facebook’s shadow prole and what Google has rather cynically
ingdenitionsof basictermsisimportant becauseitmighthelp uscircumvent
foreseeable misunderstandings in future political regulations.
For the understanding of algorithmic cultures, it is important to understand
the multiplicity and entanglement of these imaginaries, epistemic views, prac-
tical usages, and performative consequences. For this reason, scholars in social
sciences, cultural studies, and in particular, cultural sociology, should take heed
and not mix up or conflate promises, imaginaries, and practical effects. This is
not to say that we are reducing imaginaries to mere fantasies. Imaginaries are
also real; they have real effects in algorithmic cultures, and thus need to be taken
into account. However, the performative effects of imaginaries, and the perfor-
mative effects of practices, do differ. It is important to be able to distinguish the
two, and not only for cultural sociology.
Alexander, J. 1990. “Analytic Debates: Understanding the Relative Autonomy of
Culture.” In Culture and Society: Contemporary Debates. Edited by J. Alexander and
Alexander, J. 2004. “Cultural Pragmatics: Social Performance Between Ritual and
Strategy.” Sociological Theory22(4):527–573.
Alexander, J. and Smith, P. 1998. “Sociologie culturelle ou sociologie de la culture? Un
programme fort pour donner à la sociologie son second souffle.” Sociologie et sociétés 30(1).
Alexander, J. and Smith, P. 2001. “The Strong Program in Cultural Theory: Elements of a
Structural Hermeneutics.” In Handbook of Sociological Theory. Edited by J. Turner,
135–150. New York: Kluwer Academic/Plenum Publishers.
Amoore, L. and Piotukh, V., eds. 2016. Algorithmic Life: Calculative Devices in the Age
of Big Data. London: Routledge.
Barocas, S. and Nissenbaum, H. 2014. “Big Data’s End Run Around Consent and Anonymity.”
In Privacy, Big Data and the Public Good. Edited by J. Lane, V. Stodden, S.
Bender, and H. Nissenbaum. New York: Cambridge University Press.
Barocas,S.,Sophie,H., and Ziewitz, M.2013.“GoverningAlgorithms: A Provocation
Piece.” Presented at Governing Algorithms: A Conference on Computation, Automa-
tion, and Control, New York University, May 16–17.
Becker, K. 2009. “The Power of Classication: Culture, Context, Command, Control,
Communications, Computing.” In Deep Search: The Politics of Search Engines Beyond
Beer, D. 2013. Popular Culture and New Media. Basingstoke, UK: Palgrave Macmillan.
Beunza, D. and Millo, Y. 2015. “Blended Automation: Integrating Algorithms on the
Floor of the New York Stock Exchange.” SRC Discussion Paper, No 38. Systemic Risk
Centre, The London School of Economics and Political Science, London.
tion and Institutional Translation.” Big Data and Society (July–December), 1–12.
ing Algorithms.” In Big Data and Society (January–June), 1–12.
Cardon, D. 2013. “Dans l’esprit du PageRank.” Réseaux 1:63–95.
CFTC 2015a. Criminal Complaint, United States of America vs. Navinder Singh Sarao,
AO 91 (Rev. 11/11).
CFTC 2015b. United States of America vs. Nav Sarao Futures Limited PLC and Navinder
Singh Sarao, Appendix to Plaintiff ’s motion for statutory restraining order containing
declarations and exhibits, Case 1:15-cv-03398.
CFTC and SEC 2010a. Preliminary Findings Regarding the Market Events of May 6th,
2010, Report of the staffs of the CFTC and SEC to the Joint Advisory Committee on
Emerging Regulatory Issues, May 18, 2010.
CFTC and SEC 2010b. Findings Regarding the Market Events of May 6th, 2010, Report
of the staffs of the CFTC and SEC to the Joint Advisory Committee on Emerging Regu-
latory Issues, September 30, 2010.
Cheney-Lippold, J. 2011. “A New Algorithmic Identity: Soft Biopolitics and the Modulation
of Control.” Theory, Culture & Society 28(6): 164–181.
Clarke, A. C. 1959. Voice Across the Sea. New York: Harper & Row.
Cliff, D. and Nothrop, L. 2011. “The Global Financial Markets: An Ultra-Large-Scale
Systems Perspective.” The Future of Computer Trading in Financial Markets. Fore-
Cliff, D., Brown D., and Treleaven, P. 2011. “Technology Trends in the Financial
Markets: A 2020 Vision.” The Future of Computer Trading in Financial Markets. Foresight, London.
Daston, L. J. 2004. “Whither Critical Inquiry?” Critical Inquiry 30(2): 361–364.
Davis, D. 2015. “@Amazon’s Algorithms Are So Advanced, I’ve Been Offered Over
10,000 #PrimeDay Deals and Am Not Interested Any of Them” [Twitter Post], July 15, 2015.
De Certeau, M. 1974. La culture au pluriel. Paris: Seuil.
Deleuze, G. 1994. Difference and Repetition. Translated by Paul Patton. New York:
Columbia University Press.
Eisen, M. 2011. “Amazon’s $23,698,655.93 Book about Flies.” April 22, retrieved from
an Algorithm.” Social Studies of Science42(1):5–30.
Foresight 2012. The Future of Computer Trading in Financial Markets: An International
Perspective. Final Report, Foresight, London.
Galloway, A. R. 2006. Gaming: Essays on Algorithmic Culture. Minneapolis, MN:
University of Minnesota Press.
Galloway, A. R. 2012. The Interface Effect. Cambridge, UK: Polity Press.
Galloway, A. 2013. “Emergent Media Technologies, Speculation, Expectation, and
Human/Nonhuman Relations.” Journal of Broadcasting & Electronic Media 57 (1):
Geertz, C. 1973. The Interpretation of Cultures.NewYork:BasicBooks.
Gillespie, T. 2014. “The Relevance of Algorithms.” In Media Technologies: Essays on
Communication, Materiality, and Society. Edited by T. Gillespie, P. J. Boczkowski, and
K. Foot, 167–194. Cambridge, MA: MIT Press.
Goffey, A. 2008. “Algorithm.” In Software Studies: A Lexicon. Edited by M. Fuller. Cambridge,
MA: MIT Press.
Hallinan, B. and Striphas, T. 2014. “Recommended for You: The Netflix Prize and the
Production of Algorithmic Culture.” New Media & Society 18(1): 117–137.
Hargittai, E. 2000. “Open Portals or Closed Gates? Channeling Content on the World
Wide Web.” Poetics 27(4): 233–253.
Heinich, N. 2009. Le bêtisier du sociologue. Paris: Klincksieck.
Hillis, K., Petit, M., and Jarrett, K. 2013. Google and the Culture of Search. New York: Routledge.
Honan, M. 2013. “I, Glasshole: My Year with Google Glass.” Wired, December 30
(accessed May 24, 2016).
Introna,L.D.2011.“The Enframing of Code:Agency,OriginalityandthePlagiarist.”
Theory, Culture & Society28:113–141.
Introna, L. D. 2016. “Algorithms, Governance, and Governmentality: On Governing
Academic Writing.” Science, Technology, & Human Values41(1):17–49.
Introna, L. D. and Hayes, N. 2011. “On Sociomaterial Imbrications: What Plagiarism
Detection Systems Reveal and Why It Matters.” Information and Organization 21:
Johnson, C., Dowd, T. J., and Ridgeway, C. L. 2006. “Legitimacy as a Social Process.”
Annual Review of Sociology 32: 53–78.
Johnson, N., Zhao, G., Hunsader, E., Meng, J., Ravindar, A., Carran, S., and Tivnan, B.
2012. “Financial Black Swans Driven by Ultrafast Machine Ecology.” Working paper.
Kinsley, S. 2010. “Representing ‘Things to Come’: Feeling the Visions of Future Technologies.”
Environment and Planning A 42(11): 2771–2790.
Kitchin, R. 2014. “Thinking Critically about and Researching Algorithms.” The Programmable
City Working Paper, Maynooth, Republic of Ireland: Maynooth University
(accessed May 24, 2016).
Knorr Cetina, K. 2013. “Presentation to Panel, Theorizing Numbers.” Presented at the
American Sociological Association Annual Meeting, New York.
Kowalski, R. 1979. “Algorithm = Logic + Control.” Communications of the ACM 22(7): 424–436.
Kushner, S. 2013. “The Freelance Translation Machine: Algorithmic Culture and the
Invisible Industry.” New Media & Society. Published online before print January 3, 2013.
Lash,S. 2007.“PowerafterHegemony:CulturalStudiesinMutation?”Theory, Culture
& Society24(3):55–78.
and Society: Studies in the Sociology of Culture Past and Present, Volume 6. Edited
Levy, S. 2011. In the Plex: How Google Thinks, Works, and Shapes Our Lives. New
York: Simon & Schuster.
Mackenzie, A. 2005. “The Performativity of Code: Software and Cultures of Circula-
tion.” Theory, Culture & Society22(1):71–92.
MacKenzie, D. 2006. An Engine, Not a Camera: How Financial Models Shape Markets.
Cambridge, MA: MIT Press.
mated Trading.” Working paper.
Mager, A. 2012. “Algorithmic Ideology: How Capitalist Society Shapes Search Engines.”
Information, Communication & Society 15(5): 769–787.
Miller, C. C. 2013. “Privacy Ofcials Press Google on Its Glasses.” New York Times,
June 19, retrieved from http://bits.blogsofcials-
worldwide- press-google- about-glass (accessed May 24, 2016).
Morris,J. W.2015.“CurationbyCode: InformediairiesandtheDataMiningofTaste.”
European Journal of Cultural Studies18(4–5):446–463.
Nunes,M.2011.“Error,Noise,andPotential:TheOutsideofPurpose.”InError: Glitch,
Noise, and Jam in New Media Cultures. Edited by Mark Nunes, 3–23. New Haven, CT
Pasquale, F. 2015. The Black Box Society: The Secret Algorithms That Control Money
and Information. Cambridge, MA and London: Harvard University Press.
wise Professor(blogbyUniversityofHoustonnanceprofessorCraigPirrong),January1,
Pogue, D. 2013. “Why Google Glass Is Creepy.” Scientific American, May 21, retrieved
from www.scientificamerican.com/…google-glass-is-creepy (accessed May 24, 2016).
Reckwitz, A. 2002. “Toward a Theory of Social Practices: A Development in Culturalist
Theorizing.” European Journal of Social Theory 5(2): 245–265.
Reichertz, J. 2013. “Algorithmen als autonome Akteure?” SozBlog, February 24,
retrieved from …als-autonome-akteure/#more-964 (accessed May 24, 2016).
Roberge, J. and Melançon, L. Forthcoming. “Being the King Kong of Algorithmic
Culture Is a Tough Job After All: The Justificatory Regimes of Google and the
Meaning of Glass.” Convergence: The International Journal of Research into New
Media Technologies. Published online before print July 2, 2015.
Röhle, T. 2009. “Dissecting the Gatekeepers: Relational Perspectives on the Power of
Search Engines.” In Deep Search: The Politics of Search Engines beyond Google.
Ruhe, N. 2014. “Algorithmic Cultures – Conference Report.” H-Soz-Kult, October 29,
retrieved from www5626 (accessed
May 24, 2016).
Ruppert, E., Law, J., and Savage, M. 2013. “Reassembling Social Science Methods: The
Challenge of Digital Devices.” Theory, Culture & Society 30(4): 22–46.
Sandvig,C.2015.“SeeingtheSort:TheAestheticand IndustrialDefenseof‘TheAlgo-
rithm’.” Journal of the New Media Caucus 10 (1), retrieved from http://median.
What are algorithmic cultures? 25infrastructures-information/seeing-the-sort-the-aesthetic-and-
New Media & Society. Published online before print April 29, 2013, doi: 10.1177/
Sociology.” Cultural Sociology3(2):217–238.
Seaver, N. 2014. “Knowing Algorithms.” Presented at Media in Transition 8, Cambridge,
MA, April 2013.
SEC 2013. Securities Exchange Act of 1934, Release No. 70694, October 16, 2013,
Administrative Proceeding File No. 3–15570.
Seyfert, R. 2012. “Beyond Personal Feelings and Collective Emotions: A Theory of
Social Affect.” Theory, Culture & Society29(6):27–46.
Seyfert, R. Forthcoming. “Bugs, Predations or Manipulations? Incompatible Epistemic
Regimes of High-Frequency Trading.” Economy & Society.
Striphas, T. 2009. The Late Age of Print: Everyday Book Culture from Consumerism to
Control. New York: Columbia University Press.
Striphas, T. 2015. “Algorithmic Culture.” European Journal of Cultural Studies 18(4–5): 395–412.
The Social Media Collective. 2015. “Critical Algorithm Studies: A Reading List.”
retrieved fromlists/critical-algorithm-studies/
(accessed February 29, 2016).
Totaro, P. and Ninno, D. 2014. “The Concept of Algorithm as an Interpretative Key of
Modern Rationality.” Theory, Culture & Society 31(4): 29–49.
Uricchio, W. 2011. “The Algorithmic Turn: Photosynth, Augmented Reality and the
Changing Implications of the Image.” Visual Studies 26(1): 25–35.
Wansleben, L. 2012. “Heterarchien, Codes und Kalküle. Beitrag zu einer Soziologie des
algo trading.” Soziale Systeme 18(1–2): 225–259.
Wasik, B. 2013. “Why Wearable Tech Will Be as Big as the Smartphone.” Wired
(accessed May 24, 2016).
Weiser, M. 1991. “The Computer for the Twenty-First Century.” Scientific American,
September 1, 94–100.
Ziewitz, M. 2016. “Governing Algorithms: Myth, Mess, and Methods.” Science, Technology
& Human Values 41(1): 3–16.
Algorithmic Cultures
This book provides in-depth and wide-ranging analyses of the emergence, and
subsequent ubiquity, of algorithms in diverse realms of social life. The plurality
of Algorithmic Cultures emphasizes: (1) algorithms’ increasing importance in
the formation of new epistemic and organizational paradigms; and (2) the multi- […]
The authors in this volume address the complex interrelations between social
groups and algorithms in the construction of meaning and social interaction. The
contributors highlight the performative dimensions of algorithms by exposing
the dynamic processes through which algorithms—themselves the product of a […]
how people think about society. With contributions from leading experts from
Media Studies, Social Studies of Science and Technology, Cultural and Media
Sociology from Canada, France, Germany, UK and the USA, this volume
presents cutting-edge empirical and conceptual research that includes case studies.
Robert Seyfert is a Postdoctoral Fellow at the Cluster of Excellence “Cultural
Foundations of Social Integration” at Universität Konstanz, Germany, and Visit-
ing Full Professor of Comparative Cultural Sociology at Europa- Universität
Viadrina Frankfurt (Oder), Germany. He recently published in Theory, Culture
& Society and European Journal of Social Theory.
Jonathan Roberge is Assistant Professor of Cultural and Urban Sociology at
the Institut national de la recherche scientifique (INRS) in Quebec City, where
he holds the Canada Research Chair in Digital Culture, in addition to being a
Faculty Fellow at the Center for Cultural Sociology at Yale University.
Routledge Advances in Sociology
178 The Politics and Practice of
Religious Diversity
National contexts, global issues
Edited by Andrew Dawson
179 São Paulo in the Twenty-First Century
Spaces, heterogeneities, inequalities
Edited by
Eduardo Cesar Leão Marques
180 State Looteries
Historical continuity,
rearticulations of racism, and
American taxation
Kasey Henricks and
David G. Embrick
181 Lesbian, Gay, Bisexual and
Trans* Individuals Living with Dementia
Concepts, practice and rights
Edited by Sue Westwood and
Elizabeth Price
182 Family, Culture, and Self in the
Development of Eating Disorders
Susan Haworth- Hoeppner
183 Origins of Inequality in Human Societies
Bernd Baldus
184 Confronting the Challenges of
Urbanization in China
Insights from social science
Edited by Zai Liang,
Steven F. Messner,
Youqin Huang and Cheng Chen
185 Social Policy and Planning for
the 21st Century
In search of the next great social transformation
Donald G. Reid
186 Popular Music and Retro
Culture in the Digital Era
Jean Hogarty
187 Muslim Americans
Debating the notions of
American and un-American
Nahid Kabir
188 Human Sciences and Human Interests
Integrating the social, economic,
and evolutionary sciences
Mikael Klintman
189 Algorithmic Cultures
Essays on meaning, performance
and new technologies
Edited by Robert Seyfert and
Jonathan Roberge
Algorithmic Cultures
Essays on meaning, performance and new technologies
Edited by Robert Seyfert and
Jonathan Roberge
First published 2016
by Routledge
2 Park Square, Milton Park, Abingdon, Oxon OX14 4RN
and by Routledge
711 Third Avenue, New York, NY 10017
Routledge is an imprint of the Taylor & Francis Group, an informa business
© 2016 Robert Seyfert and Jonathan Roberge
The right of Robert Seyfert and Jonathan Roberge to be identified as the
authors of the editorial matter, and of the authors for their individual
chapters, has been asserted
in accordance with sections 77 and 78 of the Copyright, Designs and
Patents Act 1988.
All rights reserved. No part of this book may be reprinted or reproduced or
utilized in any form or by any electronic, mechanical, or other means, now
known or hereafter invented, including photocopying and recording, or in
any information storage or retrieval system, without permission in writing
from the publishers.
Trademark notice: Product or corporate names may be trademarks or
registered trademarks, and are used only for identification and explanation
without intent to infringe.
British Library Cataloguing-in-Publication Data
A catalogue record for this book is available from the British Library
Library of Congress Cataloging-in-Publication Data
A catalog record for this book has been requested
ISBN: 978-1-138-99842-1 (hbk)
ISBN: 978-1-315-65869-8 (ebk)
Typeset in Times New Roman
by Wearset Ltd, Boldon, Tyne and Wear
List of figures vii
Notes on contributors viii
Acknowledgments xi
1 What are algorithmic cultures? 1
2 The algorithmic choreography of the impressionable subject 26
3 #trendingistrending: when algorithms become culture 52
4 Shaping consumers’ online voices: algorithmic apparatus or
evaluation culture? 76
5 Deconstructing the algorithm: four types of digital
information calculations 95
6 Baffled by an algorithm: mediation and the auditory
relations of ‘immersive audio’ 111
7 Algorhythmic ecosystems: neoliberal couplings and their
pathogenesis 1960–present 128
8 Drones: the mobilization of algorithms 140
9 Social bots as algorithmic pirates and messengers of
techno-environmental agency 158
Index 173
2.1 The Prodigy graphical user interface 30
2.2 The Mosaic World Wide Web browser 32
2.3 AT&T clickable advertisement shown on Hotwire 33
2.4 Third-party cookies when visiting the Guardian newspaper’s
website 36
2.5 The honesty box experiment 45
3.1 Twitter Trends 54
3.2 “Trending” 57
3.3 “Pornhub’s US Top 3 Search Terms by State” 58
3.4 “What’s Trending?” 59
3.5 “The Autocomplete Truth” 68
5.1 Four types of digital information calculations 97
Jean-Samuel Beuscart is a Sociologist at Orange Labs and Associate Professor
at the University of Marne-la-Vallée (LATTS). He is currently working on
the framing of Internet markets as well as the implications of online visibility.
He published Promouvoir les oeuvres culturelles (Paris: La Documentation
Française, 2012), with Kevin Mellet. With Dominique Cardon, he now leads
the project “Algopol,” which receives substantial support from the Agence
Nationale de la Recherche in France.
Dominique Cardon is a Sociologist in the Laboratory of Uses of France
Telecom R&D and Associate Professor at the University of Marne-la-Vallée
(LATTS). He is studying transformations of public space and the uses of new
technologies. He has published different articles on the place of new technol-
ogies in the no-global movement, alternative media and on the process of
bottom-up innovations in the digital world. He published La démocratie
Internet (Paris: Seuil/République des idées, 2010) and, with Fabien Granjon,
Médiactivistes (Paris: Presses de Science Po, 2010).
Tarleton Gillespie is an Associate Professor at the Department of Communica-
tion at Cornell University and is currently a visitor with Microsoft Research
New England. He is the author of Wired Shut: Copyright and the Shape of
Digital Culture (Cambridge, MA: MIT Press, 2007) and the co-editor (with
Pablo Boczkowski and Kirsten Foot) of Media Technologies: Essays on Com-
munication, Materiality, and Society (Cambridge, MA: MIT Press, 2014). He
is also a co-founder (with Hector Postigo) of the NSF-sponsored scholarly
collective Culture Digitally. He is currently writing a
book on the implications of the content policies of online platforms for Yale
University Press, and has written on the relevance of algorithms for the
changing contours of public discourse.
Lucas D. Introna is Professor of Technology, Organization and Ethics at the
Centre for the Study of Technology and Organization, Lancaster University.
His primary research interest is the social study of technology. In particular,
he is concerned with theorizing social/technical entanglements, especially
with regard to ethics and politics. He has published on a variety of topics,
such as sociomateriality, performativity, phenomenology of technology,
information and power, privacy, surveillance, technology ethics and virtual-
ity. He is a co-editor of Ethics and Information Technology and has acted as
associate editor for a variety of leading journals.
Joseph Klett is a Visiting Assistant Professor in the Department of Sociology
at the University of California, Santa Cruz (PhD Yale), and a regular parti-
cipant in the American digitalSTS initiative. He has recently written two
articles, “The Meaning of Indeterminacy” about the social practices which
lend meaning to Noise Music (Cultural Sociology, 2014), and “Sound on
Sound,” about the ethnographic study of sonic interactions (Sociological
Theory, 2014).
Oliver Leistert is a media and technologies scholar at Leuphana Universität
Lüneburg, Germany. Previously he was a Postdoctoral Researcher at the
“Automatisms” research group, University of Paderborn. He is a collaborator
at the ESRC project “Digital Citizenship and Surveillance Society” at the
University of Cardiff. He has studied philosophy, computer science and liter-
ature. His doctoral thesis in media studies, “From Protest to Surveillance: The
Political Rationality of Mobile Media” won the Surveillance & Society Book
Award in 2014. Other recent publications include (co-edited with Lina
Dencik) Critical Perspectives on Social Media and Protest: Between Control
and Emancipation (London: Rowman & Littlefield International, 2015).
Kevin Mellet is a Researcher at the Social Sciences Department of Orange Labs
and Associate Researcher at the Centre de Sociologie de l’Innovation (Mines
ParisTech). Originally trained as an economist, he has developed expertise in
economic sociology and science and technology studies. His research
explores the construction of digital markets. Current areas of interest include
marketing and advertising practices, participatory valuation devices, business
models and market intermediation. He is the co-author (with Jean-Samuel
Beuscart) of a book on advertising strategies within cultural industries (Pro-
mouvoir les œuvres culturelles, Paris: La Documentation Française, 2012).
Shintaro Miyazaki is a Senior Researcher and Lecturer at the University of
Applied Sciences and Arts Northwestern Switzerland, Academy of Art and
Design, Institute of Experimental Design and Media Cultures. He is writing
at the intersection of media history, design theory and the history of science
and technology. Previously, he was a Resident Fellow at the Akademie
Schloss Solitude in Stuttgart (2011–2012) and Art/Science Resident at the
National University of Singapore (September 2012). He not only works as a
scholar, but also actively engages in practices between experimental design
and artistic research.
Valentin Rauer works as a Senior Research Fellow at the Cluster of Excellence
“The Formation of Normative Orders” at Frankfurt University. He is inter-
ested in social and cultural processes that transform, transmit and translate the
past (collective memories and identities), and the future (security cultures and
risks). Current publications include “The Visualization of Uncertainty,” in
Iconic Power: Materiality and Meaning in Social Life, ed. Alexander, Jeffrey
C. et al. (New York: Palgrave Macmillan, 2012) and “Von der Schuldkultur
zur Sicherheitskultur: Eine begriffsgeschichtliche Analyse 1986–2010,”
Sicherheit & Frieden (February 2011).
Jonathan Roberge is Assistant Professor of Cultural and Urban Sociology at
the Institut national de la recherche scientifique (INRS) in Quebec City, where
he holds the Canada Research Chair in Digital Culture, in addition to being a
Faculty Fellow at the Center for Cultural Sociology at Yale University.
Robert Seyfert is a Postdoctoral Fellow at the Cluster of Excellence “Cultural
Foundations of Social Integration” at Universität Konstanz, Germany, and
Visiting Full Professor of Comparative Cultural Sociology at Europa-
Universität Viadrina Frankfurt (Oder), Germany. He recently published in
Theory, Culture & Society and European Journal of Social Theory.
This volume evolved out of work initially presented at the Algorithmic Cultures
Conference at University of Konstanz in Germany, June 23–25, 2014. This
volume and the conference were made possible with the generous support of the
Canada Research Chairs Program and the “Cultural Foundations of Social Inte-
gration” Centre of Excellence at the University of Konstanz, established in the
framework of the German Federal and State Initiative for Excellence.
... Extreme positions attribute rationality, autonomy, and objectivity to algorithms because of their technical characteristics. However, they do not evaluate algorithmic performance in specific usage contexts (Roberge & Seyfert, 2018). Hence, it is critical to determine what happens to users and their agency when accessing information mediated by automated recommendations. ...
Full-text available
This study addresses the experiences of young Mexican users with YouTube’s recommendation algorithms. It seeks to determine if they are subordinated to algorithmic governance or if, on the contrary, they are capable of developing some tactics to resist algorithmic power through their agency. It uses a qualitative methodology based on focus groups and shows that users are not entirely subordinated to the platforms. Their possibilities of agency vary depending on the different technological appropriations of the platform, the intuitive theories about the ways in which these operate, the ability to dodge algorithmic distortions, and the resources to evaluate the quality of the information offered. As a result, the study identifies the specific skills that constitute algorithmic literacy.
... Dado el papel central que tienen los algoritmos en la vida cotidiana mediada por tecnologías digitales, las recomendaciones algorítmicas afectan en gran medida a los usuarios de las plataformas sociales. Posturas extremas adjudican a los algoritmos racionalidad, autonomía y objetividad solo por sus características técnicas, pero no evalúan su desempeño en contextos específicos de uso (Roberge & Seyfert, 2018). Por este motivo, resulta de importancia conocer qué sucede con los usuarios y su agenciamiento en el acceso a la información mediado por recomendaciones automatizadas, así como identificar qué conocimientos e intuiciones ponen en juego al negociar sus decisiones con estos sistemas. ...
Full-text available
Este estudio aborda las experiencias de jóvenes usuarios mexicanos con los algoritmos de recomendación de la plataforma YouTube. Busca determinar si se encuentran subordinados a la gobernanza algorítmica o si, por el contrario, son capaces de desarrollar algunas tácticas para resistir las lógicas del poder algorítmico a través de su propia agencia. Utiliza una metodología cualitativa centrada en grupos de enfoque. Evidencia que los usuarios no se subordinan completamente a las plataformas, y que sus posibilidades de agenciamiento varían en función de sus diferentes modos de apropiación tecnológica, las teorías intuitivas acerca de su funcionamiento, la capacidad para evadir las distorsiones algorítmicas, y los recursos para evaluar la calidad de la información ofertada. Como resultado del análisis, se han identificado las habilidades específicas que constituyen la literacidad algorítmica.
... We have come to expect failing algorithmic systems and we have indeed become accustomed to dealing with them. 53 Indeed, following Seyfert and Roberge, at times the only response to Amazon's miscalculations is to laugh. It is remarkably hard to peek into the inner workings of algorithms and understand their potential mistakes, particularly involving the anticipatory package shipping methods on the table. ...
... Here, the interrelations between algorithms and their social surroundings appears as key to unlocking them. Or, as Roberge and Seyfert (2016: 2) formulate the point: ...
Full-text available
Building on critical approaches that understand algorithms in terms of communication, culture and organization, this paper offers the supplementary conceptualization of algorithms as organizational figuration, defined as material and meaningful sociotechnical arrangements that develop in spatiotemporal processes and are shaped by multiple enactments of affordance–agency relations. We develop this conceptualization through a case study of a Danish fintech start-up that uses machine learning to create opportunities for sustainable pensions investments. By way of ethnographic and literary methodology, we provide an in-depth analysis of the dynamic trajectory in and through which the organization gives shape to and takes shape from its key algorithmic tool, mapping the shifting sociotechnical arrangements of the start-up, from its initial search for a viable business model through the development of the algorithm to the public launch of its product. On this basis, we argue that conceptualizing algorithms as organizational figuration enables us to detail not only what algorithms do but also what they are.
... To examine the truly complex role algorithms currently play in the distribution and consumption of film and television, I suggest that it is necessary to embrace a relational materialist understanding of algorithmic technology, as has been loosely developed by scholars such as Roberge and Seyfert (2016), Kitchin (2017), Seaver (2017) and Bucher (2018). From a relational materialist perspective, algorithms are understood to be sociotechnical processes that come into existence and operate in the world via a series of complex relations between human and nonhuman actors. ...
Full-text available
As the Streaming Wars continue to heat up, recommendation systems like the Netflix Recommender System (NRS) will become key competitive features for every major over-the-top video streamer. As a result, film and television production and consumption will increasingly be in the hands of semi-autonomous algorithmic technologies. But how do recommendation systems like the NRS work? What purposes do they serve? And what sorts of impacts are they having on film and television culture? To respond to these questions, this article will (1) examine how algorithms are impacting processes of taste-making and (2) re-evaluate some of the critical theoretical perspectives that have come to dominate the discourse surrounding algorithmic cultures. To do so, I join Bucher ((2016) Neither black nor box: Ways of knowing algorithms. In: S Kubitscko and A Kaun (eds) Innovative Methods in Media and Communication Research. Cham: Springer International Publishing, pp. 81–98; (2018) If…then: Algorithmic Power and Politics. London: Oxford University Press) in adopting a relational materialist perspective of algorithms and proceed to reverse engineer the NRS; an experiment that exposes the system’s circular and economic logics while highlighting the complex and networked nature of taste-making in the film and television industry.
... The recent dispersion of algorithms into a large part of social life makes algorithms valid analytical objects for sociology in the twenty-first century (Totaro & Ninno 2014;Amoore & Piotukh, 2016;Roberge & Seyfert, 2016). Social attempts to make sense of algorithms can be found in different forms of engagement, including designing, maintaining, selling, using and controlling them (Seaver, 2014). ...
The recent dispersion of algorithms throughout a large part of social life makes them valid analytical objects for sociology in the twenty-first century. The ubiquity of algorithms has led to increased public attention, scrutiny and, consequently, regulation. That is the focus of this paper. I will show that such regulatory processes are not just aimed at preventing certain algorithmic activities, but that they are also co-producing algorithms. They determine, in specific settings, what an algorithm is and what it ought to do. I will illustrate this by comparing two different European regulations aimed at algorithmic practices: the regulation of trading algorithms in the German High Frequency Trading Act and in the Markets in Financial Instruments Directive (MiFID II), and the regulation of personal data processing in the General Data Protection Regulation (GDPR).
This article addresses the current challenges that algorithmic cultures pose for geographical educational science, drawing on several theoretical perspectives. First, we discuss sociological, cultural and geographical perspectives on algorithmic cultures and link them to selected approaches in media education in general and geographical media education in particular. We conclude from this discussion that Spatial Citizenship Education is especially well suited to addressing these challenges, which raises the question of what aspects of Spatial Citizenship Education need to be supplemented in order to address algorithmic cultures. We therefore examined the curriculum of Spatial Citizenship Education. The analysis demonstrated that the approach is a sound basis for coping with the developments seen in the context of algorithmic cultures. Nevertheless, algorithmic cultures are accompanied by changes in social and geographical structures that the approach has not yet captured; for example, it does not consider geomedia as algorithmic, semi-autonomous systems. Further implications also emerge, among them the question of how practicable and relevant algorithmic cultures are for the practical learning process.
Digital labour platforms have become important sites of negotiation between expressions of micro-entrepreneurship, worker freedom and dignity of work. In the Global South, these negotiations are overlaid on an already fraught relationship mediated by the dynamics of caste and culture, in addition to the usual politics of difference. Urban Company (UC), an app-based, on-demand platform in India that connects service providers offering home-based services to potential customers, lists professionalised services that have hitherto been considered part of a ‘culture of servitude’, performed by historically marginalised groups afforded little dignity of labour. Such platforms offer the possibility of disrupting, through their ostensibly professional approach, the entrenched ‘master-servant’ relationship that exists in many traditional cultures in the Global South. While service providers now have the opportunity for self-employment and gain ‘respectability’ by being associated with the platform, UC claims to have leveraged AI to automate discipline in everything the providers do. Using interviews with UC women service providers involved in beauty work and with software development engineers, this paper explores the agency afforded to service partners in both professional and personal spheres. Further, we propose the term blended cultures to think about the ways in which algorithms and human cultures mutually (re)make each other.
This erudite volume examines the moral universe of the hit Netflix show Black Mirror. It brings together scholars in media studies, cultural studies, anthropology, literature, philosophy, psychology, theatre and game studies to analyse the significance and reverberations of Charlie Brooker’s dystopian universe for our present-day, technologically mediated lifeworld. Brooker’s ground-breaking Black Mirror anthology generates often disturbing and sometimes amusing future imaginaries of the dark side of ubiquitous screen life, as it unleashes the power of the uncanny. This book takes the psychoanalytic idea of the uncanny into a moral framework befitting Black Mirror’s dystopian visions. The volume suggests that the Black Mirror anthology does not just make the viewer feel, on the surface, a strange recognition of closeness to some of its dystopian scenarios, but also makes us realise how fragile, wavering, fractured and uncertain the human moral compass is.
The global financial markets have been aggressive early adopters of new technologies for most of their history. In the past quarter of a century, since the instigation of the “Big Bang” switch to paperless electronic trading, the City of London has led the world in the adoption of new information and communications technology (ICT) for the provision of electronic trading facilities and the associated distribution of data and news feeds. This hunger for new technologies looks unlikely to diminish in the future. Alongside many opportunities, ICT development has also brought risks (some of which are non-obvious and even counter-intuitive) that require immediate, careful and thorough evaluation. New technologies may come in the form of new hardware, new software (including algorithms), or (most likely) combinations of the two. As new technologies become available and more widely adopted, they may significantly alter what market actions and activities are possible, and in the longer term they may significantly alter the socio-economics of the financial markets, and hence also the necessary regulatory and political frameworks within which financial institutions operate. In this document we establish the historical context for technology adoption in the financial markets, review current technology trends, and then extrapolate them out by five to ten years, in an attempt to identify what the financial-markets technology landscape might reasonably look like in 2020 or 2022. By identifying current products and services that appear to meet the technical definition of disruptive technologies, we explore which likely ICT developments over the next ten years will become the most significant to the financial markets, and how those developments might change the industry and affect the employment distribution of human traders. We then briefly speculate on the consequent possible impacts on systemic financial stability.
Calls for greater transparency as well as corporate and individual accountability have emerged in response to the recent turbulence in financial markets. In the field of high-frequency trading (HFT), suggested solutions have involved a call for increased market information, for example, or better access to the inner workings of algorithmic trading systems. Through a combination of fieldwork conducted in HFT firms and discourse analysis, I show that the problem may not always stem from a lack of information. Instead, my comparative analysis of different market actors (regulators, market analysts and traders) shows that the diverse and complex ways in which they access and construct knowledge out of information in fact lead to what I call different epistemic regimes. An understanding of how epistemic regimes work will enable us to explain not only why the same market event can be viewed as very different things – as market manipulation, predation or error – but also why it is so difficult to arrive at a unified theory or view of HFT. The comparative perspective introduced by the idea of epistemic regimes might also serve as a starting point for the development of a cultural approach to the study of financial markets.
This article considers the issue of opacity as a problem for socially consequential mechanisms of classification and ranking, such as spam filters, credit card fraud detection, search engines, news trends, market segmentation and advertising, insurance or loan qualification, and credit scoring. These mechanisms of classification all frequently rely on computational algorithms, and in many cases on machine learning algorithms to do this work. In this article, I draw a distinction between three forms of opacity: (1) opacity as intentional corporate or state secrecy, (2) opacity as technical illiteracy, and (3) an opacity that arises from the characteristics of machine learning algorithms and the scale required to apply them usefully. The analysis in this article gets inside the algorithms themselves. I cite existing literatures in computer science, known industry practices (as they are publicly presented), and do some testing and manipulation of code as a form of lightweight code audit. I argue that recognizing the distinct forms of opacity that may be coming into play in a given application is a key to determining which of a variety of technical and non-technical solutions could help to prevent harm.
In a talk in 2013, Karin Knorr Cetina referred to ‘the interaction order of algorithms’, a phrase that implicitly invokes Erving Goffman's ‘interaction order’. This paper explores the application of the latter notion to the interaction of automated-trading algorithms, viewing algorithms as material entities (programs running on physical machines) and conceiving of the interaction order of algorithms as the ensemble of their effects on each other. The paper identifies the main way in which trading algorithms interact (via electronic ‘order books’, which algorithms both ‘observe’ and populate) and focuses on two particularly Goffmanesque aspects of algorithmic interaction: queuing and ‘spoofing’, or deliberate deception. Following Goffman's injunction not to ignore the influence on interaction of matters external to it, the paper examines some prominent such matters. Empirically, the paper draws on documentary analysis and 338 interviews conducted by the author with high-frequency traders and others involved in automated trading.
More and more aspects of our everyday lives are being mediated, augmented, produced and regulated by software-enabled technologies. Software is fundamentally composed of algorithms: sets of defined steps structured to process instructions/data to produce an output. This paper synthesises and extends emerging critical thinking about algorithms and considers how best to research them in practice. Four main arguments are developed. First, there is a pressing need to focus critical and empirical attention on algorithms and the work that they do given their increasing importance in shaping social and economic life. Second, algorithms can be conceived in a number of ways – technically, computationally, mathematically, politically, culturally, economically, contextually, materially, philosophically, ethically – but are best understood as being contingent, ontogenetic and performative in nature, and embedded in wider socio-technical assemblages. Third, there are three main challenges that hinder research ...
Abstract: The use of algorithmic trading systems has spread rapidly among participants in financial markets. So far, sociologists have mainly studied the automation of exchanges, which made algo trading possible in the first place. How the practice and organization of financial trading itself has changed, however, has hardly been researched. This contribution develops heuristics for such research. On the basis of initial empirical data, it argues that algo trading implies a fundamental reconfiguration of the practice of financial trading: whereas in traditional trading, traders were able to monopolize the core activities (the formation of views, position management), algo trading is distributed to a much greater extent across different groups (developers, computer scientists, traders). These groups, each with their own codes, develop their own professional perspectives on algorithms. Not only the technical operationalization but also the economic calculus of financial trading has changed: algo-trading firms already make consequential decisions in the selection and combination of personnel and technologies, and within firms the attribution of performance and of responsibility for risk is changing as well. The analysis of the mobility of persons, teams and codes within organizational fields is identified as an essential direction for future research.
This book critically explores forms and techniques of calculation that emerge with digital computation, and their implications. The contributors demonstrate that digital calculative devices matter beyond their specific functions as they progressively shape, transform and govern all areas of our life. In particular, it addresses such questions as: How does the drive to make sense of, and productively use, large amounts of diverse data, inform the development of new calculative devices, logics and techniques? How do these devices, logics and techniques affect our capacity to decide and to act? How do mundane elements of our physical and virtual existence become data to be analysed and rearranged in complex ensembles of people and things? In what ways are conventional notions of public and private, individual and population, certainty and probability, rule and exception transformed and what are the consequences? How does the search for 'hidden' connections and patterns change our understanding of social relations and associative life? Do contemporary modes of calculation produce new thresholds of calculability and computability, allowing for the improbable or the merely possible to be embraced and acted upon? As contemporary approaches to governing uncertain futures seek to anticipate future events, how are calculation and decision engaged anew? Drawing together different strands of cutting-edge research that is both theoretically sophisticated and empirically rich, this book makes an important contribution to several areas of scholarship, including the emerging social science field of software studies, and will be a vital resource for students and scholars alike. © 2016 selection and editorial material, Louise Amoore and Volha Piotukh. All rights reserved.