The Datafication of #MeToo: Whiteness, Racial Capitalism, and Anti-Violence Technologies


Abstract

This article illustrates how racial capitalism can enhance understandings of data, capital, and inequality through an in-depth study of digital platforms used for intervening in gender-based violence. Specifically, we examine an emergent sociotechnical strategy that uses software platforms and artificial intelligence (AI) chatbots to offer users emergency assistance, education, and a means to report and build evidence against perpetrators. Our analysis details how two reporting apps construct data to support institutionally legible narratives of violence, highlighting overlooked racialised dimensions of the data capital generated through their use. We draw attention to how they reinforce property relations built on extraction and ownership, capital accumulation that reinforces benefits derived through data property relations and ownership, and the commodification of diversity and inclusion. Recognising these patterns are not unique to anti-violence apps, we reflect on how this example aids in understanding how racial capitalism becomes a constitutive element of digital platforms, which more generally extract information from users, rely on complex financial partnerships, and often sustain problematic relationships with the criminal legal system. We conclude with a discussion of how racial capitalism can advance scholarship at the intersections of data and power.
Kathryn Henne, Renee Shelby and Jenna Harb
Keywords: Racial capitalism, data capital, whiteness, AI, sexual violence, data justice
This article is part of the special theme on Data, Power and Racial Formations.
Scholarship increasingly acknowledges how technology, data, and the internet, far from being post-racial or colourblind (Daniels, 2016; Noble and Roberts, 2019), enshrine whiteness (Daniels, 2013). Whiteness, the racial grammar that reinforces logics of white privilege and racial hierarchies, maintains power and status associated with racial categories (Bonilla-Silva, 2012). Research has traced
economic, political, and social harms of these practices,
including how minoritised communities endure heightened
forms of surveillance, face challenges obtaining resources,
and experience more interactions with criminal legal
systems (see Bhatia, 2021; Browne, 2015; Eubanks, 2018;
Jefferson, 2018).

School of Regulation and Global Governance (RegNet), The Australian National University, Canberra, Australia
Gender & Sexuality Studies Program, Northwestern University, Evanston, USA

Corresponding author:
Renee Shelby, Gender & Sexuality Studies Program, Northwestern University, Evanston, USA

Creative Commons Non Commercial CC BY-NC: This article is distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 License, which permits non-commercial use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access page.

Original Research Article
Big Data & Society
July-December 2021: 1-14
© The Author(s) 2021
DOI: 10.1177/20539517211055898

The disproportionate effects of datafication rely on variegated systems of capital, which are enabled through the interplay of data, labour, knowledge, technical expertise, infrastructure, and noninterventionist regulation.
Commodification, extraction, and exploitation, what some refer to as data colonialism (Couldry and Mejias, 2019) or technocolonialism (Madianou, 2019), are not only central to reaping value from data; they also maintain racialised sources of privilege and status associated with accessing and using data.
While scholars recognise data as a core feature of 21st-century capitalism (Sadowski, 2019) and that capitalism perpetuates gendered and racialised oppression (Melamed, 2011), research on big data still reflects little engagement with critical scholarly traditions that explore entanglements of capitalism and racial formation. This article offers an analysis of how racial capitalism, which is central to "commodification for the purpose of deriving social or economic value" (Leong, 2013, 2152), operates in and through data, even when racialised dimensions are not evident on the surface. Our aim is to illustrate how racial capitalism, inclusive of its distinct strands (Bhattacharyya, 2018; Cottom, 2020; Leong, 2013; Ralph and Singhal, 2019; Robinson, 1983; Virdee, 2019), offers an analytic for understanding data practices often misunderstood as neutral or as colourblind. Akin to other analyses of racial capitalism, this article sheds light on how systems of capital accumulation are constitutive of racialised forms of "exploitation, expropriation, and expulsion" (Bhattacharyya, 2018, 37), which include, but can exceed, concerns of whiteness.
To illustrate, we examine two prominent digital platforms used for intervening in sexual harassment and gender-based violence. They are part of an emergent sociotechnical strategy that aims to offer emergency assistance, education, and a means to report and build evidence against perpetrators. While other studies illustrate gendered processes articulated through these anti-violence technologies (Bivens and Hasinoff, 2018), our analysis points to overlooked racialised contours. We highlight how racial capitalism implicates the design and use of anti-violence apps, particularly the data capital generation and accumulation that they enable. Of course, these patterns are not unique to anti-violence apps: platforms more generally extract information from users, rely on complex financial partnerships, and often sustain problematic relationships with criminal legal systems (Jefferson, 2018; Mason and Magnet, 2012; Srnicek, 2017). Distinct in this example is the accumulation of economic value directly out of the act of doing "social good", a set of concerns that have not yet received the same level of attention as wealth generation by large for-profit companies (Magalhães and Couldry, 2021, 353).
In the pages that follow, we trace arguments that detail how systems of racial capitalism contribute to the commodification of race for value generation, including how whiteness often becomes embedded in these processes. We highlight distinguishing features that play out in relation to digital modes of collecting data and the role of data capital (Sadowski, 2019). We then contextualise the rise of anti-violence apps and explain our methods before discussing thematic concerns that emerge in relation to two exemplars in the field: Callisto, a confidential reporting platform whose matching algorithm connects victims of the same perpetrator to identify repeat offenders, and Spot, the first cognitive interview artificial intelligence (AI) chatbot for reporting sexual and other workplace misconduct. Extending our earlier analyses of the data protection challenges of anti-violence apps (Shelby, 2021; Shelby et al., 2021), this article focuses on how both Callisto and Spot evince three key areas identified in the literature on racial capitalism: racialised property relations built on extraction and ownership, capital accumulation that reinforces benefits derived through property relations and ownership, and the commodification of diversity and inclusion promotion. We conclude with a discussion of how these reporting apps operate in tension with stated concerns of data justice.
Situating data and whiteness within concerns of racial capitalism
Scholars have traced whiteness as a normative logic embedded in and sustained through the technological tendency to default to whiteness, that is, racial privilege remains unacknowledged and taken for granted (Schlesinger et al., 2018). For instance, data-based devices, including Amazon's Alexa, and representations of technology in film, media, and marketing exhibit white attributes that range from skin colour, facial features, and voice to language and cultural sensibilities (Cave and Dihal, 2020; Phan, 2019). Commonplace characteristics attributed to technology, such as higher levels of professional status and "smartness", also reflect longstanding hierarchies of racial power: features have been selectively associated with advantaged social positions along axes of class, gender, race, and ethnicity, namely, middle-class, cisgender, masculine, white groups (Cave and Dihal, 2020).
These attributes are not unique to digital technologies. Earlier Critical Race Theory (CRT) analyses demonstrate how whiteness not only has value as a property (as in possession), but also has constitutively taken on characteristics of property (as power relationships) (Harris, 1993). More recent studies reveal whiteness has become an entitlement to social goods, carrying "reputational value" and "the power to exclude" (Bhandar, 2018, 7). Racial capitalism examines these processes of differentiation: it, as outlined by Gargi Bhattacharyya (2018, 103), is not an "account of how capitalism treats different racial groups", but an approach that illuminates constitutive relationships between racism and capitalism (see also Cottom, 2020; Robinson, 1983). It enables investigation of how the inequalities observed in relation to property are not givens or inevitabilities, but rather are "conscious selections regarding structuring of social relations" (Harris, 1993, 1730). In sum, racial capitalism captures more than how whiteness may transfer privilege to those who can acquire it; it scrutinises how racism operates through practices of accumulation, simultaneously shaping the distribution of benefits and forms of exclusion.
Digital data bring distinct dimensions of property relations to the fore through cycles of capital accumulation. User activity yields reporting data, metadata, transaction data, technical data, and other types of information. Such data are then used to shape content, develop and enhance products, and expand digital consumer bases for conversion into economic value (Sadowski, 2019). Their use also aims to "influence behaviour, purchasing patterns, latent demand, and the attention of specifically targeted groups" (Fuchs, 2012, 705), which Andrejevic (2012, 76) frames as a form of intentional control and manipulation. Alienation, appropriation, and coercion shape users' contributions to generating value and creating data capital (Fuchs, 2012), as corporate entities often own and define how their contributions are used and enjoyed. As their labour is appropriated, the lack of informed consent over the full range of data usages sustains the coercive nature of these processes.
Although nuanced, many reflections on data capital often miss how capital is itself racialised. Race can operate as capital, both in a social sense and in the Marxian tradition. In terms of social capital, markets have arisen with value placed on diversity, which, as Nancy Leong (2013, 2181) explains, have enabled white people and institutions to engage in ways that enhance their status by signalling cross-cultural credibility. In terms of Marxian capital, social processes are also important; they facilitate the conversion of labour into commodities with exchange value, not simply things that are useful. According to Leong (2013), racial identity, regardless of whether it reflects the privileges of whiteness, can become a commodity when it is conceded via exchange. Racial capital becomes pronounced through this process. The commodification of racial identity enables value generation and extraction vis-à-vis markets. As capital is often invested and reinvested, white persons and institutions (the dominant class for Marx) can receive additional and cyclical benefits from surplus value.
These observations about racial capital have direct relevance for analyses of data capital. Cottom (2020) argues the logics of racial capitalism are embedded across the platforms enabling the internet economy, suggesting data capital is similarly shaped by them. As most technologies for collecting and processing data rest with companies, the forms of "knowing" associated with big data mining are available only to those with access to the machines, the databases, and the algorithms, which renders them as "advantageously positioned compared to those without such access" (Andrejevic, 2014, 1676). Modes of data appropriation and alienation are notably uneven. Certain sources of data are valued more than others; data are disproportionately solicited and extracted from certain groups, while others tend to be excluded (Sadowski, 2019). Looking at these digitised processes through the lens of racial capitalism sheds light on the racialised contours of the harms caused through extraction and alienation, an observation that our analysis of anti-violence apps illuminates.
Situating the rise of anti-violence apps in response to gender-based violence
Reporting apps rely on data about gender-based violence to assist institutional decision-making and responses. Though functionality varies, they tend to feature algorithms that accumulate data in information escrows and match assault incidents to identify repeat offenders (e.g. Callisto, JDoe) as well as AI chatbots that interview victims and produce incident reports, aggregating data in dashboards to identify organisational patterns of misconduct (e.g. Talk to Spot, #NotMe, Botler AI). Although responding to, and sometimes explicitly appropriating, language from the #MeToo movement, existing analyses of anti-violence apps find they reinforce gendered rape myths (Bivens and Hasinoff, 2018), strengthen surveillance structures (Mason and Magnet, 2012), and support limited legal responses (Sim,
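The matching escrows named above are proprietary, so their actual implementations cannot be examined. As a purely illustrative sketch of the general mechanism described in the literature, not of Callisto's or JDoe's code, an information escrow can be reduced to the following logic: reports remain sealed until more than one entry names the same perpetrator identifier (all names and the hashing scheme here are our hypothetical choices):

```python
import hashlib


class MatchingEscrow:
    """Simplified information escrow: reports stay sealed until two or
    more survivors name the same perpetrator identifier."""

    def __init__(self):
        # Maps a hashed perpetrator identifier to its sealed reports.
        self._entries = {}

    @staticmethod
    def _key(perpetrator_id: str) -> str:
        # Normalise and hash the identifier (e.g. a social media URL)
        # so the escrow operator never stores it in plain text.
        normalised = perpetrator_id.strip().lower()
        return hashlib.sha256(normalised.encode()).hexdigest()

    def submit(self, perpetrator_id: str, sealed_report: str) -> list:
        """File a sealed report; return all matched reports once this
        identifier has been named by more than one entry."""
        key = self._key(perpetrator_id)
        self._entries.setdefault(key, []).append(sealed_report)
        matches = self._entries[key]
        return list(matches) if len(matches) > 1 else []


escrow = MatchingEscrow()
# A first report stays sealed: no match is released.
assert escrow.submit("twitter.com/offender", "report-A") == []
# A second report naming the same identifier triggers a match.
assert escrow.submit("Twitter.com/Offender ", "report-B") == ["report-A", "report-B"]
```

Note that even this minimal sketch exhibits the retention dynamic discussed later in the article: the escrow must store every entry indefinitely for the matching feature to work at all.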
The broader market for anti-violence apps emerged after 2010, driven by advances in smart technology and a U.S. government call for software innovators to leverage mobile technology to address sexual violence. The Obama administration's 2011 Apps Against Abuse competition sparked venture capitalist interest in the possibilities of anti-violence technology, bolstered by the emerging "smart" personal safety industry and its promotion of machine learning, AI, and big data analysis. The 2017 mainstream #MeToo movement increased scrutiny of the failings of legal institutions and bureaucracies to protect survivors, prompting renewed pleas for innovators to develop accessible technologies to increase reporting (Glaser, 2016). In light of high-profile repeat offender cases, such as Larry Nassar and Harvey Weinstein, apps seemed an effective tool to produce evidence that would counter authorities' apathy towards survivors (McPhee and Dowden, 2018). Numerous apps have since proliferated, particularly in higher-income countries, with the personal smart safety and security market ballooning to 2904 million USD in 2019 (Market Research Future, 2020). They are now implemented in multi-national corporations, such as Instacart, Kickstarter, and Zillow, and across university campuses in North America.
The embrace of reporting apps offers a site for examining the value they provide organisations in the post-#MeToo economy, a viral landscape that critical observers argue has been steered by white feminism and the desire for state and institutional redress to personal injury (see Phipps, 2019). For example, as part of the 2018 report, Transforming Workplace Culture in the Era of #MeToo, #BlackLivesMatter, and More, Seyfarth, a large corporate law firm, endorses reporting apps, including Callisto and Talk to Spot, as a strategy for employers to "react quickly or bear the brunt of public backlash and shareholder disapproval" (Gesinsky et al., 2018, 2). The #NotMe reporting app promotes its value as helping companies "reverse the trends" of costly employee absences, lawsuits, and turnover. AllVoices, an anonymous reporting platform for businesses, indicates its goal is to support companies in creating healthy, safe feedback cultures that directly lead to more productive and engaged employees, higher productivity, and less turnover. These framings underscore how apps are imagined as projects that protect organisations, increase worker productivity, and ensure a competitive edge.
The promise of these apps relies on a presumption that technology can optimise a survivor's report by collecting and transforming personal accounts of violence into institutionally legible data. Reporting, however, cannot be abstracted from material conditions in which legibility is understood. Feminist scholars have traced how racist formations of sexism shape understandings of what gender violence is and who is legible as a survivor (e.g. Combahee River Collective, 1982; Davis, 1981; McGuire, 2010). For much of U.S. history, law has not provided formal protection to all women, excluding women of colour (Sommerville, 2004). Even though whiteness is elastic and shifting, it remains a property of gender violence legibility (Cacho, 2014). Reflecting on the figure of the survivor in the viral #MeToo movement, Alison Phipps (2019, 17) explains that whiteness is still central to shaping narratives of "fragility and woundability", that is, the assumed purity and vulnerability of white women. People of colour, as well as gay, lesbian, bisexual, transgender, and queer (LGBTQ) people, are often not afforded this presumption of innocence. Instead, they are more likely to be blamed for their assaults than their white and heterosexual cisgender counterparts (Esqueda and Harrison, 2005). Anti-violence interventions that rely on hegemonic notions of gender tacitly reify experiences of privileged survivors (Richie, 2012), reinforcing a false assumption there is a universal experience of gender-based violence (Crenshaw, 1991).
Despite these observations, reporting apps respond to the problem of violence through a universalising design that imagines algorithms, machine learning, and data analytics as generating new institutional power relations untethered to the inequitable politics of survivorship. Negating how class, dis/ability, racism, and sexuality particularise reporting, they present computational modalities as viable solutions to the systemic inaction of authorities and devaluing of survivors' accounts. Promises of cryptography-based individual privacy, user-friendly and efficient interfaces, and institutional dashboards render technical responses and big data production as modes of increasing accountability. These design features also subject users to processes of data collection, accumulation, and circulation. In doing so, the belief that digital reporting will present survivors' accounts in ways that lead to the recognition and support of victims, including the realisation of their right to bodily integrity, encourages participation in what are extractive practices.
In addition to these extractive logics, racialised dynamics that contribute to the inequities among survivors carry over into processes of generating data capital. As Katz (2020, 8) notes, AI is a technology that works to "flexibly serve a social order premised on white supremacy". Reporting apps do not escape this critique, even when they mobilise diversity and inclusion discourses. They both mobilise and obscure racialised experiences in datafication, mirroring the recognised contradictions of offline racial and gender politics (Ferguson and Hong, 2012). These tools exemplify a longstanding trend: that contemporary capitalism "has depended as much on the production and negotiation of difference as it has through enforcing sameness, standardization, and homogenization" (Hall, 2017, 118-119; quoted in Virdee, 2019, 9). Their contradictions are the outgrowth of constitutive relationships between data, capital, and inequality as well as how racial capitalism underscores the interplay between them.
Our argument draws upon a qualitative analysis of two popular reporting apps, Callisto and Talk to Spot. They rely on similar logics: accumulating data for decision-making about gender-based violence and using collected data to support institutional action. For Callisto, they are to identify repeat offenders; for Talk to Spot, they are to optimise HR responses. Project Callisto is a 501(c)3 organisation whose mission is to "create technology to combat sexual assault, empower survivors, and advance justice". Initially formed in 2011, Callisto refined their reporting app based on survivor feedback in 2020. This app allows university users to enter a record into their matching algorithm to identify repeat offenders, review resources and potential actions to consider (e.g. report to police or talk to an attorney), and create a record of the assault using a standardised form.
In contrast, Talk to Spot is a chatbot meant to encourage employees to speak out about discrimination, harassment, bullying, misconduct, and other inappropriate behaviour. Founded in 2017, Talk to Spot's chatbot feeds into a webapp where managers and HR professionals field reports. The system is designed for organisations and has been adopted by various corporations, including ASOS, Davita Dialysis, and Kickstarter.
While using two examples of anti-violence technologies aids in illustrating how racial capitalist formations operate in diverse ways, this approach did limit the scope of our analysis to primarily U.S. contexts. We felt the benefits outweighed the tradeoffs: the combined analysis of Callisto, which is supported by a non-profit organisation, and Talk to Spot, which has a for-profit model, revealed that both tools generate data capital in ways that embed whiteness and contribute to the commodification of race for value. For each app, we collected textual and qualitative data, including the app websites and advertisements, legal terms of use (e.g. privacy statements), financial and technical reports, functionality assessments, and media coverage since 2011.
We located media and news coverage of Callisto and Talk to Spot through targeted NexisUni and web searches of technology publications and other mixed media, including Ted Talks and gender-based violence-themed podcasts. In total, we reviewed 273 articles and 15 videos (including podcasts) published between 2015 and 2021. Where possible, our research team conducted "walkthroughs" of the apps, making step-by-step observations of the apps' features and flows to understand the expected functionality and cultural messages conveyed in their design (Light et al., 2018). As Talk to Spot is for organisations, we had to review videos produced by the company to document user and employer interfaces and the processes for sample reports. Given the multimedia nature of the data, we open-coded documents in vivo using Atlas.TI. In carrying out data analysis, we also reflected on insights gleaned through archival work that began in 2017 and autoethnographic insights from being approached by anti-violence technology developers and funding bodies.
The analysis presented here draws on materials that contextualised big data practices in relation to racial capitalism and findings that reflected commonalities and gaps across the text. We iteratively revisited our analytical themes of commodification, data extraction, ownership, property relations, and instances of racialisation through the lens of gendered racial capitalism to isolate patterns and distinctions in and across the collected data (Timmermans and Tavory, 2012). In line with qualitative analyses of big data cultures and practices, this approach examines "the institutionalised routines, habits, and knowledge practices of the app publishers with respect to data" (Albury et al., 2017, 2). It also enabled us to identify features that aligned with insights about racial capitalism, informing the structure and argument of the article. The inherent opacity that comes with studying proprietary technology, however, prevented us from closely examining algorithms and functionality, aspects often considered trade secrets. We were, however, able to observe some operative intricacies of these technologies and their processes of generating and extracting value from data capital. In the next sections, we examine how reporting apps produce data capital that is not simply gendered but also racialised.
Racial capitalism in and through the datafication of gender violence reporting
Reporting apps mediate and control extractive data flows among victims, organisations, and third-party data brokers in ways that obscure and mobilise racial dynamics. While their datafication of violence abstracts information about the assault and identities of survivors, data accumulation by dispossession, a practice of producing privatised data capital (Thatcher et al., 2016, 995), facilitates the commodification of social hierarchies as reporting data comes to take on value. Our scrutiny of these apps reveals practices of racial capitalism emerging in three ways, which are often overlapping and interrelated: (1) through racialised property relations built on data extraction and ownership; (2) through capital accumulation that reinforces benefits derived through property relations and data ownership; and (3) through the commodification of diversity and inclusion promotion. These techniques of racial capitalism operate as mechanisms of coercive power in the data economy by tapping into survivors' desire for institutional recognition and action.
Property relations through coercive data extraction and data ownership
Critical analyses of property relations and ownership highlight how law bolsters the generation of racial capital and inequality by securing property rights, enabling the enforcement of contracts, and creating subjects (Bhandar, 2018). In the United States, various legal frameworks articulate the terms of data ownership, generally privileging the rights of data controlling and processing entities. For example, the Privacy Policies and Terms of Use used by Callisto and Spot facilitate data brokers' property interests by enshrining their right to own, share, and retain personal and technical data through a "notice and choice" framework. While this framework purports to put individuals in charge of the collection and use of their personal information (Reidenberg et al., 2014, 489), the consent request facilitates claims to data ownership that ensure use value for companies with access.
With Spot, collected personal information is controlled by All Turtles Corporation, San Francisco, California (Spot, 2018); users agree to permit All Turtles to "collect, access, process, and use a variety of information when you use Spot for Teams" (Spot, 2020). To encourage consent, both Callisto and Spot emphasise that their technologies increase disclosure: Callisto (2018) notes survivors are six times more likely to report after visiting the app, and Spot (n.d.) reports a 70-80% follow-up response rate from employees who use the software. Although appealing to the desire for institutional legibility and accountability, neither app makes the processes and outcomes of its services transparent. Obtaining users' consent becomes explicitly coercive when apps are institutionalised within an organisation. Consider, for instance, when apps become the formal means of reporting harassment and other workplace conduct to HR. They require survivors to agree to service terms that demand data about themselves and acts of violence to enable the processing and circulation of data. The generation of data is the key source of these apps' value, yet the extraction of data from users relies on opacity, obfuscation, and information asymmetry. While survivors may manually enter information through the digital interface, the apps can collect much more, such as user analytic data that can be personal and technical in nature, anonymised IP address, device, operating system, and browser information, and statistics regarding how the user engages with the website (e.g. areas of the website visited, number of clicks, time spent on each page).
Callisto encrypts entered information so it cannot be shared with its staff and Legal Options Counsellors unless permission is granted. Stored information regarding an alleged offender includes unique identifiers such as "your telephone number, email address, or social media account information" (Callisto, 2020). Spot notes the only mandatory personal data required is one's email; however, it enables the collection of "optional" information, such as name, demographics (e.g. gender, occupation, but not race), information about their employer, name of the alleged offender, and details of the event. In this case, collecting information about gender but not race amalgamates diverse identities, creating consumable data subjects that may align with the privacy norms of the internet economy but not the goal of transformative gender violence justice. By annexing race to the data shadows (Leonelli et al., 2017), these practices whitewash subjects and their experiences of violence, which are inextricably linked to interlocking systems of oppression. In short, reporting apps actively racialise data as if subjects are the same, that is, as normatively white.
The design of reporting apps, particularly how they collect information from users, ensures they are the primary generators and conduits of data flows. In fact, Callisto users must repeatedly engage with the app to derive direct benefit or opt out, for example, to be matched with another victim of the same perpetrator or to have a legal counsellor advise on the best recourse. If one's information matches with another user's submission in the system, the entry will not be deleted until the Legal Options Counsellor closes your case, and survivors must contact their assigned Counsellor if they wish to close their case (Callisto, 2020). As such, it reflects one mechanism through which reporting apps turn survivors into objects of data collection rather than subjects with agency and power over their data. As an organisational and enterprise technology, Spot encourages in-app communications by offering multiple channels for HR and employees to "continue conversations".
Although Spots storage terms differ slightly, it, like
Callisto, relies on the extraction of data, which can be
reused and reinvestedeven in cases where users do not
benet directly. Further, reporting apps promise data will
be anonymised, encrypted, and condential to encourage
users to submit information. While these design features
prompt users to engage the apps in pursuit of an algorithmic
match or institutional accountability that may never occur,
Callisto and Spot benefit from the exchange value of the
data capital they circulate in the broader digital economy.
These practices attest that reporting app privacy policies and data governance are less about protecting users and more about enshrining property relations that create more capital using survivors' data. This distinction is important.
The realities of how data become commodified tip their use
value in favour of apps, organisations, and data brokers, a
central feature of both data capital and systems of racial
capitalism (Leong, 2013; Sadowski, 2019). For instance,
in an interview for The New Yorker, Jessica Ladd, Callisto's founder, was asked "Will Callisto retain the reports submitted by victims who may no longer be enrolled in the service?" Ladd replied, "Don't tell anybody, but yes" (Mead, 2018). Callisto's Privacy Policy suggests it can be: "it is therefore important that our users' entries are maintained and stored over time to enable the matching feature. This is why even after your account is terminated, some or all of your data may still be stored on our platform." The power of data property rights, explained here as a public secret, is the ability to exercise both data accumulation and dispossession, regardless of users' desires for their information.
Extracting data along these terms also ensures surplus
value through the sharing of data with third parties, the
direct benefits of which do not accrue to users. MailChimp,
Callistos email service provider, facilitates data sharing
with social media platforms: users with the "Social Profiles" feature turned on actually enable "additional information" to be exchanged and more widely circulated (Callisto, 2020). Its use of Google Analytics has similar implications: "the sharing of information collected by Google Analytics about your visits to our site and other sites is governed by the Google Analytics Terms of Use"
(Callisto, 2020). While Google Analytics can provide report-
ing apps with information about how people interact with
their products in ways that make them more accessible to sur-
vivors, Google Analytics and the other big data technologies
they talk to through APIs are fundamentally marketing tech-
nologies. Non-prots and apps might use them for social
good, but the data they circulate flows into the big data
economy. These policies thus reect distributive politics
characteristic of Big Tech markets, with information
6 Big Data & Society
flowing from users to reporting apps to corporate partners who can translate and repurpose data into profit. Callisto
and Spot share data with Facebook, Google, Mailchimp,
and Twitter; the Spot AI further integrates with business
apps, such as BambooHR, Namely, Slack, Workday, and
Zenefits. As reporting apps and technology companies can continuously draw out new value from data, these arrangements reflect asymmetrical power relationships of data
capital ownership.
Privacy Policies and Terms of Service operate as forms
of law that facilitate data brokers' interests and privileges.
For example, they shift liability by detailing how parent
companies, despite owning and benefiting from data capital, are not responsible for data leaks, identification or loss, or poor HR responses. In doing so, they elide culpability for harms arising from authorised access to data or institutional failure. In addition, the acknowledged problem of
"TL;DR" (too long, didn't read) in privacy notices (Obar
and Oeldorf-Hirsch, 2020) supports these coercive capital-
ist conditions. The resulting dispersion of data ownership
and responsibility puts the onus on the actual producers
of data (survivors) to understand, locate, and take control
of their reporting information. Scrutinising the distribution
of benets aids in unveiling the racialised dimensions of
these dynamics, as discussed in the next sections.
Data capital accumulation as reinforcing benefits
derived from data ownership
Reporting apps and their partners' advantageous position within systems generating data capital reflect the overlapping and negotiated nature of ownership. Capital accumulation reinforces benefits for data controlling and owning
entities in three interrelated ways: 1) through data capital
extracted from users, 2) through exchange and surplus
value gained by sharing and investing with other users
and consumers (like businesses and universities), and 3)
through data reinvestment in developing and improving
reporting websites, their services, and business decisions.
The racialised contours of these mechanisms are not neces-
sarily evident on the surface, because the naturalisation of
ownership masks how property relations operate in ways "in which selected private interests are protected and upheld" (Harris, 1993, 1730).
As reporting apps invest and reinvest the data capital
they own, they reap the benefits enabled by the surplus value of those investments: in this case, Callisto, Spot, and their corporate collaborators. In a CNN Money article, Ladd, Callisto's founder, acknowledges seeing a clear market opportunity but "success won't be monetary. Profiting off of rape is something that makes people very uncomfortable" (O'Brien, 2017). She describes how she
sees a need to shift how organisations investigate wrong-
doing, including police brutality and immigrant
deportations, envisioning Callisto as a catalyst (McHugh,
2019). To encourage this normative shift, Callisto's code is available on GitHub for others to copy the code and
implement in their HR systems. This push, however,
negates how human experiences, in this case those directly associated with harassment and violence, are
annexed and reorganised within the broader data
economy. In addition to whitewashing data under the pre-
tence of data privacy, Callisto and Spot do not provide
support for those who are historically illegible as victims
of violence, such as non-white, disabled, and LGBTQ
people. In open sourcing their matching algorithm,
Callisto is not simply sharing; it is seeking to institutionalise
digital reporting systems and create more value from data
about usersexperiences of violence and from interactions
with these systems.
These practices reflect what Victor Ray (2019, 42) calls "racialised decoupling", which enables organisations to
maintain legitimacy and appear race neutral or even pro-
gressive while doing little to intervene in pervasive patterns
of racial inequality. As universities and corporations, often unmarked spaces of whiteness (Moore, 2008), institutionalise reporting apps, racialised decoupling is central to
how data capital benefits reporting apps, these predominantly white organisations, and Big Tech entities. While
Callisto and Spot promote the sentiment that "your data should work for you", they
sustain markets that rely on gender-based violence to
enhance data about users by promoting a "better data" paradigm. The business model requires more data to generate
better data, incentivising expansion that can amount to a
form of predatory inclusion (see Cottom, 2020). That is,
survivors of violence may be better positioned to report
through apps, but they do so on exploitative terms driven
by data dispossession, with no publicly available evidence
of effective institutional action based on their use.
Revisiting Ladd's vision, these platforms can be used to
respond to other abuses, including police brutality and
immigrant deportations, which increases markets and
opportunities for the exchange of data capital (O'Brien,
2017). As people of colour disproportionately experience
gender-based violence, police violence, and immigrant
deportations, these systems of data extraction often target
them but do not challenge white racialised organisations
that are the beneficiaries of data's use value.
Consider, for example, how media conglomerate
Thomson Reuters has sold access to their private database containing more than 400 million names, addresses, and service records from more than 80 utility companies covering "all the staples of modern life" to U.S. Immigration and Customs Enforcement and other law enforcement agencies
(Hartwell, 2021). This example illustrates how data are
shared across markets and in ways that can fortify surveil-
lance regimes, even if users had not intended their data to
be used for those purposes. In the case of reporting apps,
violence enabled through gender inequity, structural racism,
and exclusionary citizenship can become sources from which
data markets can extract value, to the benefit of institutions,
such as law enforcement, that propagate racialised sexual
violence (Purvis and Blanco, 2020; Ritchie, 2017). In
doing so, the "better data" paradigm does not necessarily
support justice and accountability; it does, however,
support a broader marketplace, which can reinscribe pro-
foundly racialised congurations of power between users,
data, corporate actors, and the state apparatus.
Representatives of these apps have made public state-
ments suggesting that they understand corporate and institu-
tional consumers of these apps might privilege the value of
data over direct services for survivors. According to Kate
L. Lazarov, Callisto's project officer, recording information about an assault rather than reporting it "could be valuable for colleges" (Fabris, 2015). Value generation relies on
the circulation and exchange of standardised information
about survivors (and perpetrators if named in the reports)
within organisations, across digital survivor communities,
and to HR and legal systems. Organisational cultures,
however, can isolate employees and foreclose conversation
among co-workers about harassment and other workplace
misconduct. In the United States, it is already common to
"avoid the harasser, deny, or downplay the gravity of the situation, or attempt to ignore, forget, or endure the behavior" (Feldblum and Lipnic, 2016, n.p.), with intersectional
analyses demonstrating that Black women are dispropor-
tionately represented in sexual harassment charges across
industries (e.g. Rossie et al., 2018). A report by the
National Women's Law Center indicates 35.8% of women who filed charges alleging harassment also reported workplace retaliation in which they risked a pay cut, job loss, damaging future career options, and developing a reputation as a "troublemaker" (Rossie et al., 2018). The fear of retaliation is
a central reason why employees do not report, especially
among those working in low-wage jobs. Although apps
purport to increase incident reports, they do not offer protec-
tion against retaliation. By "black boxing" submissions and making their processing of complaints opaque, they can, in fact, affirm cultures of silence.
In contributing to practices at universities and in businesses, reporting apps can cultivate new digital survivor
subjects, which produce opportunities for potentially
biased decision-making through data analytics. As Crooks
(2021: n.p.) notes, "data work, like other bureaucratic processes, is race work . . . by virtue of the interpretive and calculative mechanisms which pre-emptively translate racial projects into questions of computation". By design,
reporting apps elide concerns of racism in gender-based violence. They confer benefits that support the operation of
racialised organisations in their current form, not institu-
tional changes that counteract violence. For example,
Spot aims to create "workflows for sensitive issues", which has expanded to include other workplace problems from Covid-19 concerns to corrective action to bullying. The company offers a case management platform to "investigate reports and claims" and get "actionable insights on the health of your organization" through reporting on
trends. Similarly, Callisto provides their university partners
with aggregated data reports twice a year meant to offer
"better data" for the existing offline reporting infrastructure
of survivor advocates, Title IX coordinators, and counsel-
lors (EEOC, n.d.). While they present these practices as
modes of catalysing organisational change by providing
"better data" to upstream stakeholders, their circulation of whitewashed data reifies modes of racialised decoupling.
There is also evidence that these apps create incentives
for predatory practices that sustain corporate power exer-
cised by Big Tech. Both Callisto and Spot use persistent
cookies, which remain on the reporter's computer until
deleted. As explicitly surveillance-oriented tools, cookies
monitor, track, and log user activity, and in persistent
form, "could enable reidentification of those users when they returned to the site later on" (Cohen, 2019, 54). Data
on usage trends are collected, stored, and analysed for
several stated reasons, such as to "understand how our users are using our website and platform, and which features or content appear to be more useful and have the most impact" (Callisto Privacy Policy). Spot similarly
notes, "We may use or disclose the personal information we collect … For testing, research, analysis, and product development, including to develop and improve our websites and services" (Spot Privacy Policy). These statements
reveal how reporting apps continually aim to generate value
from data capital through product optimisation, which
translates into value through "better data" products. In
doing so, organisations, including those selling big data
analytics such as Google Analytics and Hotjar, rather than
victims, stand to benefit most. Anti-violence apps, particularly those designed for corporate entities, enable significant reinvestment, and not in ways that aim to provide direct benefits for survivors. By prioritising commodification and
capital accumulation, rather than data justice, the digital
infrastructure of racialised organisations may become more robust; however, their discriminatory features can go unchecked.
Racial commodification through the promotion of diversity and inclusion
As other analyses of racial capitalism attest, non-white
racial identities can become commodities that predomi-
nantly white institutions instrumentalise as a form of
social capital (Leong, 2013; Ralph and Singhal, 2019). As
public relations are integral to capitalism (Madianou, 2019),
promotional materials often showcase representations of racialised persons as a tool for reaching these groups as well as a "sign of their potential, or virtual, market share" (Marino, 2014,
7). Reporting apps are no exception. Although the functional
mechanisms of Callisto and Spot may whitewash data, their
marketing mobilises "diversity and inclusion" rhetoric.
Callisto has worked to cultivate a reputation as an orga-
nisation that upholds and helps others enact diversity and
inclusion mandates. As it has recruited a more diverse lea-
dership over time, its visual promotions have evolved to
support the perception that their product is race conscious
and supports equity. In fact, some universities consider
Callisto part of efforts to expand diversity and inclusion
initiatives that "help marginalized groups on campus, including victims of sexual assault and violence" (Pollack, 2020).
These visualisations centre representations of non-white
people to support advertising efforts, evincing a kind of
tokenism that "'leverages undervalued identities' and preserves commodified values of race 'by parading an exception'" (Leong, 2013, 2195). Callisto's abstract illustrations
cultivate an image of racial inclusion by emphasising
bodily features stereotypically ascribed to Black and Brown
people, including curly hair and fuller lips (Figures 1 and 2).
In contrast to this racialised imagery, the design of
Callisto does not facilitate modes of attending to how
people of colour disproportionately experience violence.
Appealing to racial inclusion thus works in the service of
marketing the app, not in tailoring its services for survivors.
Carefully curated, this aesthetic conveys promotions that
may appeal to people of colour while still being palatable
for primarily white academic institutions and investors. Its
commodification of race is of interest to institutions that
want to support an image of a diverse user base, doing so
without disrupting the racial status quo. Moreover, as scho-
lars of racial capitalism explain, these kinds of tokenising
practices facilitate a framing of white racialised universities
as "nonracist and culturally competent actor[s]" (Leong, 2013, 2013).
While Callisto actively employs racial difference to
support its market expansion, Spot's advertising targets cor-
porate clients, which are primarily predominantly white
organisations that have had problems with harassment
(e.g. Ramos Law, 2017). Rather than explicitly confront
issues of age, disability, gender, or race, Spot depicts its
product through images of gender and racial ambiguity.
Its promotional videos utilise abstract animations of
humans that are coloured red and blue to signal a bifurca-
tion between women and men. In picking primary colours
rather than skin tones to differentiate these gures in their
advertisements, Spot promotes a colourblind approach that
does not "see" race or "choose" racial groups. It also presents
its AI as "completely unbiased" when addressing workplace issues, because "Spot is a bot and not a human, it cannot judge or assess you" (Mercer, 2019). The discourse that the Spot AI "listens without judgment" (Childs, 2019)
invites users to produce data in a context somehow discon-
nected from the gendered, heteronormative, and racialised
environments that users navigate. This marketing, of
course, is misleading, as digital tools can encode gendered,
Figure 1. Callisto homepage (source: 2021).
racial, and other prejudicial ideas and values into the system
(Noble, 2018).
This race-neutral rhetoric carries over into Spot's unmarked AI chatbot. Spot CEO Jessica Collier has stated explicitly that its "genderless, personality-neutral interviewer" is to effectively eliminate reporting bias and the discomfort of telling another person ("of another gender, race, or background") about traumatic experiences (Childs, 2019).
Though asserting neutrality, Spot is nonetheless a white
racialised AI: its seemingly benevolent design is to
enhance the extraction of data through a performance of an
educated, literate, middle-class (white) persona without indi-
cation of a lived culture or history (see Cave and Dihal, 2020;
(Phan, 2019). As such, Spot's proactive framing of race as unmarked "equate[s] the dominant cultural identity group with a universalized vision of humanity" (Marino, 2014,
3). The racial capitalistic elements become clear when con-
sidering how these appeals to whiteness aim to enhance
capital accumulation and value.
Spot's embrace of normative whiteness extends into its
approach to diversity. One of its turnkey business solutions
is a one-hour diversity, equity, and inclusion (DEI) training,
delivered to employees in "10-min episodes [that] are more like Instagram stories than the cringe-worthy videos we all love to hate", covering topics primarily related to gender discrimination, sexual orientation, and unwelcome conduct. Spot's DEI training has an additional one-hour training for supervisors, including one covering how to "debunk false reports".
Such training seems antithetical to the apps' mission to help facilitate delicate information, because it reinforces the
myth that false reporting is rampant by teaching supervisors
to recognise "credibility discounts", which are always
already classed, gendered, and racialised (Tuerkheimer,
2017). Reinforcing claims that supervisors need to get at
"the truth" of the matter obscures the nature of the systemic
problem, which is not false reporting, but poor institutional
action and lack of accountability.
These tensions reflect how priorities of data extraction and value generation often undermine the apps' stated goal of empowering users. Although different in their
approaches, both Callisto and Spot commodify race. It sup-
ports their shared aim of enhancing their ability to produce
data on experiences of harassment and violence in the
pursuit of an expanded consumer base and access to new
markets. The users of reporting apps are further alienated
from the benefits of capital produced from their labour.
The value derived from these apps is not simply eco-
nomic. Racial commodication generates capital that has
social, cultural, and human dimensions. Capital that supports
favourable social status and reputation can also attract new
economic resources. For example, in 2018, Callisto received
the Skoll Award for Social Entrepreneurship, which awards
"social entrepreneurs who take the risks to right the most unjust systems" with $1.25 million USD in grant funding (Callisto, 2018). While grant funds support Callisto's operation, such awards also provide social capital through reputation enhancement and extended networks. More broadly,
Figure 2. Callisto "about" page (source: 2021).
prestigious awards, which often come from primarily white
foundation funders, can be leveraged in other fundraising
efforts in which having prior awards is helpful in winning
new awards and grants from other primarily white funders.
These practices illuminate additional dimensions of the
complex racialised dynamics in which Callisto and Spot
are situated, which exceed inequalities observed in relation
to reporting. Not only do apps fail to explicitly confront
power structures that uphold racial inequalities, but they
also maintain, and expand, ties to legal systems that perpetuate harms disproportionately experienced by people of
colour. Take, for example, Spot's co-founder, Julia Shaw, a
memory expert and psychologist who previously trained
police and military personnel to conduct interviews on emo-
tional events (Reynolds, 2019). She has repeatedly empha-
sised the AI replicates police interview techniques that "use cognitive science to focus on neutrality and gather factual, detailed evidence from memories" (e.g. Byers, 2019;
Reynolds, 2019). In a Digital HR Leaders podcast, Shaw
describes, "just like a police investigation, if you have an HR investigation, you want as high-quality information as possible and you want the evidence that you're using, whatever it is, to be high quality" (Green, 2020). Shaw falsely
presents police techniques as neutral and draws upon
claims to scientific authority, expertise, and the credibility
conferred upon police interviewing to position the AI as
offering more reliable evidence (as opposed to forcing or
leading responses to questioning). In short, Spot actively
brings carceral techniques into corporate HR processes.
Callisto is similarly entangled in civil and criminal legal
processes. Although its original purpose was to use its
matching algorithm to report, track, and hold accountable
repeat offenders through Title IX claims, Callisto has since
incorporated Legal Options Counsellors as the key resource
for users to navigate modes of recourse. These actors open
the door to more legal actions, including getting "a restraining order", engaging "a criminal justice process or a civil lawsuit", or legally mediated restorative justice. While
Callisto suggests twelve possible avenues for survivors,
their choice to call the survivor liaison a "Legal Options Counsellor" reinscribes legal forms of recourse as preferred,
negating how various legal processes, whether administrative, civil, or criminal, fail to provide adequate support for complainants who are people of colour, have a disability, and/or are LGBTQ (e.g. Kim, 2014; Spade, 2013). Here again,
by prioritising value generation and supporting existing
systems, these apps work to uphold, rather than contest,
racialised power relations.
Through an analysis of reporting apps, this article illustrates
how emergent questions of data capital should be consid-
ered in relation to interlocking systems of domination and
oppression. It illuminates how an intersectional sensibility
might enhance the analysis of technological solutions to
criminological concerns specifically and data capital more
generally (see Henne and Troshynski, 2019). Our aim,
however, is not simply to demonstrate the possibilities of
intersectional analyses of data. Rather, it is to provide a
grounded example of how racial capitalism persists
through a gendered project, sustaining racialised regimes
of value extraction and exchange. In this case, as datafication transforms social action, experience, and identity,
these apps support modes of accumulating and circulating
capital in ways that do not necessarily translate into direct
benefit or use value for survivors who are the sources of
this valuable information.
Our findings raise serious questions about how reporting
platforms comprise a new front-line approach for addres-
sing gender-based violence. While Callisto and Spot are
promoted as empowerment tools for survivors, examining
them through the lens of racial capitalism aids in under-
standing how they fail to challenge systems of oppression
that contribute to gender-based violence and the inequitable
realities of reporting. In this case, apps can be used by orga-
nisations and institutions to avoid making meaningful
changes to structures that enable harassment and discrimin-
ation. This seeming paradox reects larger societal shifts in
which the control of knowledge is fast becoming the foun-
dation upon which economic, legal, political, and social
influence is exercised (Haggart et al., 2019).
Having illustrated how racism operates even when
something appears race neutral, this case study is instructive
for future critical and feminist interventions. Data policies,
infrastructures, and practices are critical sites of justice. If
conceptualisations of data capital are to contribute to libera-
tory agendas, they must unveil and confront how racialised
property relations are inextricably linked to its generation.
Anonymising data does not offer sufficient means of protection (see Shelby et al., 2021), as it does not prevent powerful actors from exchanging, investing, and ultimately benefiting from mined data. In the context of gender-based
violence, the use of digital apps may promise better evi-
dence, but they also add new layers to pursuing justice
through formal remedies, which are already often opaque
and at times invisible to survivors. For such interventions
to effectively serve survivors, these data infrastructures must centre the needs and rights of historically excluded groups.
Recognising that scholars concerned with big data, plat-
form governance, and surveillance are asking pressing
questions about how data extend and challenge understand-
ings of contemporary capitalism, we hope this article
serves as a cautionary example of why not to accept the
appearance of race neutrality when pursuing this critical
line of research. Data capital tends to benet predominantly
white institutions and organisations because it can continu-
ally be reinvested and redeployed in the service of actors
that have few to no incentives to ensure it has use or
exchange value for users. Through its focus on constitutive
relationships, racial capitalism offers an analytic that can
strengthen and expand analyses of whiteness in
data-intensive applications, supporting more robust scholar-
ship at the intersections of data and power. It illuminates
how racism is embedded in inequitable data relations and
how gendered data projects can advance racist processes. In all, it points to the need to better adapt intersectional analysis for data studies.
Acknowledgements
The authors would like to thank the anonymous reviewers for their comments and ideas that contributed to this paper's development.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support
for the research, authorship, and/or publication of this article:
This work was supported by the Mellon/American Council of
Learned Societies Fellowship and the Australian National
University Futures Scheme.
ORCID iD
Renee Shelby
Notes
1. At the time of writing, we could not identify any reporting apps
that collect user information on disability, ethnicity, race, or
sexual orientation (including #NotMe, AllVoices, Botler AI,
Callisto, Hello Cass, JDoe, SafeSport, Talk to Spot) or that
suggest supports tailored for historically illegible survivors.
2. While the reporting feature is public, the app's matching
system is still limited to certain universities.
3. Callisto defines a "Legal Options Counsellor" as an attorney,
vetted by Callisto, who helps users navigate their options for
taking action.
4. Adding racial categorisation would not correct the disposses-
sive qualities of data capital. Without recognising sociohistori-
cal context and power, it "re-inscribes violence on communities that already experience structural violence" (Hanna et al., 2020, 502), becoming another axis for value generation.
5. At the time of writing, Spot has not shared the data it uses to
train its AI or what steps it takes to prevent replicating racia-
lised problems with ofine reporting.
References
Albury K, Burgess J, Light B, et al. (2017) Data cultures of
mobile dating and hook-up apps: Emerging issues for
critical social science research. Big Data & Society 4(2):
Andrejevic M (2012) Exploitation in the data mine. In: Fuchs F,
Boersma K, Albrechtslund A and Sandoval M (eds) Internet
and Surveillance: The Challenges of web 2.0 and Social
media. London: Routledge, 71–88.
Andrejevic M (2014) The big data divide. International Journal of
Communication 8(1): 1673–1689.
Bhandar B (2018) Colonial Lives of Property: Law, Land, and
Racial Regimes of Ownership. Durham: Duke University
Bhatia M (2021) Racial surveillance and the mental health
impacts of electronic monitoring on migrants. Race &
Class 62(3): 18–36.
Bhattacharyya G (2018) Rethinking Racial Capitalism: Questions
of Reproduction and Survival. Lanham, MD: Rowman & Littlefield.
Bivens R and Hasinoff AA (2018) Rape: Is there an app for that?
An empirical analysis of the features of anti-rape apps.
Information, Communication & Society 21(8): 1050–1067.
Bonilla-Silva E (2012) The invisible weight of whiteness: The
racial grammar of everyday life in contemporary America.
Ethnic and Racial Studies 35(2): 173–194.
Browne S (2015) Dark Matters: On the Surveillance of Blackness.
Durham: Duke University Press.
Byers A (2019) Yukon Human Rights Commission adopts online
tool to report harassment and discrimination. CBC News, 1 May.
Cacho LM (2014) The presumption of white innocence. American
Quarterly 66(4): 1085–1090.
Callisto (2018) Year 3 of combatting sexual assault, empowering
survivors, and advancing justice // 20172018 academic year
report. Report, Callisto, October.
Callisto (2020) Privacy Policy. Available at: https://www. (accessed 21 January 2021).
Cave S and Dihal K (2020) The whiteness of AI. Philosophy &
Technology 33: 685–703.
Childs M (2019) Building bots with empathy requires nding the
right balance. IBM Watson Blog. Available at: https://www.
requires-finding-the-right-balance/ (accessed 5 June 2020).
Cohen JE (2019) The biopolitical public domain. In: Cohen JE
(eds) Between Truth and Power: The Legal Constructions of
Informational Capitalism. New York: Oxford University
Press, 48–137.
Combahee River Collective (1982) A black feminist statement.
In: Hull GT, Bell-Scott P and Smith B (eds) All the
Women are White, all the Blacks are men, but Some of us
are Brave: Black womens studies. New York: Feminist
Press, 13–22.
Cottom TM (2020) Where platform capitalism and racial
capitalism meet: The sociology of race and racism in
the digital society. Sociology of Race and Ethnicity 6(4):
Couldry N and Mejias UA (2019) Data colonialism: Rethinking
Big Datas Relation to the contemporary subject. Television
& New Media 20(4): 336–349.
Crenshaw K (1991) Mapping the margins: Intersectionality, iden-
tity politics, and violence against women. Stanford Law Review
43(6): 1241–1299.
Crooks R (2021) Productive myopia: Racial organisations and
EdTech. Big Data & Society: [insert when available].
12 Big Data & Society
Daniels J (2013) Race and racism in internet studies: A review and
critique. New Media & Society 15(5): 695–719.
Daniels J (2016) The trouble with white (online) feminism. In: Noble
SU and Tynes BM (eds) The Intersectional Internet: Race, Class,
and Culture Online. New York: Peter Lang, 41–60.
Davis A (1981) Women, Race, and Class. New York: Vintage.
EEOC (U.S. Equal Employment Opportunity Commission) (n.d.)
Written testimony of Jess Ladd, Founder & CEO. Available at:
(accessed 1 February 2021).
Esqueda CW and Harrison LA (2005) The influence of gender role
stereotypes, the woman's race, and level of provocation and
resistance on domestic violence culpability attributions. Sex
Roles 53(11/12): 821–834.
Eubanks V (2018) Automating Inequality: How High-Tech Tools
Profile, Police, and Punish the Poor. New York: St Martin's Press.
Fabris C (2015) Callisto to offer new reporting system for survivors of
sexual assault. The Chronicle of Higher Education, 16 April.
Feldblum CR and Lipnic VA (2016) Select Task Force on the
Study of Harassment in the Workplace. Washington, DC: US
Equal Employment Opportunity Commission.
Ferguson RA and Hong GK (2012) The sexual and racial contra-
dictions of neoliberalism. Journal of Homosexuality 59(7):
Fuchs C (2012) Dallas Smythe today – The audience commodity, the
digital labour debate, Marxist political economy and critical
theory. Prolegomena to a digital labour theory of value. tripleC:
Cognition, Communication, Co-operation 10(2): 692–740.
Gesinsky L, Merola N, Dana A, et al. (2018) Transforming
Workplace Culture in the Era of #MeToo, #BlackLivesMatter,
and More. New York: Seyfarth Shaw.
Glaser A (2016) How simple software could help prevent sexual
assault. Wired, 9 December.
Green D (2020) Episode 27: How can technology reduce bias in
the workplace? Available at:
bias-in-the-workplace (accessed 6 February 2021).
Haggart B, Henne K and Tusikov N (2019) Information,
Technology, and Control in A Changing World: Shifting
Power Structures in the 21st Century. Basingstoke: Palgrave Macmillan.
Hanna A, Denton E, Smart A, et al. (2020) Towards a critical race
methodology in algorithmic fairness. Proceedings of the 2020
Conference on Fairness, Accountability, and Transparency:
Hall S (2017) The Fateful Triangle. Cambridge, MA: Harvard
University Press.
Harris CI (1993) Whiteness as property. Harvard Law Review
106(8): 1707–1791.
Harwell D (2021) ICE investigators used a private utility database
covering millions to pursue immigration violations. The
Washington Post, 27 February.
Henne K and Troshynski EI (2019) Intersectional criminologies
for the contemporary moment: Crucial questions of power,
praxis, and technologies of control. Critical Criminology
27(1): 55–71.
Jefferson BJ (2018) Predictable policing: Predictive crime
mapping and geographies of policing and race. Annals of the
American Association of Geographers 108(1): 1–16.
Katz Y (2020) Artificial Whiteness: Politics and Ideology in
Artificial Intelligence. New York: Columbia University Press.
Kim ME (2014) VAWA@20: The mainstreaming of the criminalization
critique: Reflections on VAWA 20 years later. CUNY Law Review
Footnote Forum (18): 52–58.
Leonelli S, Rappert B and Davies G (2017) Data shadows:
Knowledge, openness, and absence. Science, Technology, &
Human Values 42(2): 191–202.
Leong N (2013) Racial capitalism. Harvard Law Review 126(8):
Light B, Burgess J and Duguay S (2018) The walkthrough
method: An approach to the study of apps. New Media &
Society 20(3): 881–900.
Madianou M (2019) Technocolonialism: Digital innovation and
data practices in the humanitarian response to refugee crises.
Social Media + Society 5(3): 1–13.
Magalhães JC and Couldry N (2021) Giving by taking away: Big
tech, data colonialism, and the reconfiguration of social good.
International Journal of Communication 15: 343–362.
Marino MC (2014) The racial formation of chatbots. CLCWeb:
Comparative Literature and Culture 16(5): 1–11.
Market Research Future (2020) Global smart safety and security
device market research report. Market Research Future,
Mason CL and Magnet S (2012) Surveillance studies and violence
against women. Surveillance & Society 10(2): 105–118.
McGuire DL (2010) At the Dark End of the Street: Black Women,
Rape, and Resistance – A New History of the Civil Rights
Movement from Rosa Parks to the Rise of Black Power.
New York: Vintage.
McHugh J (2019) An online tool to catch workplace sexual preda-
tors. The Wall Street Journal, 10 January.
McPhee J and Dowden JP (2018) The constellation of factors
underlying Larry Nassar's abuse of athletes. Ropes & Gray,
Mead R (2018) Can an app track sexual predators in the theatre?
The New Yorker, 2 April.
Melamed J (2011) Represent and Destroy: Rationalizing Violence
in the New Racial Capitalism. Minneapolis: University of
Minnesota Press.
Mercer S (2019) Spot, report, stop: AI tackles age-old problem.
Counsel Magazine, 21 October.
Moore WL (2008) Reproducing Racism: White Space, Elite law
Schools, and Racial Inequality. Lanham, MD: Rowman & Littlefield.
Noble SU (2018) Algorithms of Oppression: How Search Engines
Reinforce Racism. New York: NYU Press.
Noble SU and Roberts ST (2019) Technological elites, the
meritocracy, and post-racial myths in Silicon Valley. In:
Mukherjee R, Banet-Weiser S and Gray H (eds) Racism
Postrace. Durham: Duke University Press, 113–134.
O'Brien SA (2017) She wants her rape reporting software to be
universal. CNN Money, 31 March.
Obar JA and Oeldorf-Hirsch A (2020) The biggest lie on the inter-
net: Ignoring the privacy policies and terms of service policies
of social networking services. Information, Communication &
Society 23(1): 128–147.
Phan T (2019) Amazon Echo and the aesthetics of whiteness.
Catalyst: Feminism, Theory, Technoscience 5(1): 1–37.
Phipps A (2019) Every woman knows a Weinstein: Political whiteness
and white woundedness in #MeToo and public feminisms around
sexual violence. Feminist Formations 31(2): 1–25.
Pollack A (2020) SA members introduce app to improve sexual
assault reporting process. The Daily Orange, 6 September.
Purvis DE and Blanco M (2020) Police sexual violence: Police
brutality, #MeToo, and masculinities. California Law Review
108: 1487–1529.
Ralph M and Singhal M (2019) Racial capitalism. Theory and
Society 48(6): 851–881.
Ramos Law (2017) Latest DaVita lawsuit echoes charges in John
Oliver's Last Week Tonight takedown, 18 May.
Ray V (2019) A theory of racialized organizations. American
Sociological Review 84(1): 26–53.
Reidenberg JR, Russell NC, Callen AJ, et al. (2014) Privacy harms
and the effectiveness of the notice and choice framework. I/S:
A Journal of Law and Policy for the Information Society 11(2):
Reynolds E (2019) How technology is tackling the stigma around
sexual assault. i-D Magazine, 15 March.
Richie BE (2012) Arrested Justice. New York: NYU Press.
Ritchie A (2017) Invisible No More: Police Violence Against Black
Women and Women of Color. Boston: Beacon Press.
Robinson C (1983) Black Marxism: The Making of the Black
Radical Tradition. Chapel Hill: University of North Carolina Press.
Rossie A, Tucker J and Patrick K (2018) Out of the Shadows: An
Analysis of Sexual Harassment Charges by Working Women.
Washington, DC: National Women's Law Center.
Sadowski J (2019) When data is capital: Datafication, accumulation,
and extraction. Big Data & Society 6(1): 1–12. 10.1177/
Schlesinger A, O'Hara KP and Taylor AS (2018) Let's talk about
race: Identity, chatbots, and AI. Proceedings of the 2018 CHI
Conference on Human Factors in Computing Systems: 1–14.
Shelby R (2021) Technology, sexual violence, and power-evasive
politics: Mapping the anti-violence sociotechnical imaginary.
Science, Technology, & Human Values: 1–30. doi: 10.1177/
Shelby R, Harb J and Henne K (2021) Whiteness in and through
data protection: An intersectional approach to anti-violence
apps and #MeToo bots. Internet Policy Review 10(4).
Sim K (2021) Respond and resolve: A critical feminist inquiry for
technologies of sexual governance. Global Perspectives 2(1):
25434. 10.1525/gp.2021.25434.
Sommerville DM (2004) Rape and Race in the Nineteenth-Century
South. Chapel Hill: University of North Carolina Press.
Spade D (2013) Intersectional resistance and law reform. Signs:
Journal of Women in Culture and Society 38(4): 1031–1055.
Spot (2018) Privacy Policy. Available at:
privacy (accessed 21 January 2021).
Spot (2020) Terms of Use. Available at:
terms-of-use (accessed 21 January 2021).
Spot (n.d.) Case Management. Available at:
case-management (accessed 30 September 2021).
Srnicek N (2017) The challenges of platform capitalism:
Understanding the logic of a new business model. Juncture
23(4): 254–257.
Thatcher J, O'Sullivan D, et al. (2016) Data colonialism through
accumulation by dispossession: New metaphors for daily
data. Environment and Planning D: Society and Space 34(6):
Timmermans S and Tavory I (2012) Theory construction in quali-
tative research: From grounded theory to abductive analysis.
Sociological Theory 30(3): 167–186.
Tuerkheimer D (2017) Incredible women: Sexual violence and the
credibility discount. University of Pennsylvania Law Review
166(1): 1–58.
Virdee S (2019) Racialized capitalism: An account of its
contested origins and consolidation. The Sociological Review
67(1): 3–27.