The Datafication of #MeToo: Whiteness,
Racial Capitalism, and Anti-Violence
Kathryn Henne, Renee Shelby
and Jenna Harb
This article illustrates how racial capitalism can enhance understandings of data, capital, and inequality through an in-depth
study of digital platforms used for intervening in gender-based violence. Specifically, we examine an emergent sociotechnical
strategy that uses software platforms and artificial intelligence (AI) chatbots to offer users emergency assistance,
education, and a means to report and build evidence against perpetrators. Our analysis details how two reporting
apps construct data to support institutionally legible narratives of violence, highlighting overlooked racialised dimensions
of the data capital generated through their use. We draw attention to how they reinforce property relations built on
extraction and ownership, capital accumulation that reinforces benefits derived through data property relations and ownership,
and the commodification of diversity and inclusion. Recognising that these patterns are not unique to anti-violence
apps, we reflect on how this example aids in understanding how racial capitalism becomes a constitutive element of digital
platforms, which more generally extract information from users, rely on complex financial partnerships, and often sustain
problematic relationships with the criminal legal system. We conclude with a discussion of how racial capitalism can
advance scholarship at the intersections of data and power.
Keywords: Racial capitalism, data capital, whiteness, AI, sexual violence, data justice
This article is part of a special theme on Data, Power and Racial Formations. To see a full list of all articles in this
special theme, please click here: https://journals.sagepub.com/page/bds/collections/dataandracialformations
Scholarship increasingly acknowledges how technology,
data, and the internet—far from being post-racial or colour-
blind (Daniels, 2016; Noble and Roberts, 2019)—enshrine
whiteness (Daniels, 2013). Whiteness, the racial grammar
that reinforces logics of white privilege and racial hierarch-
ies, maintains power and status associated with racial cat-
egories (Bonilla-Silva, 2012). Research has traced
economic, political, and social harms of these practices,
including how minoritised communities endure heightened
forms of surveillance, face challenges obtaining resources,
and experience more interactions with criminal legal
systems (see Bhatia, 2021; Browne, 2015; Eubanks, 2018;
Jefferson, 2018). The disproportionate effects of datafication
rely on variegated systems of capital, which are enabled
through the interplay of data, labour, knowledge, technical
expertise, infrastructure, and noninterventionist regulation.
Commodification, extraction, and exploitation—what some
refer to as data colonialism (Couldry and Mejias, 2019) or
technocolonialism (Madianou, 2019)—are not only central
to reaping value from data; they also maintain racialised
sources of privilege and status associated with accessing
and using data.

School of Regulation and Global Governance (RegNet), The Australian
National University, Canberra, Australia
Gender & Sexuality Studies Program, Northwestern University, Evanston, USA
Renee Shelby, Gender & Sexuality Studies Program, Northwestern
University, Evanston, USA
Original Research Article, Big Data & Society, © The Author(s) 2021
Creative Commons Non Commercial CC BY-NC: This article is distributed under the terms of the Creative Commons Attribution-NonCommercial
4.0 License (https://creativecommons.org/licenses/by-nc/4.0/), which permits non-commercial use, reproduction and
distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access page.
While scholars recognise data as a core feature of 21st-century
capitalism (Sadowski, 2019) and that capitalism
perpetuates gendered and racialised oppression (Melamed,
2011), research on big data still reﬂects little engagement
with critical scholarly traditions that explore entanglements
of capitalism and racial formation. This article offers an
analysis of how racial capitalism, which is central to
commodification for the purpose of “deriving social or economic
value” (Leong, 2013, 2152), operates in and through
data—even when racialised dimensions are not evident on
the surface. Our aim is to illustrate how racial capitalism,
inclusive of its distinct strands (Bhattacharyya, 2018;
Cottom, 2020; Leong, 2013; Ralph and Singhal, 2019;
Robinson, 1983; Virdee, 2019), offers an analytic for under-
standing data practices often misunderstood as neutral or as
colourblind. Akin to other analyses of racial capitalism,
this article sheds light on how systems of capital accumula-
tion are constitutive of racialised forms of “exploitation,
expropriation, and expulsion” (Bhattacharyya, 2018, 37),
which include—but can exceed—concerns of whiteness.
To illustrate, we examine two prominent digital platforms
used for intervening in sexual harassment and gender-based
violence. They are part of an emergent sociotechnical
strategy that aims to offer emergency assistance, education, and
a means to report and build evidence against perpetrators.
While other studies illustrate gendered processes articulated
through these anti-violence technologies (Bivens and
Hasinoff, 2018), our analysis points to overlooked racialised
contours. We highlight how racial capitalism implicates the
design and use of anti-violence apps, particularly the data
capital generation and accumulation that they enable. Of
course, these patterns are not unique to anti-violence apps:
platforms more generally extract information from users,
rely on complex ﬁnancial partnerships, and often sustain pro-
blematic relationships with criminal legal systems (Jefferson,
2018; Mason and Magnet, 2012; Srnicek, 2017). Distinct in
this example is the accumulation of economic value “directly
out of the act of doing social good”—a set of concerns that
have not yet received the same level of attention as wealth
generation by large for-proﬁt companies (Magalhães and
Couldry, 2021, 353).
In the pages that follow, we trace arguments that detail
how systems of racial capitalism contribute to the
commodiﬁcation of race for value generation, including
how whiteness often becomes embedded in these processes.
We highlight distinguishing features that play out in rela-
tion to digital modes of collecting data and the role of
data capital (Sadowski, 2019). We then contextualise the
rise of anti-violence apps and explain our methods before
discussing thematic concerns that emerge in relation to
two exemplars in the ﬁeld, Callisto, a conﬁdential reporting
platform whose matching algorithm connects victims of the
same perpetrator to identify repeat offenders, and Spot, the
ﬁrst cognitive interview artiﬁcial intelligence (AI) chatbot
for reporting sexual and other workplace misconduct.
Extending our earlier analyses of the data protection chal-
lenges of anti-violence apps (Shelby, 2021; Shelby et al.,
2021), this article focuses on how both Callisto and Spot
evince three key areas identiﬁed in the literature on racial
capitalism: racialised property relations built on extraction
and ownership, capital accumulation that reinforces beneﬁts
derived through property relations and ownership, and the
commodiﬁcation of diversity and inclusion promotion.
We conclude with a discussion of how these reporting
apps operate in tension with stated concerns of data justice.
Situating data and whiteness within
concerns of racial capitalism
Scholars have traced whiteness as a normative logic embed-
ded in and sustained through the technological tendency to
default to whiteness—that is, racial privilege remains unac-
knowledged and taken for granted (Schlesinger et al.,
2018). For instance, data-based devices, including
Amazon’s Alexa, and representations of technology in ﬁlm,
media, and marketing exhibit white attributes that range
from skin colour, facial features, voice, language, and cultural
sensibilities (Cave and Dihal, 2020; Phan, 2019).
Commonplace characteristics attributed to technology, such
as higher levels of professional status and “smartness”, also
reﬂect longstanding hierarchies of racial power: features
have been selectively associated with advantaged social posi-
tions along axes of class, gender, race, and ethnicity—
namely, middle-class, cisgender, masculine, white groups
(Cave and Dihal, 2020).
These attributes are not unique to digital technologies.
Earlier Critical Race Theory (CRT) analyses demonstrate
how whiteness not only has value as a property (as in pos-
session), but also has constitutively taken on characteristics
of property (as power relationships) (Harris, 1993). More
recent studies reveal whiteness has become an “entitlement
to social goods”, carrying “reputational value” and “the
power to exclude” (Bhandar, 2018, 7). Racial capitalism
examines these processes of differentiation: it, as outlined
by Gargi Bhattacharyya (2018, 103), is not “an
account of how capitalism treats different ‘racial groups’”,
but an approach that illuminates constitutive relationships
between racism and capitalism (see also Cottom, 2020;
Robinson, 1983). It enables investigation of how the
inequalities observed in relation to property “are not
givens or inevitabilities, but rather are conscious selections
regarding structuring of social relations” (Harris, 1993,
1730). In sum, racial capitalism captures more than how
whiteness may transfer privilege to those who can acquire
it; it scrutinises how racism operates through practices of
accumulation, simultaneously shaping the distribution of
beneﬁts and forms of exclusion.
Digital data bring distinct dimensions of property rela-
tions to the fore through cycles of capital accumulation.
User activity yields reporting data, metadata, transaction
data, technical data, and other types of information. Such
data are then used to shape content, develop and enhance
products, and expand digital consumer bases for conversion
into economic value (Sadowski, 2019). Their use also
aims to inﬂuence behaviour, purchasing patterns, latent
demand, and “the attention of specifically targeted
groups” (Fuchs, 2012, 705), which Andrejevic (2012, 76)
frames as a form of intentional “control and manipulation”.
Alienation, appropriation, and coercion shape users’ contributions
to generating value and creating data capital (Fuchs,
2012), as corporate entities often own and deﬁne how their
contributions are used and enjoyed. As their labour is
appropriated, the lack of informed consent over the full
range of data usages sustains the coercive nature of these arrangements.
Although nuanced, many reﬂections on data capital
often miss how capital is itself racialised. Race can
operate as capital, both in a social sense and in the
Marxian tradition. In terms of social capital, markets have
arisen with value placed on diversity, which, as Nancy
Leong (2013, 2181) explains, have enabled white people
and institutions to engage in ways that “enhance their
status by signalling cross-cultural credibility”. In terms of
Marxian capital, social processes are also important; they
facilitate the conversion of labour into commodities with
exchange value, not simply things that are useful.
According to Leong (2013), racial identity – regardless of
whether it reflects the privileges of whiteness – can
become a commodity when it is conceded via exchange.
Racial capital becomes pronounced through this process.
The commodiﬁcation of racial identity enables value gener-
ation and extraction vis-à-vis markets. As capital is often
invested and reinvested, white persons and institutions
(the dominant class for Marx) can receive additional and
cyclical beneﬁts from surplus value.
These observations about racial capital have direct rele-
vance for analyses of data capital. Cottom (2020) argues the
logics of racial capitalism are embedded across the plat-
forms enabling the internet economy, suggesting data
capital is similarly shaped by them. As most technologies
for collecting and processing data rest with companies,
“the forms of ‘knowing’ associated with big data mining
are available only to those with access to the machines,
the databases, and the algorithms”, which renders them as
“advantageously positioned compared to those without
such access” (Andrejevic, 2014: 1676). Modes of data
appropriation and alienation are notably uneven. Certain
sources of data are valued more than others; data are dispro-
portionately solicited and extracted from certain groups, while
others tend to be excluded (Sadowski, 2019). Looking at
these digitised processes through the lens of racial capitalism
sheds light on the racialised contours of the harms caused
through extraction and alienation—an observation that our
analysis of anti-violence apps illuminates.
Situating the rise of anti-violence apps in
response to gender-based violence
Reporting apps rely on data about gender-based violence to
assist institutional decision-making and responses. Though
functionality varies, they tend to feature algorithms that
accumulate data in information escrows and match assault
incidents to identify repeat offenders (e.g. Callisto, JDoe)
as well as AI chatbots that interview victims and produce
incident reports, aggregating data in dashboards to identify
organisational patterns of misconduct (e.g. Talk to Spot,
#NotMe, Botler AI). Although responding to—and some-
times explicitly appropriating—language from the #MeToo
movement, existing analyses of anti-violence apps ﬁnd
they reinforce gendered rape myths (Bivens and Hasinoff,
2018), strengthen surveillance structures (Mason and
Magnet, 2012), and support limited legal responses (Sim,
The broader market for anti-violence apps emerged after
2010, driven by advances in smart technology and a U.S.
government call for software innovators to leverage
mobile technology to address sexual violence. The
Obama administration’s 2011 Apps Against Abuse compe-
tition sparked venture capitalist interest in the possibilities
of anti-violence technology, bolstered by the emerging
“smart” personal safety industry and its promotion of
machine learning, AI, and big data analysis. The 2017
mainstream #MeToo movement increased scrutiny of the
failings of legal institutions and bureaucracies to protect
survivors, prompting renewed pleas for innovators to
develop accessible technologies to increase reporting
(Glaser, 2016). In light of high-proﬁle repeat offender
cases, such as Larry Nassar and Harvey Weinstein, apps
seemed an effective tool to produce evidence that would
counter authorities’ apathy towards survivors (McPhee
and Dowden, 2018). Numerous apps have since prolifer-
ated, particularly in higher-income countries, with the per-
sonal smart safety and security market ballooning to 2904
million USD in 2019 (Market Research Future, 2020).
They are now implemented in multi-national corporations,
such as Instacart, Kickstarter, and Zillow, and across uni-
versity campuses in North America.
The embrace of reporting apps offers a site for examining
the value they provide organisations in the post-#MeToo
economy—a viral landscape that critical observers argue
has been steered by white feminism and the desire for state
and institutional redress to personal injury (see Phipps,
2019). For example, as part of the 2018 report,
Transforming Workplace Culture in the Era of #MeToo,
#BlackLivesMatter, and More, Seyfarth, a large corporate
law ﬁrm, endorses reporting apps, including Callisto and
Talk to Spot, as a strategy for employers to “react quickly
or bear the brunt of public backlash and shareholder
disapproval” (Gesinsky et al., 2018, 2). The #NotMe reporting
app promotes its value as helping companies “reverse the
trends” of costly employee absences, lawsuits, and turnover.
AllVoices, an anonymous reporting platform for businesses,
indicates its “goal is to support companies in creating
healthy, safe feedback cultures that directly lead to more pro-
ductive and engaged employees, higher productivity, and
less turnover”. These framings underscore how apps are ima-
gined as projects that protect organisations, increase worker
productivity, and ensure a competitive edge.
The promise of these apps relies on a presumption that
technology can optimise a survivor’s report by collecting
and transforming personal accounts of violence into institu-
tionally legible data. Reporting, however, cannot be
abstracted from material conditions in which legibility is
understood. Feminist scholars have traced how racist for-
mations of sexism shape understandings of what gender
violence is and who is legible as a survivor (e.g.
Combahee River Collective, 1982; Davis, 1981; McGuire,
2010). For much of U.S. history, law has not provided
formal protection to all women, excluding women of
colour (Sommerville, 2004). Even though whiteness is
elastic and shifting, it remains a property of gender violence
legibility (Cacho, 2014). Reﬂecting on the ﬁgure of the sur-
vivor in the viral #MeToo movement, Alison Phipps (2019,
17) explains that whiteness is still central to shaping narra-
tives of fragility and “woundability”—that is, “the assumed
purity and vulnerability of white women”. People of
colour, as well as gay, lesbian, bisexual, transgender, and
queer (LGBTQ) people, are often not afforded this presump-
tion of innocence. Instead, they are more likely to be blamed
for their assaults than their white and heterosexual cisgender
counterparts (Esqueda and Harrison, 2005). Anti-violence
interventions that rely on hegemonic notions of gender
tacitly reify experiences of privileged survivors (Richie,
2012), reinforcing a false assumption that there is a universal
experience of gender-based violence (Crenshaw, 1991).
Despite these observations, reporting apps respond to the
problem of violence through a universalising design that
imagines algorithms, machine learning, and data analytics
as generating new institutional power relations untethered
to the inequitable politics of survivorship. Negating how
class, dis/ability, racism, and sexuality particularise reporting,
they present computational modalities as
viable solutions to the systemic inaction of authorities and
devaluing of survivors’accounts. Promises of cryptography-
based individual privacy, user-friendly and efﬁcient inter-
faces, and institutional dashboards render technical responses
and big data production as modes of increasing accountabil-
ity. These design features also subject users to processes of
data collection, accumulation, and circulation. In doing so,
the belief that digital reporting will present survivors’
accounts in ways that lead to the recognition and support
of victims, including the realisation of their right to bodily
integrity, encourages participation in what are extractive processes.
In addition to these extractive logics, racialised dynam-
ics that contribute to the inequities among survivors carry
over into processes of generating data capital. As Katz
(2020, 8) notes, AI is a technology that “works to ﬂexibly
serve a social order premised on white supremacy”.
Reporting apps do not escape this critique, even when
they mobilise diversity and inclusion discourses. They
both mobilise and obscure racialised experiences in dataﬁ-
cation, mirroring the recognised contradictions of ofﬂine
racial and gender politics (Ferguson and Hong, 2012).
These tools exemplify a longstanding trend: that contem-
porary capitalism “has depended as much on the production
and negotiation of difference as it has through enforcing
sameness, standardization, and homogenization”(Hall,
2017: 118–119; quoted in Virdee, 2019, 9). Their contradic-
tions are the outgrowth of constitutive relationships
between data, capital, and inequality as well as how racial
capitalism underscores the interplay between them.
Our argument draws upon a qualitative analysis of two
popular reporting apps—Callisto and Talk to Spot. They
rely on similar logics: accumulating data for decision-
making about gender-based violence and using collected
data to support institutional action. For Callisto, they are
to identify repeat offenders; for Talk to Spot, they are to
optimise HR responses. Project Callisto is a 501(c)3 organi-
sation whose mission is to “create technology to combat
sexual assault, empower survivors, and advance justice”
(https://www.mycallisto.org/about). Initially formed in
2011, Callisto reﬁned their reporting app based on survivor
feedback in 2020. This app allows university users to enter
a record into their matching algorithm to identify repeat
offenders, review resources and potential actions to con-
sider (e.g. report to police or talk to an attorney), and
create a record of the assault using a standardised form.
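To make the escrow-and-match mechanism described above concrete, the following is a minimal, hypothetical Python sketch of how such a matching escrow might operate: entries naming the same alleged perpetrator stay sealed until a second, distinct reporter names the same identifier. The class, threshold, and hashing choice are our illustrative assumptions, not Callisto's actual proprietary implementation.

```python
import hashlib
from collections import defaultdict

class MatchingEscrow:
    """Toy information escrow: entries stay sealed until two or more
    distinct reporters name the same alleged perpetrator."""

    def __init__(self):
        # Maps a hashed perpetrator identifier to the reporters who named it.
        self._entries = defaultdict(list)

    @staticmethod
    def _key(identifier: str) -> str:
        # Normalise and hash the identifier (e.g. an email address or
        # social media handle) so it is never stored in plain text.
        return hashlib.sha256(identifier.strip().lower().encode()).hexdigest()

    def submit(self, reporter_id: str, perpetrator_identifier: str) -> bool:
        """Record a sealed entry; return True if a match now exists."""
        key = self._key(perpetrator_identifier)
        self._entries[key].append(reporter_id)
        return len(set(self._entries[key])) >= 2

    def matched_reporters(self, perpetrator_identifier: str):
        """Reveal reporters only once the match threshold is met."""
        key = self._key(perpetrator_identifier)
        reporters = set(self._entries[key])
        return sorted(reporters) if len(reporters) >= 2 else []

escrow = MatchingEscrow()
escrow.submit("survivor-a", "@offender")      # sealed: no match yet
print(escrow.matched_reporters("@offender"))  # []
escrow.submit("survivor-b", "@offender")      # second report triggers a match
print(escrow.matched_reporters("@offender"))  # ['survivor-a', 'survivor-b']
```

The design choice illustrated here is the one the article's analysis turns on: the platform, not the survivor, holds the sealed entries and decides when and to whom they are released.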
In contrast, Talk to Spot is a chatbot meant to encourage
employees to speak out about discrimination, harassment,
bullying, misconduct, and other inappropriate behaviour.
Founded in 2017, Talk to Spot’s chatbot feeds into a
webapp where managers and HR professionals ﬁeld
reports. The system is designed for organisations and has
been adopted by various corporations, including ASOS,
Davita Dialysis, and Kickstarter.
While using two examples of anti-violence technologies
aids in illustrating how racial capitalist formations operate
in diverse ways, this approach did limit the scope of our
analysis to primarily U.S. contexts. We felt the beneﬁts out-
weighed the tradeoffs: the combined analysis of Callisto,
which is supported by a non-proﬁt organisation, and Talk
to Spot, which has a for-proﬁt model, revealed that both
tools generate data capital in ways that embed whiteness
and contribute to the commodiﬁcation of race for value.
For each app, we collected textual and qualitative data,
including the app websites and advertisements, legal and technical
reports, functionality assessments, and media coverage.
We located media and news coverage of Callisto and
Talk to Spot through targeted NexisUni and web searches
of technology publications and other mixed-media, includ-
ing Ted Talks and gender-based violence-themed podcasts.
In total, we reviewed 273 articles and 15 videos (including
podcasts) published between 2015 and 2021. Where possi-
ble, our research team conducted “walkthroughs” of the
apps, making step-by-step observations of the apps’ features
and flows to understand the expected functionality
and cultural messages conveyed in their design (Light
et al., 2018). As Talk to Spot is for organisations, we had
to review videos produced by the company to document
user and employer interfaces and the processes for sample
reports. Given the multimedia nature of the data, we open-
coded documents in vivo using Atlas.TI. In carrying out
data analysis, we also reﬂected on insights gleaned
through archival work that began in 2017 and autoethno-
graphic insights from being approached by anti-violence
technology developers and funding bodies.
The analysis presented here draws on materials that
contextualised big data practices in relation to racial capit-
alism and ﬁndings that reﬂected commonalities and gaps
across the text. We iteratively revisited our analytical
themes of commodiﬁcation, data extraction, ownership,
property relations, and instances of racialisation through
the lens of gendered racial capitalism to isolate patterns
and distinctions in and across the collected data
(Timmermans and Tavory, 2012). In line with qualitative
analyses of big data cultures and practices, this approach
examines “the institutionalised routines, habits, and knowl-
edge practices of the app publishers with respect to data”
(Albury et al., 2017, 2). It also enabled us to identify fea-
tures that aligned with insights about racial capitalism,
informing the structure and argument of the article. The
inherent opacity that comes with studying proprietary tech-
nology, however, prevented us from closely examining
algorithms and functionality—aspects often considered
trade secrets. We were, however, able to observe some
operative intricacies of these technologies and their pro-
cesses of generating and extracting value from data
capital. In the next sections, we examine how reporting
apps produce data capital that is not simply gendered but racialised.
Racial capitalism in and through the
dataﬁcation of gender violence reporting
Reporting apps mediate and control extractive data ﬂows
among victims, organisations, and third-party data brokers
in ways that obscure and mobilise racial dynamics. While
their dataﬁcation of violence abstracts information about
the assault and identities of survivors, data accumulation
by dispossession—a practice of producing privatised data
capital (Thatcher et al., 2016, 995)—facilitates the com-
modiﬁcation of social hierarchies as reporting data comes
to take on value. Our scrutiny of these apps reveals practices
of racial capitalism emerging in three ways, which are often
overlapping and interrelated: (1) through racialised property
relations built on data extraction and ownership; (2) through
capital accumulation that reinforces beneﬁts derived
through property relations and data ownership; and (3)
through the commodiﬁcation of diversity and inclusion pro-
motion. These techniques of racial capitalism operate as
mechanisms of coercive power in the data economy by
tapping into survivors’ desire for institutional recognition.
Property relations through coercive data extraction
and data ownership
Critical analyses of property relations and ownership high-
light how law bolsters the generation of racial capital and
inequality by securing property rights, enabling the enforce-
ment of contracts, and creating subjects (Bhandar, 2018). In
the United States, various legal frameworks articulate the
terms of data ownership, generally privileging the rights
of data controlling and processing entities. For example,
Callisto and Spot facilitate data brokers’ property interests by
enshrining their right to own, share, and retain personal
and technical data through a ‘notice and choice’ framework.
While this framework purports “to put individuals in charge
of the collection and use of their personal information”
(Reidenberg et al., 2014, 489), the consent request facili-
tates claims to data ownership that ensure use value for
companies with access.
With Spot, collected personal information “is controlled
by All Turtles Corporation, San Francisco, California”
(Spot, 2018); users “agree to permit All Turtles to collect,
access, process, and use a variety of information when
you use Spot for Teams”(Spot, 2020). To encourage
consent, both Callisto and Spot emphasise that their tech-
nologies increase disclosure: Callisto (2018) notes survi-
vors are six times more likely to report after visiting the
app, and Spot (n.d.) reports a 70–80% follow-up response
rate from employees who use the software. Although
appealing to the desire for institutional legibility and
accountability, neither app makes the processes and out-
comes of its services transparent. Obtaining users’
consent becomes explicitly coercive when apps are institu-
tionalised within an organisation. Consider, for instance,
when apps become the formal means of reporting harass-
ment and other workplace conduct to HR. They require sur-
vivors to agree to service terms that demand data about
themselves and acts of violence to enable the processing
and circulation of data. The generation of data is the key
source of these apps’value, yet the extraction of data
from users relies on opacity, obfuscation, and information
asymmetry. While survivors may manually enter informa-
tion through the digital interface, the apps can collect
much more, such as user analytic data that can be personal
and technical in nature, anonymised IP address, device,
operating system, and browser information, and statistics
regarding how the user engages with the website (e.g.
areas of the website visited, number of clicks, time spent
on each page).
Callisto encrypts entered information so it cannot be
shared with its staff and Legal Options Counsellors
unless permission is granted. Stored information regarding
an alleged offender includes “unique identiﬁers such as
your telephone number, email address, or social media
account information” (Callisto, 2020). Spot notes the only
mandatory personal data required is one’s email;
however, it enables the collection of “optional” information,
such as name, demographics (e.g. gender, occupation,
but not race), information about their employer, name of the
alleged offender, and details of the event. In this case, col-
lecting information about gender but not race amalgamates
diverse identities, creating consumable data subjects that
may align with the privacy norms of the internet economy
but not the goal of transformative gender violence justice.
By annexing race to the data shadows (Leonelli et al.,
2017), these practices whitewash subjects and their experi-
ences of violence, which are inextricably linked to inter-
locking systems of oppression. In short, reporting apps
actively racialise data as if subjects are the same—that is,
as normatively white.
The design of reporting apps, particularly how they
collect information from users, ensures they are the
primary generators and conduits of data ﬂows. In fact,
Callisto users must repeatedly engage with the app to
derive direct beneﬁt or opt out—for example, to be
matched with another victim of the same perpetrator or to
have a legal counsellor advise on the best recourse. If
one’s information matches with another user’s submission
in the system, “the entry will not be deleted until the
Legal Options Counsellor closes your case”, and survivors
must contact their assigned Counsellor if they wish to close
their case (Callisto, 2020). As such, it reﬂects one mechan-
ism through which reporting apps turn survivors into
objects of data collection rather than subjects with agency
and power over their data. As an organisational and enter-
prise technology, Spot encourages in-app communications
by offering multiple channels for HR and employees to communicate.
Although Spot’s storage terms differ slightly, it, like
Callisto, relies on the extraction of data, which can be
reused and reinvested—even in cases where users do not
beneﬁt directly. Further, reporting apps promise data will
be anonymised, encrypted, and conﬁdential to encourage
users to submit information. While these design features
prompt users to engage the apps in pursuit of an algorithmic
match or institutional accountability that may never occur,
Callisto and Spot beneﬁt from the exchange value of the
data capital they circulate in the broader digital economy.
These practices attest that reporting app privacy policies
and data governance are less about protecting users and
more about enshrining property relations that create more
capital using survivors’ data. This distinction is important.
The realities of how data become commodiﬁed tip its use
value in favour of apps, organisations, and data brokers, a
central feature of both data capital and systems of racial
capitalism (Leong, 2013; Sadowski, 2019). For instance,
in an interview for The New Yorker, Jessica Ladd,
Callisto’s founder, was asked “Will Callisto retain the
reports submitted by victims who may no longer be enrolled
in the service?” Ladd replied, “Don’t tell anybody, but yes”:
it is therefore important that our users’ entries are maintained
and stored over time to enable the matching
feature. This is why even after your account is terminated,
some or all of your data may still be stored on our platform.
The power of data property rights, explained here as a
public secret, is the ability to exercise both data accumulation
and dispossession, regardless of users’ desires.
Extracting data along these terms also ensures surplus
value through the sharing of data with third parties, the
direct beneﬁts of which are not for users. MailChimp,
Callisto’s email service provider, facilitates data sharing
with social media platforms: users with “the ‘Social
Profiles’ feature” turned on actually enable “additional information” to be exchanged and more widely circulated
(Callisto, 2020). Its use of Google Analytics has similar
implications, enabling the sharing of “information collected
by Google Analytics about your visits to our site and other websites”
(Callisto, 2020). While Google Analytics can provide report-
ing apps with information about how people interact with
their products in ways that make them more accessible to sur-
vivors, Google Analytics and the other big data technologies
they talk to through APIs are fundamentally marketing tech-
nologies. Non-proﬁts and apps might use them for “social
good”, but the data they circulate ﬂows into the big data
economy. These policies thus reﬂect distributive politics
characteristic of Big Tech markets, with information
6Big Data & Society
ﬂowing from users to reporting apps to corporate partners
who can translate and repurpose data into proﬁt. Callisto
and Spot share data with Facebook, Google, Mailchimp,
and Twitter; the Spot AI further integrates with business
apps, such as BambooHR, Namely, Slack, Workday, and
Zeneﬁts. As reporting apps and technology companies can
continuously draw out new value from data, these arrangements reflect asymmetrical power relationships around data.

Privacy Policies and Terms of Service operate as forms
of law that facilitate data brokers’ interests and privileges.
For example, they shift liability by detailing how parent
companies, despite owning and beneﬁting from data
capital, are not responsible for data leaks, identiﬁcation or
loss, or poor HR responses. In doing so, they elide culpabil-
ity for harms arising from authorised access to data or insti-
tutional failure. In addition, the acknowledged problem of
“TL;DR”(too long, didn’t read) in privacy notices (Obar
and Oeldorf-Hirsch, 2020) supports these coercive capital-
ist conditions. The resulting dispersion of data ownership
and responsibility puts the onus on the actual producers
of data (survivors) to understand, locate, and take control
of their reporting information. Scrutinising the distribution
of beneﬁts aids in unveiling the racialised dimensions of
these dynamics, as discussed in the next sections.
Data capital accumulation as reinforcing beneﬁts
derived from data ownership
Reporting apps and their partners’ advantageous position
within systems generating data capital reﬂect the overlap-
ping and negotiated nature of ownership. Capital accumula-
tion reinforces beneﬁts for data controlling and owning
entities in three interrelated ways: 1) through data capital
extracted from users, 2) through exchange and surplus
value gained by sharing and investing with other users
and consumers (like businesses and universities), and 3)
through data reinvestment in developing and improving
reporting websites, their services, and business decisions.
The racialised contours of these mechanisms are not neces-
sarily evident on the surface, because the naturalisation of
ownership masks how property relations operate in ways
in which “selected private interests are protected and
upheld” (Harris, 1993, 1730).
As reporting apps invest and reinvest the data capital
they own, they reap the beneﬁts enabled by the surplus
value of those investments—in this case, Callisto, Spot,
and their corporate collaborators. In a CNN Money
article, Ladd, Callisto’s founder, acknowledges seeing “a
clear market opportunity—but success won’t be monetary.
Profiting off of rape is something that makes people very
uncomfortable” (O’Brien, 2017). She describes how she
sees a need to shift how organisations investigate wrong-
doing, including police brutality and immigrant
deportations, envisioning Callisto as a catalyst (McHugh,
2019). To encourage this normative shift, Callisto’s code
is available on Github for “others to copy the code and
implement in their HR systems”. This push, however,
negates how human experiences—in this case, those dir-
ectly associated with harassment and violence—are
annexed and reorganised within the broader data
economy. In addition to whitewashing data under the pre-
tence of data privacy, Callisto and Spot do not provide
support for those who are historically illegible as victims
of violence, such as non-white, disabled, and LGBTQ
people. In open sourcing their matching algorithm,
Callisto is not simply sharing; it is seeking to institutionalise
digital reporting systems and create more value from data
about users’experiences of violence and from interactions
with these systems.
These practices reﬂect what Victor Ray (2019, 42) calls
racialised decoupling, which enables “organisations to
maintain legitimacy and appear race neutral or even pro-
gressive while doing little to intervene in pervasive patterns
of racial inequality”. As universities and corporations—
often unmarked spaces of whiteness (Moore, 2008)—insti-
tutionalise reporting apps, racialised decoupling is central to
how data capital beneﬁts reporting apps, these predomi-
nantly white organisations, and Big Tech entities. While
Callisto and Spot promote the sentiment that “your data
should work for you” (https://talktospot.com/index), they
sustain markets that rely on gender-based violence to
enhance data about users by promoting a “better data” paradigm. The business model requires more data to generate
better data, incentivising expansion that can amount to a
form of predatory inclusion (see Cottom, 2020). That is,
survivors of violence may be better positioned to report
through apps, but they do so on exploitative terms driven
by data dispossession, with no publicly available evidence
of effective institutional action based on their use.
Revisiting Ladd’s vision, these platforms can be used to
respond to other abuses, “including police brutality and
immigrant deportations”, which increases markets and
opportunities for the exchange of data capital (O’Brien,
2017). As people of colour disproportionately experience
gender-based violence, police violence, and immigrant
deportations, these systems of data extraction often target
them but do not challenge white racialised organisations
that are the beneﬁciaries of data’s use value.
Consider, for example, how media conglomerate
Thomson Reuters has sold access to their private database
containing “more than 400 million names, addresses, and
service records from more than 80 utility companies covering
all the staples of modern life” to U.S. Immigration and
Customs Enforcement and other law enforcement agencies
(Hartwell, 2021). This example illustrates how data are
shared across markets and in ways that can fortify surveil-
lance regimes, even if users had not intended their data to
be used for those purposes. In the case of reporting apps,
Henne et al. 7
violence enabled through gender inequity, structural racism,
and exclusionary citizenship can become sources from which
data markets can extract value—and to the benefit of institutions,
such as law enforcement, that propagate racialised sexual
violence (Purvis and Blanco, 2020; Ritchie, 2017). In
doing so, the “better data” paradigm does not necessarily
support justice and accountability; it does, however,
support a broader marketplace, which can reinscribe pro-
foundly racialised conﬁgurations of power between users,
data, corporate actors, and the state apparatus.
Representatives of these apps have made public state-
ments suggesting that they understand corporate and institu-
tional consumers of these apps might privilege the value of
data over direct services for survivors. According to Kate
L. Lazarov, Callisto’s project ofﬁcer, “recording informa-
tion about an assault rather than reporting it could be valu-
able for colleges” (Fabris, 2015). Value generation relies on
the circulation and exchange of standardised information
about survivors (and perpetrators if named in the reports)
within organisations, across digital survivor communities,
and to HR and legal systems. Organisational cultures,
however, can isolate employees and foreclose conversation
among co-workers about harassment and other workplace
misconduct. In the United States, it is already common to
“avoid the harasser, deny, or downplay the gravity of the
situation, or attempt to ignore, forget, or endure the behavior” (Feldblum and Lipnic, 2016, n.p.), with intersectional
analyses demonstrating that Black women are dispropor-
tionately represented in sexual harassment charges across
industries (e.g. Rossie et al., 2018). A report by the
National Women’s Law Center indicates 35.8% of women
who ﬁled charges alleging harassment also reported work-
place retaliation in which they risked a pay cut, job loss, dam-
aging future career options, and developing a reputation as a
‘troublemaker’ (Rossie et al., 2018). The fear of retaliation is
a central reason why employees do not report, especially
among those working in low-wage jobs. Although apps
purport to increase incident reports, they do not offer protec-
tion against retaliation. By “black boxing”submissions and
making their processing of complaints opaque, they can, in
fact, afﬁrm cultures of silence.
In contributing to practices at universities and in businesses, reporting apps can cultivate new digital survivor
subjects, which produce opportunities for potentially
biased decision-making through data analytics. As Crooks
(2021: n.p.) notes, “data work, like other bureaucratic pro-
cesses, is race work … by virtue of the interpretive and cal-
culative mechanisms which pre-emptively translate racial
projects into questions of computation”. By design,
reporting apps elide concerns of racism in gender-based vio-
lence. They confer beneﬁts that support the operation of
racialised organisations in their current form, not institu-
tional changes that counteract violence. For example,
Spot aims to create “workﬂows for sensitive issues”,
which has expanded to include other workplace problems
“from Covid-19 concerns to corrective action to bullying”.
The company offers a case management platform to “inves-
tigate reports and claims” and “get actionable insights on
the health of your organization”through reporting on
trends. Similarly, Callisto provides their university partners
with aggregated data reports twice a year meant to offer
better data for the existing ofﬂine reporting infrastructure
of survivor advocates, Title IX coordinators, and counsel-
lors (EEOC, n.d.). While they present these practices as
modes of catalysing organisational change by providing
“better data” to upstream stakeholders, their circulation of
whitewashed data reiﬁes modes of racialised decoupling.
There is also evidence that these apps create incentives
for predatory practices that sustain corporate power exer-
cised by Big Tech. Both Callisto and Spot use persistent
cookies, which remain on the reporter’s computer until
deleted. As explicitly surveillance-oriented tools, cookies
monitor, track, and log user activity, and “in ‘persistent’
form, could enable reidentiﬁcation of those users when
they returned to the site later on” (Cohen, 2019, 54). Data
on usage trends are collected, stored, and analysed for
several stated reasons, such as “to understand how our
users are using our website and platform, and which features or content appear to be more useful…”. Callisto’s
policy also notes, “We may use or disclose the personal information
we collect…For testing, research, analysis, and product
development, including to develop and improve our website”. These statements
reveal how reporting apps continually aim to generate value
from data capital through product optimisation, which
translates into value through “better data” products. In
doing so, organisations, including those selling big data
analytics such as Google Analytics and Hotjar, rather than
victims, stand to beneﬁt most. Anti-violence apps, particu-
larly those designed for corporate entities, enable signiﬁcant
reinvestment—and not in ways that aim to provide direct
beneﬁts for survivors. By prioritising commodiﬁcation and
capital accumulation, rather than data justice, the digital
infrastructure of racialised organisations may become more
robust; however, their discriminatory features can go unchecked.

Racial commodiﬁcation through the promotion of
diversity and inclusion
As other analyses of racial capitalism attest, non-white
racial identities can become commodities that predomi-
nantly white institutions instrumentalise as a form of
social capital (Leong, 2013; Ralph and Singhal, 2019). As
public relations are integral to capitalism (Madianou, 2019),
promotional materials often showcase representations of racia-
lised persons as “a tool for reaching these groups as well as a
sign of their potential, or virtual, market share” (Marino, 2014,
7). Reporting apps are no exception. Although the functional
mechanisms of Callisto and Spot may whitewash data, their
marketing mobilises “diversity and inclusion” rhetoric.
Callisto has worked to cultivate a reputation as an orga-
nisation that upholds and helps others enact diversity and
inclusion mandates. As it has recruited a more diverse lea-
dership over time, its visual promotions have evolved to
support the perception that their product is race conscious
and supports equity. In fact, some universities consider
Callisto part of efforts to “expand diversity and inclusion
initiatives that help marginalized groups on campus, includ-
ing victims of sexual assault and violence”(Pollack, 2020).
These visualisations centre representations of non-white
people to support advertising efforts, evincing a kind of
tokenism that “‘leverages undervalued identities’ and ‘pre-
serves commodiﬁed values of race by parading an excep-
tion’” (Leong, 2013, 2195). Callisto’s abstract illustrations
cultivate an image of racial inclusion by emphasising
bodily features stereotypically ascribed to Black and Brown
people, including curly hair and fuller lips (Figures 1 and 2).
In contrast to this racialised imagery, the design of
Callisto does not facilitate modes of attending to how
people of colour disproportionately experience violence.
Appealing to racial inclusion thus works in the service of
marketing the app, not in tailoring its services for survivors.
Carefully curated, this aesthetic conveys promotions that
may appeal to people of colour while still being palatable
for primarily white academic institutions and investors. Its
commodiﬁcation of race is of interest to institutions that
want to support an image of a diverse user base, doing so
without disrupting the racial status quo. Moreover, as scho-
lars of racial capitalism explain, these kinds of tokenising
practices facilitate a framing of white racialised universities
as “nonracist and culturally competent actor[s]” (Leong, 2013).

While Callisto actively employs racial difference to
support its market expansion, Spot’s advertising targets cor-
porate clients, which are primarily predominantly white
organisations that have had problems with harassment
(e.g. Ramos Law, 2017). Rather than explicitly confront
issues of age, disability, gender, or race, Spot depicts its
product through images of gender and racial ambiguity.
Its promotional videos utilise abstract animations of
humans that are coloured red and blue to signal a bifurca-
tion between women and men. In picking primary colours
rather than skin tones to differentiate these ﬁgures in their
advertisements, Spot promotes a colourblind approach that
does not ‘see’ race or ‘choose’ racial groups. It also presents
its AI as “completely unbiased” when addressing workplace
issues, “because Spot is a bot and not a human, it cannot
judge or assess you” (Mercer, 2019). The discourse that
the Spot AI “listens without judgment” (Childs, 2019)
invites users to produce data in a context somehow discon-
nected from the gendered, heteronormative, and racialised
environments that users navigate. This marketing, of
course, is misleading, as digital tools can encode gendered,
racial, and other prejudicial ideas and values into the system.

Figure 1. Callisto homepage (source: myCallisto.org 2021).

This race-neutral rhetoric carries over into Spot’s
unmarked AI chatbot. Spot CEO Jessica Collier has stated
explicitly that its “genderless, personality-neutral interviewer” is meant to “effectively eliminate reporting bias and the discomfort of telling another person (of another gender, race, or
background)” about traumatic experiences (Childs, 2019).
Though asserting neutrality, Spot is nonetheless a white
racialised AI: its seemingly benevolent design serves to
enhance the extraction of data through a performance of an
educated, literate, middle-class (white) persona without indi-
cation of a lived culture or history (see Cave and Dihal, 2020;
Phan, 2019). As such, Spot’s proactive framing of race as
unmarked “equate[s] the dominant cultural identity group
with a universalized vision of humanity” (Marino, 2014,
3). The racial capitalistic elements become clear when con-
sidering how these appeals to whiteness aim to enhance
capital accumulation and value.
Spot’s embrace of normative whiteness extends into its
approach to diversity. One of its turnkey business solutions
is a one-hour diversity, equity, and inclusion (DEI) training,
delivered to employees in “10-min episodes [that] are more
like Instagram stories than the cringe-worthy videos we all
love to hate”, covering topics primarily related to gender
discrimination, sexual orientation, and unwelcome
conduct (https://talktospot.com/training). Spot also offers
an additional one-hour training for supervisors,
including a module covering how to “debunk false reports”.
Such training seems antithetical to the app’s mission of
helping users share delicate information because it reinforces the
myth that false reporting is rampant by teaching supervisors
to recognise “credibility discounts”, which are always
already classed, gendered, and racialised (Tuerkheimer,
2017). Reinforcing claims that supervisors need to get at
the “truth”of the matter obscures the nature of the systemic
problem, which is not false reporting, but poor institutional
action and lack of accountability.
These tensions reﬂect how priorities of data extraction
and value generation often undermine the apps’ stated
goal of empowering users. Although different in their
approaches, both Callisto and Spot commodify race. Doing so supports their shared aim of enhancing their ability to produce
data on experiences of harassment and violence in the
pursuit of an expanded consumer base and access to new
markets. The users of reporting apps are further alienated
from the beneﬁts of capital produced from their labour.
The value derived from these apps is not simply eco-
nomic. Racial commodiﬁcation generates capital that has
social, cultural, and human dimensions. Capital that supports
favourable social status and reputation can also attract new
economic resources. For example, in 2018, Callisto received
the Skoll Award for Social Entrepreneurship, which awards
“social entrepreneurs who take the risks to right the most
unjust systems”with $1.25 million USD in grant funding
(Callisto, 2018). While grant funds support Callisto’s opera-
tion, such awards also provide social capital through reputa-
tion enhancement and extended networks. More broadly,
Figure 2. Callisto “about” page (source: myCallisto.org 2021).
prestigious awards, which often come from primarily white
foundation funders, can be leveraged in other fundraising
efforts in which having prior awards is helpful in winning
new awards and grants from other primarily white funders.
These practices illuminate additional dimensions of the
complex racialised dynamics in which Callisto and Spot
are situated, which exceed inequalities observed in relation
to reporting. Not only do apps fail to explicitly confront
power structures that uphold racial inequalities, but they
also maintain—and expand—ties to legal systems that per-
petuate harms disproportionately experienced by people of
colour. Take, for example, Spot’s co-founder, Julia Shaw, a
memory expert and psychologist who previously trained
police and military personnel to conduct interviews on emo-
tional events (Reynolds, 2019). She has repeatedly emphasised that the AI replicates police interview techniques that use
cognitive science to “focus on neutrality and gather factual,
detailed evidence from memories” (e.g. Byers, 2019;
Reynolds, 2019). In a Digital HR Leaders podcast, Shaw
describes, “just like a police investigation, if you have an
HR investigation, you want as high-quality information as
possible and you want the evidence that you’re using what-
ever it is to be high quality”(Green, 2020). Shaw falsely
presents police techniques as neutral and draws upon
claims to scientiﬁc authority, expertise, and the credibility
conferred upon police interviewing to position the AI as
offering more reliable evidence (as opposed to forcing or
leading responses to questioning). In short, Spot actively
brings carceral techniques into corporate HR processes.
Callisto is similarly entangled in civil and criminal legal
processes. Although its original purpose was to use its
matching algorithm to report, track, and hold accountable
repeat offenders through Title IX claims, Callisto has since
incorporated Legal Options Counsellors as the key resource
for users to navigate modes of recourse. These actors open
the door to more legal actions, including “getting a restraining order,” engaging a “criminal justice process or a civil
lawsuit,” or legally mediated “restorative justice”. While
Callisto suggests twelve possible avenues for survivors,
their choice to call the survivor liaison a “Legal Options
Counsellor” reinscribes legal forms of recourse as preferred,
negating how various legal processes—whether administra-
tive, civil, or criminal—fail to provide adequate support for
complainants who are people of colour, have a disability,
and/or are LGBTQ (e.g. Kim, 2014; Spade, 2013). Here again,
by prioritising value generation and supporting existing
systems, these apps work to uphold, rather than contest,
racialised power relations.

Conclusion

Through an analysis of reporting apps, this article illustrates
how emergent questions of data capital should be consid-
ered in relation to interlocking systems of domination and
oppression. It illuminates how an intersectional sensibility
might enhance the analysis of technological solutions to
criminological concerns speciﬁcally and data capital more
generally (see Henne and Troshynski, 2019). Our aim,
however, is not simply to demonstrate the possibilities of
intersectional analyses of data. Rather, it is to provide a
grounded example of how racial capitalism persists
through a gendered project, sustaining racialised regimes
of value extraction and exchange. In this case, as dataﬁca-
tion transforms social action, experience, and identity,
these apps support modes of accumulating and circulating
capital in ways that do not necessarily translate into direct
beneﬁt or use value for survivors who are the sources of
this valuable information.
Our ﬁndings raise serious questions about how reporting
platforms comprise a new front-line approach for addres-
sing gender-based violence. While Callisto and Spot are
promoted as empowerment tools for survivors, examining
them through the lens of racial capitalism aids in under-
standing how they fail to challenge systems of oppression
that contribute to gender-based violence and the inequitable
realities of reporting. In this case, apps can be used by orga-
nisations and institutions to avoid making meaningful
changes to structures that enable harassment and discrimin-
ation. This seeming paradox reﬂects larger societal shifts in
which the control of knowledge is fast becoming the foun-
dation upon which economic, legal, political, and social
inﬂuence is exercised (Haggart et al., 2019).
Having illustrated how racism operates even when
something appears race neutral, this case study is instructive
for future critical and feminist interventions. Data policies,
infrastructures, and practices are critical sites of justice. If
conceptualisations of data capital are to contribute to libera-
tory agendas, they must unveil and confront how racialised
property relations are inextricably linked to its generation.
Anonymising data does not offer sufﬁcient means of protec-
tion (see Shelby et al., 2021), as it does not prevent power-
ful actors from exchanging, investing, and ultimately
beneﬁting from mined data. In the context of gender-based
violence, the use of digital apps may promise better evi-
dence, but they also add new layers to pursuing justice
through formal remedies—which are already often opaque
and at times invisible to survivors. For such interventions
to effectively serve survivors, these data infrastructures
must centre the needs and rights of historically excluded communities.

Recognising that scholars concerned with big data, plat-
form governance, and surveillance are asking pressing
questions about how data extend and challenge understand-
ings of contemporary capitalism, we hope this article
serves as a cautionary example of why not to accept the
appearance of race neutrality when pursuing this critical
line of research. Data capital tends to beneﬁt predominantly
white institutions and organisations because it can continu-
ally be reinvested and redeployed in the service of actors
that have few to no incentives to ensure it has use or
exchange value for users. Through its focus on constitutive
relationships, racial capitalism offers an analytic that can
strengthen and expand analyses of whiteness in
data-intensive applications, supporting more robust scholar-
ship at the intersections of data and power. It illuminates
how racism is embedded in inequitable data relations and
how gendered data projects can advance racist processes—
in all, pointing to the need to better adapt intersectional ana-
lysis for data studies.

Acknowledgements
The authors would like to thank the anonymous reviewers for their
comments and ideas that contributed to this paper’s development.
Declaration of conﬂicting interests
The author(s) declared no potential conﬂicts of interest with
respect to the research, authorship, and/or publication of this article.

Funding
The author(s) disclosed receipt of the following ﬁnancial support
for the research, authorship, and/or publication of this article:
This work was supported by the Mellon/American Council of
Learned Societies Fellowship and the Australian National
University Futures Scheme.

ORCID iD
Renee Shelby https://orcid.org/0000-0003-4720-3844

Notes
1. At the time of writing, we could not identify any reporting apps
that collect user information on disability, ethnicity, race, or
sexual orientation (including #NotMe, AllVoices, Botler AI,
Callisto, Hello Cass, JDoe, SafeSport, Talk to Spot) or that
suggest supports tailored for historically illegible survivors.
2. While the reporting feature is public, the app’s matching
system is still limited to certain universities.
3. Callisto deﬁnes a “Legal Options Counsellor”as an attorney,
vetted by Callisto, who helps users navigate their options for recourse.
4. Adding racial categorisation would not correct the disposses-
sive qualities of data capital. Without recognising sociohistori-
cal context and power, it “re-inscribes violence on communities
that already experience structural violence”(Hanna et al., 2020,
502), becoming another axis for value generation.
5. At the time of writing, Spot has not shared the data it uses to
train its AI or what steps it takes to prevent replicating racia-
lised problems with ofﬂine reporting.

References
Albury K, Burgess J, Light B, et al. (2017) Data cultures of
mobile dating and hook-up apps: Emerging issues for
critical social science research. Big Data & Society 4(2).
Andrejevic M (2012) Exploitation in the data mine. In: Fuchs C,
Boersma K, Albrechtslund A and Sandoval M (eds) Internet
and Surveillance: The Challenges of web 2.0 and Social
media. London: Routledge, 71–88.
Andrejevic M (2014) The big data divide. International Journal of
Communication 8(1): 1673–1689.
Bhandar B (2018) Colonial Lives of Property: Law, Land, and
Racial Regimes of Ownership. Durham: Duke University Press.
Bhatia M (2021) Racial surveillance and the mental health
impacts of electronic monitoring on migrants. Race &
Class 62(3): 18–36.
Bhattacharyya G (2018) Rethinking Racial Capitalism: Questions
of Reproduction and Survival. Lanham, MD: Rowman & Littlefield.
Bivens R and Hasinoff AA (2018) Rape: Is there an app for that?
An empirical analysis of the features of anti-rape apps.
Information, Communication & Society 21(8): 1050–1067.
Bonilla-Silva E (2012) The invisible weight of whiteness: The
racial grammar of everyday life in contemporary America.
Ethnic and Racial Studies 35(2): 173–194.
Browne S (2015) Dark Matters: On the Surveillance of Blackness.
Durham: Duke University Press.
Byers A (2019) Yukon Human Rights Commission adopts online
tool to report harassment and discrimination. CBC News, 1 May.
Cacho LM (2014) The presumption of white innocence. American
Quarterly 66(4): 1085–1090.
Callisto (2018) Year 3 of combatting sexual assault, empowering
survivors, and advancing justice // 2017–2018 academic year
report. Report, Callisto, October.
Callisto (2020) Privacy policy. Available at: mycallisto.org/privacy-policy (accessed 21 January 2021).
Cave S and Dihal K (2020) The whiteness of AI. Philosophy &
Technology 33: 685–703.
Childs M (2019) Building bots with empathy requires ﬁnding the
right balance. IBM Watson Blog. Available at: https://www.
requires-ﬁnding-the-right-balance/ (accessed 5 June 2020).
Cohen JE (2019) The biopolitical public domain. In: Cohen JE
(eds) Between Truth and Power: The Legal Constructions of
Informational Capitalism. New York: Oxford University Press.
Combahee River Collective (1982) A black feminist statement.
In: Hull GT, Bell-Scott P and Smith B (eds) All the
Women are White, all the Blacks are men, but Some of us
are Brave: Black women’s studies. New York: Feminist Press.
Cottom TM (2020) Where platform capitalism and racial
capitalism meet: The sociology of race and racism in
the digital society. Sociology of Race and Ethnicity 6(4).
Couldry N and Mejias UA (2019) Data colonialism: Rethinking
big data’s relation to the contemporary subject. Television
& New Media 20(4): 336–349.
Crenshaw K (1991) Mapping the margins: Intersectionality, iden-
tity politics, and violence against women. Stanford Law Review 43(6): 1241–1299.
Crooks R (2021) Productive myopia: Racial organisations and
EdTech. Big Data & Society: [insert when available].
Daniels J (2013) Race and racism in internet studies: A review and
critique. New Media & Society 15(5): 695–719.
Daniels J (2016) The trouble with white (online) feminism. In: Noble
SU and Tynes BM (eds) The Intersectional Internet: Race, Class,
and Culture Online. New York: Peter Lang, 41–60.
Davis A (1981) Women, Race, and Class. New York: Vintage.
EEOC (U.S. Equal Employment Opportunity Commission) (n.d.)
Written testimony of Jess Ladd Founder & CEO. Available at:
(accessed 1 February 2021).
Esqueda CW and Harrison LA (2005) The inﬂuence of gender role
stereotypes, the woman’s Race, and level of provocation and
resistance on domestic violence culpability attributions. Sex
Roles 53(11/12): 821–834.
Eubanks V (2018) Automating Inequality: How High-Tech Tools
Profile, Police, and Punish the Poor. New York: St Martin’s Press.
Fabris C (2015) Callisto to offer new reporting system for survivors of
sexual assault. The Chronicle of Higher Education, 16 April.
Feldblum CR and Lipnic VA (2016) Select Task Force on the
Study of Harassment in the Workplace. Washington, DC: US
Equal Employment Opportunity Commission.
Ferguson RA and Hong GK (2012) The sexual and racial contra-
dictions of neoliberalism. Journal of Homosexuality 59(7).
Fuchs C (2012) Dallas Smythe today – The audience commodity, the digital labour debate, Marxist political economy and critical theory. Prolegomena to a digital labour theory of value. tripleC: Cognition, Communication, Co-operation 10(2): 692–740.
Gesinsky L, Merola N, Dana A, et al. (2018) Transforming Workplace Culture in the Era of #MeToo, #BlackLivesMatter, and More. New York: Seyfarth Shaw.
Glaser A (2016) How simple software could help prevent sexual
assault. Wired, 9 December.
Green D (2020) Episode 27: How can technology reduce bias in the workplace? Available at: https://www.myhrfuture.com/bias-in-the-workplace (accessed 6 February 2021).
Haggart B, Henne K and Tusikov N (2019) Information, Technology, and Control in a Changing World: Shifting Power Structures in the 21st Century. Basingstoke: Palgrave Macmillan.
Hall S (2017) The Fateful Triangle. Cambridge, MA: Harvard University Press.
Hanna A, Denton E, Smart A, et al. (2020) Towards a critical race methodology in algorithmic fairness. Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency: 501–512.
Harris CI (1993) Whiteness as property. Harvard Law Review 106(8): 1707–1791.
Harwell D (2021) ICE investigators used a private utility database covering millions to pursue immigration violations. The Washington Post, 27 February.
Henne K and Troshynski EI (2019) Intersectional criminologies for the contemporary moment: Crucial questions of power, praxis, and technologies of control. Critical Criminology 27.
Jefferson BJ (2018) Predictable policing: Predictive crime
mapping and geographies of policing and race. Annals of the
American Association of Geographers 108(1): 1–16.
Katz Y (2020) Artificial Whiteness: Politics and Ideology in Artificial Intelligence. New York: Columbia University Press.
Kim ME (2014) VAWA @ 20: The mainstreaming of the criminalization critique: Reflections on VAWA 20 years later. CUNY Law Review Footnote Forum (18): 52–58.
Leonelli S, Rappert B and Davies G (2017) Data shadows:
Knowledge, openness, and absence. Science, Technology, &
Human Values 42(2): 191–202.
Leong N (2013) Racial capitalism. Harvard Law Review 126(8): 2151–2226.
Light B, Burgess J and Duguay S (2018) The walkthrough
method: An approach to the study of apps. New Media &
Society 20(3): 881–900.
Madianou M (2019) Technocolonialism: Digital innovation and data practices in the humanitarian response to refugee crises. Social Media + Society 5(3): 1–13.
Magalhães JC and Couldry N (2021) Giving by taking away: Big tech, data colonialism, and the reconfiguration of social good. International Journal of Communication 15: 343–362.
Marino MC (2014) The racial formation of chatbots. CLCWeb: Comparative Literature and Culture 16(5): 1–11.
Market Research Future (2020) Global smart safety and security device market research report. Market Research Future.
Mason CL and Magnet S (2012) Surveillance studies and violence
against women. Surveillance & Society 10(2): 105–118.
McGuire DL (2010) At the Dark End of the Street: Black Women, Rape, and Resistance – A New History of the Civil Rights Movement from Rosa Parks to the Rise of Black Power. New York: Vintage.
McHugh J (2019) An online tool to catch workplace sexual preda-
tors. The Wall Street Journal, 10 January.
McPhee J and Dowden JP (2018) The constellation of factors underlying Larry Nassar's abuse of athletes. Ropes & Gray.
Mead R (2018) Can an app track sexual predators in the theatre?
The New Yorker, 2 April.
Melamed J (2011) Represent and Destroy: Rationalizing Violence in the New Racial Capitalism. Minneapolis: University of Minnesota Press.
Mercer S (2019) Spot, report, stop: AI tackles age-old problem.
Counsel Magazine, 21 October.
Moore WL (2008) Reproducing Racism: White Space, Elite Law Schools, and Racial Inequality. Lanham, MD: Rowman & Littlefield.
Noble SU (2018) Algorithms of Oppression: How Search Engines
Reinforce Racism. New York: NYU Press.
Noble SU and Roberts ST (2019) Technological elites, the meritocracy, and post-racial myths in Silicon Valley. In: Mukherjee R, Banet-Weiser S and Gray H (eds) Racism Postrace. Durham: Duke University Press, 113–134.
O’Brien SA (2017) She wants her rape reporting software to be
universal. CNN Money, 31 March.
Obar JA and Oeldorf-Hirsch A (2020) The biggest lie on the inter-
net: Ignoring the privacy policies and terms of service policies
of social networking services. Information, Communication &
Society 23(1): 128–147.
Phan T (2019) Amazon Echo and the aesthetics of whiteness.
Catalyst: Feminism, Theory, Technoscience 5(1): 1–37.
Henne et al. 13
Phipps A (2019) "Every woman knows a Weinstein": Political whiteness and white woundedness in #MeToo and public feminisms around sexual violence. Feminist Formations 31(2): 1–25.
Pollack A (2020) SA members introduce app to improve sexual
assault reporting process. The Daily Orange, 6 September.
Purvis DE and Blanco M (2020) Police sexual violence: Police brutality, #MeToo, and masculinities. California Law Review.
Ralph M and Singhal M (2019) Racial capitalism. Theory and
Society 48(6): 851–881.
Ramos Law (2017) Latest DaVita lawsuit echoes charges in John
Oliver’s Last Week Tonight takedown, 18 May.
Ray V (2019) A theory of racialized organizations. American
Sociological Review 84(1): 26–53.
Reidenberg JR, Russell NC, Callen AJ, et al. (2014) Privacy harms and the effectiveness of the notice and choice framework. I/S: A Journal of Law and Policy for the Information Society 11(2).
Reynolds E (2019) How technology is tackling the stigma around
sexual assault. i-D Magazine, 15 March.
Richie BE (2012) Arrested Justice. New York: NYU Press.
Ritchie A (2017) Invisible No More: Police Violence Against Black Women and Women of Color. Boston: Beacon Press.
Robinson C (1983) Black Marxism: The Making of the Black Radical Tradition. Chapel Hill: University of North Carolina Press.
Rossie A, Tucker J and Patrick K (2018) Out of the Shadows: An Analysis of Sexual Harassment Charges by Working Women. Washington, DC: National Women's Law Center.
Sadowski J (2019) When data is capital: Datafication, accumulation, and extraction. Big Data & Society 6(1): 1–12.
Schlesinger A, O'Hara KP and Taylor AS (2018) Let's talk about race: Identity, chatbots, and AI. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems: 1–14.
Shelby R (2021) Technology, sexual violence, and power-evasive politics: Mapping the anti-violence sociotechnical imaginary. Science, Technology, & Human Values: 1–30.
Shelby R, Harb J and Henne K (2021) Whiteness in and through
data protection: An intersectional approach to anti-violence
apps and #MeToo bots. Internet Policy Review 10(4).
Sim K (2021) Respond and resolve: A critical feminist inquiry for technologies of sexual governance. Global Perspectives 2(1).
Sommerville DM (2004) Rape and Race in the Nineteenth-Century
South. Chapel Hill: University of North Carolina Press.
Spade D (2013) Intersectional resistance and law reform. Signs:
Journal of Women in Culture and Society 38(4): 1031–1055.
Spot (n.d.) Privacy. Available at: https://talktospot.com/privacy (accessed 21 January 2021).
Spot (n.d.) Terms of Use. Available at: https://talktospot.com/terms-of-use (accessed 21 January 2021).
Spot (n.d.) Case Management. Available at: https://talktospot.com/
case-management (accessed 30 September 2021).
Srnicek N (2017) The challenges of platform capitalism: Understanding the logic of a new business model. Juncture.
Thatcher J, O'Sullivan D, et al. (2016) Data colonialism through accumulation by dispossession: New metaphors for daily data. Environment and Planning D: Society and Space 34(6): 990–1006.
Timmermans S and Tavory I (2012) Theory construction in quali-
tative research: From grounded theory to abductive analysis.
Sociological Theory 30(3): 167–186.
Tuerkheimer D (2017) Incredible women: Sexual violence and the credibility discount. University of Pennsylvania Law Review.
Virdee S (2019) Racialized capitalism: An account of its contested origins and consolidation. The Sociological Review.