Forthcoming, The Oxford Handbook of Digital Ethics
The Ethics of Sex Robots
Aksel B. Sterri & Brian D. Earp
What, if anything, is wrong with having sex with a robot? For the purposes of this chapter, the
authors will assume that sexbots are ‘mere’ machines that are reliably identifiable as such,
despite their human-like appearance and behaviour. Under these stipulations, sexbots
themselves can no more be harmed, morally speaking, than your dishwasher. However, there
may still be something wrong about the production, distribution, and use of such sexbots. In
this chapter, the authors examine whether sex with robots is intrinsically or instrumentally
wrong and critically assess different regulatory responses. They defend a harm reduction
approach to sexbot regulation, analogous to the approach that has been considered in other
areas, concerning, for example, drugs and sex work.
Keywords: sex robots; sexbots; artificial intelligence; sexual perversion; feminism; virtue
ethics; rape; harm reduction; prohibition; digisexual
This is the authors’ copy of an in-press book chapter. It may be cited as follows:
Sterri, A. B., & Earp, B. D. (in press). The ethics of sex robots. In C. Véliz (ed.), The Oxford
Handbook of Digital Ethics. Oxford: Oxford University Press. Available online ahead of
print at https://www.academia.edu/42871768/The_ethics_of_sex_robots
With rapid advances in robot technology and artificial intelligence (AI), sex robots, or sexbots
for short, are expected to become much more popular than the sex ‘dolls’ currently on the market.
Some proponents of sexbots, such as Neil McArthur (2017), argue that as long as no
one with moral standing is harmed, people should be free to do as they like in their own
bedroom (or indeed anywhere else). David Levy (2007) envisions a positive future for human–
robot sexual relations. According to Levy, sexbots will provide a healthy outlet for sexual
desires and will, over time, become fine-tuned to the preferences of their users. Nevertheless,
many worry that sexbots will not be so ethically innocent. There is something chilling about
the very idea of a sexual servant—even a technological one—that is always ready and willing
to satisfy one’s desires, whatsoever they may be. As Jeannie Suk Gersen (2019: 1798) states:
‘In the context of sex robots, the idea of forced servitude is especially disturbing because, for
many people, the sexual realm is a site of our deepest ideals and fears about personal autonomy
and personal relationships.’
Some ethical worries about sexbots relate to scenarios that may never materialize. In
the popular TV series Westworld, extreme forms of cruelty are on constant display, including
what is by all appearances the forcible rape of robots who are indistinguishable from humans
and who may well be sentient and even capable of forming and acting on life plans.
For the purposes of this chapter, we will set aside such concerns and assume that sexbots of the future
will be non-sentient and lack moral standing. They will be neither moral victims nor moral
agents; that is, we will assume that sexbots are ‘mere’ machines that are reliably identifiable
as such, despite their human-like appearance and behaviour. Under these stipulations, sexbots
themselves can no more be harmed, morally speaking, than your dishwasher. As we will
explore, however, there may still be something wrong about the production, distribution, or use
of such sexbots.
One kind of wrong could have to do with likely (or at least potential) effects on users.
Another has to do with such effects on society: our seemingly private actions may have wider
implications for the communities of which we are a part. There may also be intrinsic wrongs
associated with sexbot use, depending on one’s views on the nature of sex and sexual activity,
and the relationship between human beings and inanimate objects more generally.
Supposing it is wrong for human beings to engage sexually with robots, what then?
Should sexbots be banned? We will discuss arguments in favour of prohibition, as well as
arguments against it. We will also explore the possibility that sexbots could be, on balance,
good for society if appropriately regulated, and that human–robot sex could be ethical. We
explore the prospect of a harm reduction approach to sexbot regulation, analogous to the
approach that has been considered for drugs and sex work. Is there a way for regulators to steer
the development and use of sexbot technology in ways that would maximize its benefits and
minimize its harms, both for the user and for society at large?
What is a sex robot?
A sexbot is a robot that is designed and manufactured primarily for the purpose of being used
for sexual gratification. In this chapter, we consider sexbots with four particular features: (1)
they are shaped like human beings and have a human-like appearance; (2) they move in ways
that are relevant for sexual interactions of various kinds; (3) they are embodied (rather than,
say, holographic); and (4) they have AI of a sufficiently sophisticated level to allow the user to
communicate with the robot, at least in a rudimentary way.
The human shape sets sexbots
apart from sex toys, such as vibrators, fleshlights, dildos, etc. The ability to move and interact
with users sets them apart from sex dolls. And their embodiment sets them apart from virtual or holographic sex partners.
How likely, and how soon, are we to confront the existence of humanoid sexbots? The
company Realbotix, which for several years has produced sex dolls, suggests that such sexbots
are highly likely in the reasonably near future: ‘We’re working to create the next generation of
the well-known anatomically correct RealDolls, which we intend to blend with Artificial
Intelligence, Robotics, touch sensors, internal heaters, virtual and augmented reality
interfaces.’ These new bots, they continue, ‘will have an animated face synchronized with an
application that users can talk to and interact with [and] will have the ability to listen,
remember, and talk naturally, like a living person. They will have hyper-realistic features,
warmth, and sensors that react to touch’ (RealBotix 2020).
There are three types of sexbots that have received the most attention in the ethics
literature. The first type is a female sexbot with features that are stereotypical of mainstream
pornography. The sex dolls that are currently on the market mirror, as one author puts it, ‘a
strong Eurocentric male gaze’, such that ‘their design takes semantic coding and stereotyping
along hegemonic gender lines to the extreme, basically reducing “robot companions” to large-
breasted Barbie dolls with glimpses of artificial intelligence’ (Kubes 2019: 3). The prospect of
doubling down on this gaze and associated attitudes or expectations from mainstream
pornography (widely understood to be harmful and misogynistic) is one of the leading worries
that has been raised about sexbots. The concern is that these attitudes and expectations will be
reinforced in the minds of users, which, in addition to being potentially objectionable in its
own right, may also increase the likelihood of tangible harms to others, in particular women
(e.g. by encouraging users to think of women as objects to be sexually exploited, rather than
as equal partners in a sexual encounter). This worry is a central issue in both the Campaign
against Sex Robots, spearheaded by the anthropologist Kathleen Richardson (2015),
and in Robert Sparrow’s chapter, ‘How Robots Have Politics’, in this volume.
The second type of sexbot that has received the most attention is the type that is
designed to be ‘raped’ (i.e. to simulate resistance to sexual advances and a lack of consent).
The rudimentary sexbot Roxxxy, for example, comes with different personalities. One of them
is ‘Frigid Farrah’, which is described by the manufacturer as an entity that is ‘very reserved
and does not always like to engage in intimate activities’ (Brown 2019: 112). It has been argued
that Frigid Farrah could be used to satisfy rape fantasies—or to encourage such fantasies in the
first place—and that this would be harmful to the moral character of the people who engage in
such acts, while likely contributing to negative effects on society at large (Danaher 2017a).
The third type of sexbot is the type that is designed to look like a prepubescent child (a
child sexbot, for short). It seems that child sexbots, at least in a crude form, are currently under
production in Japan. While some, such as roboticist Ronald Arkin (quoted in Hill 2014), have
proposed that child sexbots could ethically be used in the treatment of paedophilia (and thus
reduce the likelihood of adult sexual contact with actual children), others are afraid that such
sexbots would only serve to sexualize children, leading to more, not less, abuse (Danaher
2019b; Strikwerda 2017).
These three types of sexbots, insofar as they are (or continue to be) developed, raise
important, and partially dissociable, ethical concerns. In their current form, the robots are not
much more sophisticated than dolls. However, given the rapid pace of technological
innovation, including in such areas as artificial speech, touch, and affect, it is reasonable to
expect that these robots will, within the next few decades, become much more human-like
along the four dimensions we listed above.
Value from sexbots
The prospect of human-like sexbots raises several worries. It might seem unthinkable that they
could add any real value to society. But as the hypothetical use of such sexbots in treating
paedophilia might suggest, there could potentially be some good associated with their
development in certain cases. The main good that some authors have suggested sexbots would
allow is that more people could have (at least some kind of) sex.
This is not a trivial issue.
On some views, sex is not a luxury; rather, it is a central ingredient in many people’s conception
of a good life (Jecker 2020). Sex of different kinds can facilitate extraordinary pleasure,
vulnerability, exploration, and positive bodily awareness; and under the right conditions, it may
have beneficial implications for physical and mental health (Brody 2010; Diamond and
Huebner 2012; Kashdan et al. 2014).
That many people who desire sex are unable to
experience it thus seems lamentable. Insofar as sexbots could provide at least some of the goods
of sex to those who would otherwise be unable to access such goods, their availability would
be a source of considerable value. On this view, we have at least a prima facie reason to favour
their production, distribution, and use (Danaher 2020).
We can distinguish different ways in which sexbots could be valuable. One sort of value
they might bring is in the treatment of recognized sexual dysfunctions or disadvantages; the
other is in the pursuit of enhanced sexual pleasure or enjoyment, potentially going beyond what
is currently achievable for most people.
Ezio Di Nucci (2017), Neil McArthur (2017), Jeannie Gersen (2019), David Levy
(2007, 2011), and Nancy Jecker (2020) discuss various ways in which sexbots could be used
for treatment. Di Nucci (2017) argues that some people, including those with certain
disabilities, may find it prohibitively difficult to find a willing sexual partner. Given that sex is
so important for most people, Di Nucci argues, sexbots could be used to fulfil the strong desire
for sex that someone in this situation may have. People with relevant disabilities are not the
only ones who may struggle to find a willing sexual partner, however.
As suggested by
McArthur (2017), in countries such as India and China, a skewed sex ratio (more
males than females born each year), coupled with powerful cultural expectations of monogamy,
may prevent a sizable number of men from finding a partner. The same difficulty may face
people in prisons or in the military, or others who, for various reasons, cannot find a willing
sexual partner for shorter or longer stretches of time (Gersen 2019). For people with rare sexual
preferences, sexbots might be the only available (or perhaps the only ethical) way of having
those preferences fulfilled. For these groups, sexbots may serve as a (potentially suboptimal)
replacement for sex with humans under the relevant circumstances.
Sexbots might also have value insofar as they could be used to help some people have
sex (or better sex) with humans. As McArthur (2017: 40) points out: ‘People with severe
anxiety surrounding performance or body image, an incidence of sexual trauma (such as rape
or incest), adults with limited or no experience, or people who have transitioned from one sex
to another, may find that their anxieties about sex inhibit their ability to form [intimate]
relationships.’ Sexbots could potentially help people in such situations to gain greater self-
acceptance and sexual skill or confidence, which, in turn, might help them to prepare for
positive human sexual relationships. In such a case, access to sexual learning via a sexbot
partner might be considered a form of psychosexual treatment.
What about people who fall more squarely within the (statistically) normal range of
sexual capacity and experience? Might access to sexbots help them to have better sexual
experiences and become even better sexual partners to their fellow human beings? Within the
context of relationships, sexbots could be designed to give feedback to improve sensitivity to
a partner’s likes and dislikes. Sexbots might also be used in a manner that is similar to, but
goes beyond, the way that many people currently use sex toys, namely, to spice up their sex
lives. For example, a sexbot might allow a couple to have a threesome without involving other
human partners (Gersen 2019). Sexbots could also help couples to deal with discrepancies in
sexual desire by facilitating a potentially more fulfilling alternative to masturbation. Indeed,
insofar as mismatched sexual desire between a couple is contributing to the likelihood of an
otherwise avoidable—and undesirable—breakup or divorce, consensual use of a sexbot by one
or both partners may reduce this likelihood (McArthur 2017).
In principle, therefore, sexbots could facilitate, supplement, treat, or possibly even
super-enhance ethical sex between human beings (which is almost universally recognized as a
good). That being said, sexbots might also in some cases replace sex between human beings.
So-called ‘digisexuals’ prefer robots as sex partners, favouring the ‘company’ of
inanimate objects over that of humans (McArthur and Twist 2017). Insofar as digisexuals can be considered to have
a sexual orientation towards robots, it would likely be valuable for them to be able to act on
their orientation with a desired object/partner (Danaher 2020).
Having canvassed some of the potential goods associated with sexbots, we can explore
in more detail the arguments that have been levied against them. We will begin with the
argument that there is something intrinsically wrong with having sex with a robot and go on to
discuss whether, even if it is not intrinsically wrong, it may still be harmful, on balance, to the
user (or others) to have sex with robots.
Can it be wrong to have sex with a robot? We have stipulated that sexbots will lack moral
standing, so it seems that they could not be wronged by such sex.
But it might still be true
that robots are among the things it is morally unacceptable for us to sexually desire, much less
engage with in a sexual manner.
The view that there are at least some kinds of entities that should not be ‘sexualized’—
even as objects of private, unacted-upon desire—has some precedent. Most people think of
children and non-human animals this way, for example. They think it is wrong even to want to
have sex with a child or non-human animal, even if one does not actually follow through on
this desire. One explanation for such a view is that certain desires are wrong in themselves by
virtue of being perverted (roughly, morally corrupted deviations from something’s original or natural function).
In Thomas Nagel’s (1969) account of sexual perversions, inspired by Jean-
Paul Sartre’s (1956) analysis of sexual desire in Being and Nothingness, Nagel includes sex
with inanimate objects alongside ‘intercourse with animals [and] infants’ as examples of such
perversions. ‘If the object is not [even] alive, the experience is’, in his view, ‘reduced entirely
to an awareness of one’s own sexual embodiment’ (Nagel 1969: 12, 14).
This reduction, he
thinks, is something that is morally troubling in its own right.
Objections to certain kinds of sexual activities on the basis of their alleged perversity
come in strong and weak variants. The strong version of Nagel’s account of sexual perversion
is what David Benatar (2002) calls the Significance View: ‘for sex to be morally acceptable, it
must be an expression of (romantic) love [signifying] feelings of affection that are
commensurate with the intimacy of sexual activity’. On this view, ‘a sexual union can be
acceptable only if it reflects the reciprocal love and affection of the parties to that union’
(Benatar 2002: 182). The advantage of such a view is that it is able to explain several
interconnected phenomena: (a) the special status that sex is often taken to have; (b) why sex
against a person’s will is almost universally regarded as an especially egregious violation of
their dignity; and (c) why it would be wrong to have sex with children or non-human animals
(i.e. it would be, among other things, non-reciprocal, where reciprocity requires a certain kind
of equal standing). This view also entails that sex with robots is outside the boundaries of what
is morally acceptable.
There are some difficulties with this view, however. One is that it seems to imply that
so-called ‘casual’ sex is morally impermissible on the grounds that it does not (usually) reflect
mutual love between the partners. But casual sex, if mutually desired and consented to, is a
practice that many people see as acceptable or even, under the right conditions, good. As
Sascha Settegast (2018: 388) argues: ‘Successfully engaging in a “lovely fling” may, on
occasion... result in feelings of general benevolence—that such an intimate, positive experience
of affirmation and community is in fact possible with otherwise strangers.’ The Significance
View also seems to exclude masturbation with sex toys, insofar as this can be considered a
kind of sex with an inanimate object. Taken together, these implications make the Significance
View seem much less plausible as an account of why certain forms of sex are permissible and
some are not.
The weak version of Nagel’s theory, which seems to reflect his own position, might be
called the Gradient View. This view does not seek to create a clear demarcation line between
permissible and non-permissible forms of sex. Rather, on this view, the degree to which a
sexual act is a perversion tracks just one aspect of its potential goodness: the less perverted an
act is, all else being equal, the better it is, and the more perverted, the worse. Insofar as sex
with a robot is a perversion, then, it could on this view be regarded as morally worse than
consensual sex with an adult human being without necessarily being impermissible. Indeed,
although perversions are morally worse than non-perversions according to this perspective,
they may nevertheless be better than nothing. This is Nagel’s (1969) own view:
Even if perverted sex is to that extent not so good as it might be, bad sex is generally
better than none at all. This should not be controversial: it seems to hold for other
important matters, like food, music, literature, and society. In the end, one must choose
from among the available alternatives, whether their availability depends on the
environment or on one’s own constitution. And the alternatives have to be fairly grim
before it becomes rational to opt for nothing.
Framed in terms of the value that sex provides for the person in question, the weak, ‘gradient’
version of the sexual perversion objection opens up the possibility that sex with robots may be
valuable for individuals who struggle to have ethical, much less good, sex with human beings.
Even if robot sex is suboptimal, that is, it may nevertheless be better than nothing. Moreover,
if sexbots could be used to help some people connect better with their human sexual partners,
such perversion would (on this view) be instrumental to the achievement of non-perverted, and
hence more valuable, forms of sex. Finally, although perverted sex is, in one respect, worse, it
could still perhaps be equal to or even better than some instances of non-perverted sex, on
balance, so long as other redeeming features are present which outweigh the badness of the
perversion, such as a much more pleasurable experience overall.
Assuming that the use of sexbots could, at least in some cases, survive the weaker (and
we think more plausible) version of the ‘perversion’ objection, what other moral objections
might be raised? One is the potential harm that such use would cause to the user’s moral character.
Harm to moral character
An argument against sexbots, raised by Robert Sparrow (2017), Litska Strikwerda (2017), and
John Danaher (2017a, 2017c), among others, is that the use of sexbots would causally
corrupt—or constitute corruption of—the user’s moral character. This concern applies
especially to the use of child sexbots and (other) robots used to simulate rape. Someone who
takes pleasure in ‘raping’ a robot that is made to represent a child or a non-consenting (typically
female) adult will, according to this view, show a profound deficiency in their moral
sensibilities, either because such pleasure entails a desire to harm and/or show serious
disrespect to children or women or because it reveals a lack of sensitivity to a highly likely
moral interpretation of one’s behaviour.
Now, one possibility is that such an argument rests on a conflation of symptom and
cause. ‘Even though such rape is really just the representation of rape’, Sparrow (2017: 473)
argues, taking pleasure in such an activity ‘reveals [the user] to be sexist, intemperate, and cruel’.
However, if the use of child sexbots or (other) rape robots only reveals that the user
has an impermissible attitude towards vulnerable human beings, it is not clear how such use
could, on its own, constitute a further wrong or harm. Rather, it would be a powerful source of
information about the deficient moral character the user already has.
A more plausible reading of this argument is that simulating rape with a robot would
not only reveal a bad moral character, but would also causally contribute to, or exacerbate, its
badness. Those who have sex with robots will likely imagine themselves to be partaking in the
real-world equivalent of such sex. Danaher (2017a) argues that this possibility is particularly
worrisome in the case of engagement with robots compared to other forms of virtual
engagement. A person could be so fully immersed in the simulated rape of a robot that the
difference between representation and reality disappears.
Alternatively, it could be argued that there is an important distinction between a
representation and that which is represented. Gert Gooskens (2010), for example, draws on the
work of Edmund Husserl to argue that there always will be a psychological distance between
what is represented and what is real. When one looks at a photograph of a friend, one is not
directly perceiving one’s friend: ‘it is only as if I do’ (Gooskens 2010: 66). The same is true,
Gooskens argues, for all representations.
Who is right? Do people desire the real-world equivalent of that which is represented
within a fantasy or is there a disconnection between the two? Consider so-called rape ‘play’
(i.e. mutually desired and consensual simulation of rape) as an analogy. Many women and men
have fantasies about being raped or raping others (Bivona and Critelli 2009; Critelli and
Bivona, 2008). Suppose, then, that two consenting adults engage in rape play. Some may argue
that such role play is inherently repugnant, given how many people suffer from sexual violence.
For instance, they may argue that the role play amounts to ‘making light’ of such suffering.
However, interpretive context matters. Without further evidence, it would be a leap to conclude
that such consenting adults in fact desire to rape another human being or to be raped, much less
that they endorse rape in real life.
Moreover, people who engage in BDSM
intentionally inflict pain on another human being as a part of a sexual encounter. However, it
does not follow from this that they desire sexualized torture in general (Airaksinen 2018). By
analogy, it seems plausible to conclude that desiring to engage in simulated rape with a robot
does not, on its own, entail that one desires to rape a sentient human being.
Nevertheless, it may still be true that to engage in certain acts requires an objectionable
lack of moral sensitivity. If one simulates rape with a child sexbot, for example, one must be
able to ‘bracket’ certain vital norms surrounding impermissible, indeed, repugnant, sex in our
society. The sheer capacity to do this with an entity that cannot give meta-consent (as may be
given for certain BDSM activities, for instance), or indeed any kind of consent, may indicate
that one has a defective moral character.
And actually engaging in the simulated rape may
reinforce the defect (which would be bad in itself), as well as erode behavioural inhibitions
towards such rape in real life (which would be instrumentally bad).
The notion of instrumental badness can be applied more generally. One worry is that the
widespread use of sexbots could cause indirect harms, including undermining important moral
norms that regulate sex between humans. By buying and using sexbots, users would uphold,
and potentially normalize, a social practice in which robots with certain characteristics were
produced and sold in stores or online. This social practice might be interpreted by others in
ways that users cannot control and may not intend (Sparrow 2017).
Such interpretations, in
turn, could change the norms regarding human sexual behaviour for the worse.
One possibility is that objectionable norms concerning attitudes towards, and ultimately
treatment of, women would be reinforced. Currently, sex dolls and rudimentary sexbots are
made to portray women as submissive, sexualized, and always ‘available’. Sinziana Gutiu
(2012: 2) argues that sexbots will thus contribute to the belief that women are sexual objects
who should be ready to satisfy men at any time: ‘By circumventing any need for consent, sex
robots eliminate the need for communication, mutual respect and compromise in the sexual
relationship.’ Gutiu is concerned that such non-consensual sex will ‘affect men’s ability to
identify and understand consent in sexual interactions with women’. In short, sexbots may
serve to undermine women’s ability to have their boundaries and desires readily recognized
and thus respected (see also Sparrow 2017: 471).
Florence Gildea and Kathleen Richardson (2017) similarly argue that sexbots
‘represent a step backwards by perpetuating objectification and hence blurring distinctions
between sex, masturbation and rape’. Richardson (2015: 292), who makes a parallel with
prostitution central to her opposition to sexbots, argues that ‘the development of sex robots will
further reinforce relations of power that do not recognize both parties as human subjects’.
In the context of child pornography, Suzanne Ost (2009: 105) argues that legalization
of such content is ‘objectifying children as sexual objects or resources for unbridled
exploitation, [which] may promote the reduction of children to this status’. The idea is that the
production, distribution, and use of sexbots (like certain pornographic content) may increase
the sexualization of children and women in society, making it more difficult for them to live
fulfilling lives. If children or women see themselves as (predominately) sexual objects, this
may also make them more vulnerable to harmful and unwanted sexual encounters.
It should be noted that these objections point to possible empirical effects that may or
may not materialize. These possibilities should nevertheless be taken very seriously. As it is, it
is an uphill battle to move societal norms around sex towards greater mutuality and respect,
focusing on the need to ensure consent
and to be sensitive to the needs and desires of one’s
sexual partner(s) (Danaher 2017c; Gildea and Richardson 2017). If sexbots of the sort that are
currently envisioned become widely available, this may create the impression that non-
consensual sex is a preference that one can be justified in satisfying. In fact, as Sparrow argues
in his chapter in this volume, there is a deeper bind. If sexbots are designed in a way that allows
simulated rape, this may signal or express that it is permissible to engage in non-consensual
sex. However, if female sexbots are designed to always simulate consent, this may express that
women should be perpetually ready to satisfy men’s sexual desires.
In either case, the
expressive significance of the sexbots is problematic, and there is an unknown further risk that
it will indirectly contribute to an increase in sexual violence (see also Sparrow 2017).
As can be seen, there are multiple potential risks associated with the development, production,
and use of humanoid sexbots. These risks have led some opponents of sexbots, such as
Richardson, to favour their outright prohibition. Sparrow also leans in this direction. Although
Sparrow (2017: 475) concedes that there may be some benefits involved in allowing sexbots,
he argues that, ‘given the importance of the goal of gender equality and of a rape-free society,
[...] it may well turn out to be the case that there is no ethical way to design sexbots modelled
on women’ (for further discussion, see Nyholm forthcoming). Others, such as Gutiu (2012),
Arkin (quoted in Hill 2014), and Danaher (2017c, 2019a), favour regulating the production and
distribution of sexbots. What might such regulation look like from the perspective of harm reduction?
Harm reduction as an ethical framework has been applied to several contested
commodities or activities, such as drugs and prostitution or sex work (Mac and Smith 2018;
Ritter and Cameron 2006; Single 1995; Bioethicists and Allied Professionals for Drug Policy
Reform 2021). The inspiration for such a framework can be traced to Jeremy Bentham’s
writings in the eighteenth and nineteenth centuries (see, e.g. Bentham 1789/2007). Bentham’s
progressive proposals regarding homosexuality arguably constituted an early instance of a
harm reduction approach (1785/1978). Instead of engaging with debates about whether
homosexuality might be intrinsically immoral, Bentham proposed that society should ask a
simple question: Does homosexual sex harm anyone and will a ban lead to less harm? If the
answer to either one of these questions is ‘No’, then people should, he believed, be free to
pursue whatever activities they found meaningful or which gave them pleasure. Could such an
approach be applied to the case of sexbots?
The harm reduction framework depends on three ideas: first, that prohibition is not
always the most efficient or effective means of reducing even genuinely harmful outcomes of
a given activity; second, that harms may not be linearly connected to the activity: more of the
activity does not necessarily mean more harm, and how harmful something is depends on how
it is used and what the alternatives are; and third, that we should think not only about the
potential harms of allowing some activity, but also about the potential harms of prohibition, while
giving due consideration to the potential benefits of the activity (in light of what could plausibly
be achieved through responsible regulation) (Earp et al. 2014; Single 1995).
Consider the example of people who are addicted to heroin. Research strongly suggests
that making the drug safely available to such individuals reduces harm, without increasing
overall drug use (Ferri et al. 2011; Strang et al. 2015). This phenomenon could be explained
by safer consumption practices, a reduced perceived need for criminal behaviour, and the
freeing up of time, resources, and mental capacity that individuals can use to get their lives in
order, which in turn addresses the underlying cause(s) of their addiction (Ferri et al. 2011; Strang
et al. 2015). In this case, avoiding outright prohibition appears to allow the turning of many
regulatory dials to promote benefits and discourage harms (Earp et al. 2021). The way that
many countries have regulated tobacco and alcohol may be another example of the success of this
approach. Taxes and restrictions on sale and consumption seem to be effective in preventing
harmful overuse (Bader et al. 2011; Wagenaar et al. 2009).
Let us now apply these lessons to sexbots. First, as we have seen, there are several
potential harms and benefits that the availability and use of sexbots may bring about. There is
thus at least a prima facie reason to try to reap the benefits at the smallest cost possible. Sexbots
for treatment and remedial purposes, or to help humans improve their sexual practices with
each other, are probably the least controversial. If sexbots could be designed to allow the
practicing of positive and respectful sexual habits, they might increase the likelihood of people
achieving a more optimal version of the good(s) of sex. Insofar as there are other ways of
practicing such habits, however, that do not carry the risks associated with sexbots, then those
other means should likely be preferred.
The harms involved in the production, distribution, and use of sexbots are to a large
degree contingent on how people interpret the practice, which depends on how these activities
are situated in public awareness. To encourage the plausibly beneficial, and discourage
the plausibly harmful, uses of sexbots, it may be prudent to ban certain kinds of advertisements
or in other ways regulate how sexbots are designed and sold, instead of banning their
production altogether. In addition, certain kinds of sexbots could be banned, such as those that
allow the simulation of rape, while therapeutic sexbots could perhaps be permitted, available only
to those who are appropriately licensed in their use.
When we think about how to alleviate the harms involved in allowing sexbots, we also
need to keep in mind what they could replace. Levy (2011) has argued that sexbots could
replace human prostitution and perhaps also pornography, which might be an improvement
over the current situation. It seems plausible that the putatively harmful effects of representing
women as enjoying acts ‘that are objectifying, degrading, or even physically injurious’, as
Anne W. Eaton (2007: 682) argues, are more likely or more profound when the women in
question are humans, rather than robots (as long as one can tell them apart).
What about child sexbots? Arkin is among the few who have argued that a harm
reduction model should be adopted for such sexbots. He is interested in how child sexbots
might be used to prevent the abuse of real human children. ‘Child-like robots could’, Arkin
suggests, ‘be used for pedophiles the way methadone is used to treat drug addicts’ (quoted in Hill 2014).
Whether sexbots would in fact replace prostitution, pornography, and child sexual abuse
is, of course, an unresolved empirical question. Richardson (2015: 291) expresses doubts that
‘sex robots could help to reduce prostitution... If an artificial substitute [in the form of
pornography] reduced the need to buy sex, there would be a reduction in prostitution, but no
such correlation is found.’ Acknowledging such empirical uncertainty, Danaher (2017c) has
proposed that an empirical approach be taken, in which experiments could be conducted, in
certain contexts, to produce the knowledge needed to proceed. Gutiu (2012: 22), who is critical
of sexbots, similarly raises the possibility of further research: ‘[s]ex robots could provide an
opportunity to understand and correct violent and demeaning attitudes towards women. If
regulated, sex robots could provide a means of researching the roots of sex and intimacy for
both genders, demystifying female sexuality, and addressing the roots of women’s oppression.’
Even if sexbots never become sentient, we have good reasons to be concerned with their
production, distribution, and use. Our seemingly private activities have social meanings that
we do not necessarily intend, but which can be harmful to others. Sex can be both beautiful
and valuable, and ugly or profoundly harmful. We therefore need strong ethical norms to
guide human sexual behaviour, regardless of the existence of sexbots. Interaction with new
technologies could plausibly improve our sexual relationships, or make things worse (see
Nyholm et al. forthcoming, for a theoretical overview). In this chapter, we have explored some
ways in which a harm reduction framework may have the potential to bring about the alleged
benefits of sexbots with a minimum of associated harms. But whatever approach is taken, the
goal should be to ensure that our relationships with robots conduce to, rather than detract from,
the equitable flourishing of our fellow human beings.
Airaksinen, Timo (2018), ‘The Language of Pain: A Philosophical Study of BDSM’, SAGE
Open 8(2), doi: https://doi.org/10.1177/2158244018771730.
Bader, Pearl, Boisclair, David, and Ferrence, Roberta (2011), ‘Effects of Tobacco Taxation
and Pricing on Smoking Behavior in High Risk Populations: A Knowledge Synthesis’,
International Journal of Environmental Research and Public Health 8(11), 4118–4139,
Benatar, David (2002), ‘Two Views of Sexual Ethics: Promiscuity, Pedophilia, and Rape’,
Public Affairs Quarterly 16(3), 191–201, http://www.jstor.org/stable/40441324,
accessed 12 August 2021.
Bentham, Jeremy (1785/1978), ‘Offences against One’s Self’, Journal of Homosexuality 3(4),
389–406, doi: https://doi.org/10.1300/J082v03n04_07.
Bentham, Jeremy (1789/2007), An Introduction to the Principles of Morals and Legislation
(Mineola, NY: Dover Publications).
Bivona, Jenny M., and Critelli, Joseph W. (2009), ‘The Nature of Women’s Rape Fantasies:
An Analysis of Prevalence, Frequency, and Contents’, Journal of Sex Research 46(1),
33–45, doi: https://doi.org/10.1080/00224490802624406.
Bloom, Paul, and Harris, Sam (2018), ‘It’s Westworld. What’s Wrong with Cruelty to
Robots?’, New York Times, 23 April,
morality.html, accessed 12 August 2021.
Brody, Stuart (2010), ‘The Relative Health Benefits of Different Sexual Activities’, Journal of
Sexual Medicine 7(4, Part 1), 1336–1361, doi: https://doi.org/10.1111/j.1743-
Brown, Christina (2019), ‘Sex Robots, Representation, and the Female Experience’, The
American Papers 37, 105–118,
age=105, accessed 1 September 2021.
Bryson, Joanna J. (2010), ‘Robots Should Be Slaves’, in Yorick Wilks, ed., Close
Engagements with Artificial Companions: Key Social, Psychological, Ethical and
Design Issues (Amsterdam: John Benjamins Publishing Co.), 63–74.
Critelli, Joseph W., and Bivona, Jenny M. (2008), ‘Women’s Erotic Rape Fantasies: An
Evaluation of Theory and Research’, Journal of Sex Research 45(1), 57–70, doi:
Danaher, John (2017a), ‘Robotic Rape and Robotic Child Sexual Abuse: Should They Be
Criminalised?’, Criminal Law and Philosophy 11(1), 71–95, doi:
Danaher, John (2017b), ‘Should We Be Thinking about Sex Robots?’, in John Danaher and
Neil McArthur, eds, Robot Sex: Social and Ethical Implications
(Cambridge, MA: MIT Press), 3–14.
Danaher, John (2017c), ‘The Symbolic-Consequences Argument in the Sex Robot Debate’, in
John Danaher and Neil McArthur, eds, Robot Sex: Social and Ethical Implications
(Cambridge, MA: MIT Press), 103–131.
Danaher, John (2019a), ‘Building Better Sex Robots: Lessons from Feminist Pornography’, in
Yuefang Zhou and Martin H. Fischer, eds, AI Love You: Developments in Human–
Robot Intimate Relationships (Cham: Springer International Publishing), 133–147.
Danaher, John (2019b), ‘Regulating Child Sex Robots: Restriction or Experimentation?’,
Medical Law Review 27(4), 553–575, doi: https://doi.org/10.1093/medlaw/fwz002.
Danaher, John (2019c), ‘Welcoming Robots into the Moral Circle: A Defence of Ethical
Behaviourism’, Science and Engineering Ethics, doi: https://doi.org/10.1007/s11948-019-00119-x.
Danaher, John (2020), ‘Sexuality’, in Markus Dubber, Frank Pasquale, and Sunit Das, eds,
Oxford Handbook of the Ethics of Artificial Intelligence (Oxford: Oxford University Press).
Danaher, John, Earp, Brian D., and Sandberg, Anders (2017), ‘Should We Campaign against
Sex Robots?’, in John Danaher and Neil McArthur, eds, Robot Sex: Social and Ethical
Implications (Cambridge, MA: MIT Press), 47–71.
Di Nucci, Ezio (2017), ‘Sex Robots and the Rights of the Disabled’, in John Danaher and Neil
McArthur, eds, Robot Sex: Social and Ethical Implications (Cambridge, MA: MIT Press).
Diamond, Lisa M., and Huebner, David M. (2012), ‘Is Good Sex Good for You? Rethinking
Sexuality and Health’, Social and Personality Psychology Compass 6(1), 54–69, doi:
Earp, Brian D., and Grunt-Mejer, Katarzyna (2020), ‘Robots and Sexual Ethics’, Journal of
Medical Ethics 47(1), 1–2, doi: https://doi.org/10.1136/medethics-2020-107153.
Earp, Brian D., Lewis, Jonathan, Hart, Carl L., and Bioethicists and Allied Professionals for
Drug Policy Reform (2021), ‘Racial Justice Requires Ending the War on Drugs’,
American Journal of Bioethics 21(4), 4–19, doi:
Earp, Brian D., and Moen, Ole Martin (2016), ‘Paying for Sex—Only for People with
Disabilities?’, Journal of Medical Ethics 42(1), 54–56, doi:
Earp, Brian D., and Savulescu, Julian (2020), Love Drugs: The Chemical Future of
Relationships (Stanford, CA: Stanford University Press).
Earp, Brian D., Sandberg, Anders, and Savulescu, Julian (2014), ‘Brave New Love: The Threat of
High-Tech “Conversion” Therapy and the Bio-Oppression of Sexual Minorities’,
AJOB Neuroscience 5(1), 4–12, doi: https://doi.org/10.1080/21507740.2013.863242.
Eaton, Anne W. (2007), ‘A Sensible Antiporn Feminism’, Ethics 117, 674–715, doi:
Ferri, Marica, Davoli, Marina, and Perucci, Carlo A. (2011), ‘Heroin Maintenance for Chronic
Heroin‐Dependent Individuals’, Cochrane Database of Systematic Reviews 12(1), 1–
55, doi: https://doi.org/10.1002/14651858.CD003410.pub4.
Fischel, Joseph J. (2019), Screw Consent: A Better Politics of Sexual Justice (Berkeley, CA:
University of California Press).
Frank, Lily, and Nyholm, Sven (2017), ‘Robot Sex and Consent: Is Consent to Sex between a
Robot and a Human Conceivable, Possible, and Desirable?’, Artificial Intelligence and
Law 25(3), 305–323, doi: https://doi.org/10.1007/s10506-017-9212-y.
Gersen, Jeannie S. (2019), ‘Sex Lex Machina and Artificial Intelligence’, Columbia Law
Review 119(7), 1793–1810, doi: https://doi.org/10.2307/26810849.
Gildea, Florence, and Richardson, Kathleen (2017), ‘Sex Robots: Why We Should Be
Concerned’, Campaign against Sex Robots, 12 May,
accessed 1 September 2021.
Gooskens, Gert (2010), ‘The Ethical Status of Virtual Actions’, Ethical Perspectives 17, 59–
78, doi: https://doi.org/10.2143/EP.17.1.2046957.
Gupta, Kristina (2011), ‘“Screw Health”: Representations of Sex as a Health-Promoting
Activity in Medical and Popular Literature’, Journal of Medical Humanities 32(2), 127–
140, doi: https://doi.org/10.1007/s10912-010-9129-x.
Gupta, Kristina (2016), ‘Why Not a Mannequin?: Questioning the Need to Draw Boundaries
around Love When Considering the Ethics of “Love-Altering” Technologies’,
Philosophy, Psychiatry, & Psychology 23(2), 97–100, doi:
Gupta, Kristina (forthcoming), ‘What Is a Sexual Act?’, in Brian D. Earp, C. Chambers, and
L. Watson, eds, Routledge Handbook on Philosophy of Sex and Sexuality (London: Routledge).
Gutiu, Sinziana (2012), ‘Sex Robots and Roboticization of Consent’, Paper presented at the
We Robot Conference, http://robots.law.miami.edu/wp-
content/uploads/2012/01/Gutiu-Roboticization_of_Consent.pdf, accessed 12 August
Hill, Kashmir (2014), ‘Are Child Sex-Robots Inevitable?’, Forbes,
inevitable/#519f9bb7e460, accessed 1 September 2021.
Jecker, Nancy S. (2020), ‘Nothing to Be Ashamed of: Sex Robots for Older Adults with
Disabilities’, Journal of Medical Ethics 47(1), 26–32, doi:
Kashdan, Todd B., Adams, Leah M., Farmer, Antonina S., Ferssizidis, Patty, McKnight,
Patrick E., and Nezlek, John B. (2014), ‘Sexual Healing: Daily Diary Investigation of
the Benefits of Intimate and Pleasurable Sexual Activity in Socially Anxious Adults’,
Archives of Sexual Behavior 43(7), 1417–1429, doi: https://doi.org/10.1007/s10508-
Kubes, Tanja (2019), ‘New Materialist Perspectives on Sex Robots. A Feminist
Dystopia/Utopia?’, Social Sciences 8(224), 1–14, doi:
Kukla, Quill R. (2018), ‘That’s What She Said: The Language of Sexual Negotiation’, Ethics
129(1), 70–97, doi: https://doi.org/10.1086/698733.
Langton, Rae (1993), ‘Speech Acts and Unspeakable Acts’, Philosophy & Public Affairs 22(4),
293–330, http://www.jstor.org/stable/2265469, accessed 12 August 2021.
Levy, David (2007), Love and Sex with Robots: The Evolution of Human–Robot Relationships
(New York: Harper Collins).
Levy, David (2011), ‘Robot Ethics: The Ethical and Social Implications of Robotics’, in Patrick
Lin, Keith Abney, and George A. Bekey, eds, Robot Ethics: The Ethical and Social
Implications of Robotics (Cambridge, MA: MIT Press), 223–231.
Luck, Morgan (2009), ‘The Gamer’s Dilemma: An Analysis of the Arguments for the Moral
Distinction between Virtual Murder and Virtual Paedophilia’, Ethics and Information
Technology 11(1), 31–36, doi: https://doi.org/10.1007/s10676-008-9168-4.
Mac, Juno, and Smith, Molly E. (2018), Revolting Prostitutes: The Fight for Sex Workers’
Rights (London: Verso).
MacKinnon, Catharine A. (1995), Only Words (London: Harper Collins).
McArthur, Neil (2017), ‘The Case for Sexbots’, in John Danaher and Neil McArthur, eds,
Robot Sex: Social and Ethical Implications (Cambridge, MA: MIT Press), 31–45.
Moen, Ole Martin, and Sterri, Aksel B. (2018), ‘Pedophilia and Computer-Generated Child
Pornography’, in David Boonin, ed., The Palgrave Handbook of Philosophy and Public
Policy (Cham: Springer International Publishing), 369–381.
Nagel, Thomas (1969), ‘Sexual Perversion’, Journal of Philosophy 66(1), 5–17, doi:
Nyholm, Sven (forthcoming), ‘The Ethics of Humanoid Sex Robots’, in Brian D. Earp, C.
Chambers, and L. Watson, eds, Routledge Handbook on Philosophy of Sex and
Sexuality (London: Routledge).
Nyholm, Sven, Danaher, John, and Earp, Brian D. (forthcoming), ‘The Technological Future
of Love’, in N. McKeever et al., eds, Love: Past, Present, and Future.
Ost, Suzanne (2009), Child Pornography and Sexual Grooming: Legal and Societal Responses
(Cambridge: Cambridge University Press).
Petersen, Steve (2017), ‘Is It Good for Them Too? Ethical Concern for the Sexbots’, in John
Danaher and Neil McArthur, eds, Robot Sex: Social and Ethical Implications
(Cambridge, MA: MIT Press), 155–171.
RealBotix (2020), ‘FAQ’, RealBotix, https://realbotix.com/FAQ#q1, accessed 12 August 2021.
Richardson, Kathleen (2015), ‘The Asymmetrical “Relationship”: Parallels between
Prostitution and the Development of Sex Robots’, SIGCAS Computers & Society
45(3), 290–293, doi: https://doi.org/10.1145/2874239.2874281.
Ritter, Alison, and Cameron, Jacqui (2006), ‘A Review of the Efficacy and Effectiveness of
Harm Reduction Strategies for Alcohol, Tobacco and Illicit Drugs’, Drug and Alcohol
Review 25(6), 611–624, doi: https://doi.org/10.1080/09595230600944529.
Sartre, Jean-Paul (1956), Being and Nothingness, trans. H. E. Barnes (New York: Philosophical Library).
Settegast, Sascha (2018), ‘Prostitution and the Good of Sex’, Social Theory and Practice 44(3),
377–403, http://www.jstor.org/stable/44987073, accessed 12 August 2021.
Single, Eric (1995), ‘Defining Harm Reduction’, Drug and Alcohol Review 14(3), 287–290,
Sparrow, Robert (2004), ‘The Turing Triage Test’, Ethics and Information Technology 6(4),
203–213, doi: https://doi.org/10.1007/s10676-004-6491-2.
Sparrow, Robert (2017), ‘Robots, Rape, and Representation’, International Journal of Social
Robotics 9(4), 465–477, doi: https://doi.org/10.2139/ssrn.2044797.
Strang, John, Groshkova, Teodora, Uchtenhagen, Ambros et al. (2015), ‘Heroin on Trial:
Systematic Review and Meta-Analysis of Randomised Trials of Diamorphine-
Prescribing as Treatment for Refractory Heroin Addiction’, British Journal of
Psychiatry 207(1), 5–14, doi: https://doi.org/10.1192/bjp.bp.114.149195.
Strikwerda, Litska (2017), ‘Legal and Moral Implications of Child Sex Robots’, in John
Danaher and Neil McArthur, eds, Robot Sex: Social and Ethical Implications
(Cambridge, MA: MIT Press), 133–151.
Wagenaar, Alexander C., Salois, Matthew J., and Komro, Kelli A. (2009), ‘Effects of Beverage
Alcohol Price and Tax Levels on Drinking: A Meta-Analysis of 1003 Estimates from
112 Studies’, Addiction 104(2), 179–190, doi: https://doi.org/10.1111/j.1360-
For contributions to the debate about sentient robots, see John Danaher (2019c), Robert
Sparrow (2004), Paul Bloom and Sam Harris (2018), Steve Petersen (2017), Joanna J. Bryson
(2010), and Lily Frank and Sven Nyholm (2017).
The third feature, its embodiment in physical space, distinguishes this definition from
Danaher’s (2017b: 4–5).
For a detailed analysis of this campaign, see Danaher et al. (2017).
It is philosophically controversial what sorts of activities ‘count’ as sex (as opposed to, say,
masturbation) (see, e.g. Gupta forthcoming). We will not attempt to settle that controversy
here. For those who are committed to the view that there can be no such thing as ‘sex’ with a
robot, perhaps because they view sex as something that is only possible between members of
an organic species, the word ‘sexlike’ or ‘pseudosex’ can be substituted where relevant in what follows.
For a critical discussion of discourses around sex as a health-promoting activity, see Kristina
Gupta (2011). For an argument that sex robots might promote unhealthy sexual practices or
attitudes, see Brian D. Earp and Katarzyna Grunt-Mejer (2020).
Of course, many people with disabilities of all kinds do not struggle to find willing sexual
partners, unless one defines ‘disability’ in this context in a circular way to mean ‘sexually
disabled’ (which refers not to the physical lack of capacity to have a certain kind of sexual
experience, but rather the inability to find a willing sexual partner). For further discussion, see
Brian D. Earp and Ole Martin Moen (2016).
Against this view, someone might argue that certain objects, such as a work of art, or even
some machines, can be worthy of respect. Certainly, if someone were to destroy one of the
most astonishing expressions of human ingenuity, it would not seem sufficient, from a moral
perspective, for the person to respond by saying it was ‘just a machine’. There are various ways
one could attempt to make sense of this moral intuition (e.g. if disrespect is shown in this case,
it is to the inventor or builder of the machine rather than to the machine itself); however, we
will not pursue this digression further.
We are indebted to McArthur (2017) for the references to Thomas Nagel’s and David
Benatar’s writings in this area.
For an alternative view, according to which even a non-mobile mannequin may (at least
potentially) be appropriately considered an object of love or desire, see Gupta (2016).
Similarly, Danaher (2017a: 86) argues that sex with such robots ‘directly expresses a
deficient moral character because [the user desires] real-world rape and child sexual abuse’.
Gert Gooskens (2010), Morgan Luck (2009), Ole Martin Moen and Aksel B. Sterri (2018),
and Jeannie Gersen (2019), are among those who have questioned whether engaging in (or
desiring) behaviour with a robot—behaviour that would be harmful or wrongful if done to a
human—shows that one desires to do the same thing to an actual sentient being.
The reference to Gooskens is indebted to Danaher (2017a).
An objection to this view is that rape play, even in the special context of BDSM, may
involve a kind of endorsement of dominant behaviour that could have negative externalities.
Bondage and Discipline/Dominance and Submission/Sadism and Masochism.
There may be some narrow (theoretical) exceptions. Moen and Sterri (2018: 375), for
instance, argue that if paedophiles—who have not chosen to be sexually attracted to children—
were to use child sexbots to satisfy their desires, this could be virtuous insofar as such use was
necessary and intended to avoid the pursuit of sexual contact with real human children.
Whether there is any substantial proportion of paedophiles for whom such use would be
(practically) necessary and intended in this way is an empirical question to which we do not
know the answer.
For sentences or utterances in a language, the relevant community is the set of speakers
fluent in that language. For actions or social practices, the relevant group is less clearly defined,
but ‘the community’ or ‘society’ are the most plausible reference points.
For discussions of the silencing effects of pornography, see Rae Langton (1993) and
Catharine A. MacKinnon (1995).
However, see the recent work of Quill R. Kukla (2018) and Joseph J. Fischel (2019), both
of whom argue that there is much more to good sex than the ‘mere’ securing of consent, such
that new models focused on more positive exploration and expression of desire are needed.
Sparrow’s bind does not demonstrate that sex robots will necessarily be harmful. It shows
that both options, robots on which rape can be simulated and robots on which such simulation
is impossible, may have harmful effects.
However, as Moen and Sterri (2018) point out, such licensing practices could also reduce
uptake and therefore be less effective in reducing harms than a less regulated market. We are
grateful for being reminded of this possibility by an anonymous reviewer.
Moen and Sterri (2018) suggest that regulation of child sexbots might be pursued in such
a way that paedophiles would have to agree to take part in carefully controlled experiments to
gain access to them. In that way, it might be possible to learn how to deal more effectively with
paedophilia, and perhaps gain insights into (disordered) sexual desire more generally. Danaher
(2019b), by contrast, favours outright prohibition of child sexbots over an experimental, harm
reduction approach in which sexbots are made available in a regulated way. Danaher’s reasons
are that science is too flawed to establish a consensus about the effects of using such sexbots
and that it is implausible that paedophiles will offend less when they have access to them. We
do not aim to settle this debate here.