Robotics 2018, 7, 70; doi:10.3390/robotics7040070 www.mdpi.com/journal/robotics
Article
Sexbots: Customizing Them to Suit Us versus an
Ethical Duty to Created Sentient Beings to Minimize
Suffering
Robin Mackenzie *
Law School, University of Kent, Canterbury CT2 7NS, UK
* Correspondence: r.mackenzie@kent.ac.uk; Tel.: +44-1227-764000
Received: 15 August 2018; Accepted: 6 November 2018; Published: 11 November 2018
Abstract: Sex robot scholarship typically focuses on customizable simulacra, lacking sentience and
self-awareness but able to simulate and stimulate human affection. This paper argues that future
humans will want more: sex robots customized to possess sentience and self-awareness [henceforth,
sexbots], capable of mutuality in sexual and intimate relationships. Adopting a transdisciplinary
critical methodology focused on the legal, ethical and design implications of sexbots, it assesses
implications of sexbots’ non-mammalian subjectivity, balancing designed-in autonomy and control,
decision-making capacity and consent, sexual preferences and desire, legal and moral status,
vulnerability and contrasts between mammalian and non-mammalian moral decision-making. It
explores theoretical, ethical, and pragmatic aspects of the tensions involved in creating sentient
beings for utilitarian purposes, concluding that sexbots, customized manufactured humanlike
entities with the capacity for thought and suffering, have a consequent claim to be considered moral
and legal persons, and may become the first conscious robots. Customizing sexbots thus exemplifies
many profound ethical, legal and design issues. The contradictions inherent in their inconsistent
ethical and legal status as both manufactured things and sentient, self-aware entities who are
customized to be our intimate partners augment existing human/animal scholars’ call for a new
theoretical framework which supersedes current person/thing dichotomies governing human
responsibilities to other sentient beings. The paper concludes that the ethical limits and legal
implications of customizable humanlike robots must be addressed urgently, proposing a duty on
humans as creators to safeguard the interests and minimize the suffering of created sentient beings
before technological advances pre-empt this possibility.
Keywords: sexbots; sex robots; robot consciousness; roboethics; robot law; biomimetics;
neurorobotics; robot customization; robot rights; ethical duty to created sentient beings; ethics of
pain and suffering in robotics
1. Introduction
By the time there are no laws to prevent human-robot marriages, robots will be patient, kind, protective, loving,
trusting, truthful, persevering, respectful, uncomplaining, complimentary, pleasant to talk to and sharing your
sense of humor. And the robots of the future will not be jealous, boastful, arrogant, rude, self-seeking or easily
angered, unless of course you want them to be.
So when the law allows it, why not marry a robot?
David Levy, Why not marry a robot? ([1] at p. 13).
Customization of products fuels consumer choice, a fundamental engine of commercial
transactions. While ethical deliberation over the morality of capitalism, the environmental impact of
commercial production and the social cost of particular products is ongoing, customization itself, the
fine-tuning of products to suit customers’ purposes, is usually regarded as ethically non-problematic.
The customization of robots to fulfill utilitarian purposes is similarly framed as an inevitable part of
the new era, where mass production in a global marketplace gives way to mass customization [2].
Yet this trend is more complex where social robots are concerned. Since these robots may be
customized to engage humans socially, they are capable of being anthropomorphized and perceived
as animate persons. Moreover, they may be customized to perform tasks usually carried out by
humans, and so foster unemployment. Ethical debate on the impact of robots as products has tended
to focus rather on their social impact: how far they will replace humans in the workplace, in decision-making and in social interaction.
Social robotics is an important and expanding field where the ethics, practicalities, and social
consequences of questions over how far and in what contexts social robots will prove acceptable to
the public are assessed [3]. Research into how social robots are perceived in clinical situations such
as robot assisted therapy for children with developmental disorders [4], how experienced and future
professionals in such contexts perceive social robots [5] and how social robot body proportions affect
their being perceived as gendered [6] represent exemplary thoughtful interrogations of the potential
roles and ethical dimensions of social robots’ role in clinical contexts. There is increasing acceptance
that social robots may be customized to prove useful in education [7] as well as in clinical situations [8].
Carebots, or care robots, are currently promulgated as an essential source of psychological, medical
and welfare support for the increasing proportions of elderly and infirm citizens as demographic
changes progress [9,10]. Yet unease has been expressed by scholars over whether social robots
involved in situations associated with human intimacy may impoverish humans’ intimate
relationships with one another where interactions are mediated by or replaced by robots [11,12].
Many such concerns center on the prediction that if humans treat things as people, there is a
greater likelihood that humans will treat other humans as things or replace humans with things.
Social robotics scholars have also raised concerns that humans are likely to anthropomorphize social
robots such as carebots, non-sentient robots used to provide caring services for the elderly and infirm,
often as a replacement for human carers [9,10,13,14]. Some consider that carebots signal an
abandonment of vulnerable cared-for people to soul-destroying social isolation where the minimum
services needed to sustain life are provided by machines, replacing human contact, while others
argue that these vulnerable cared-for people could be unethically deceived into imagining that carebots
provide real empathy and caring [15,16].
Sex robots as a subtype of social robots amplify these fears that humans will not only prefer to
interact with robots rather than each other, but also be potentially encouraged to mistreat one
another [11,12,17]. Sex robots customized to accord with male fantasies have been condemned by
campaigners against sex robots as liable to demean human sexual relationships, promote anti-women
values and devalue human/human intimacy [11]. David Levy’s words quoted above
are typical of sex robot scholarship, which positions them as customizable simulacra, lacking
sentience and self-awareness but able to simulate and stimulate human affection. Some
customizations of sex robots, such as those created to resemble children, or to display resistance
against simulated rape and torture, have been condemned as ethically questionable [17–19].
This paper seeks to move the debate on sex robots beyond the focus on robots lacking sentience
and self-awareness and their social consequences. It is based on the assumption that humans’ wish
for intimate companions is fundamental to our being, particularly given that the quality of humans’
relationships, especially romantic relationships, is the most powerful predictor of our health and
subjective well-being [20]. Hence our wish for sexual partners who are sensitive to our emotional and
sexual needs makes sense. As not all of us find satisfying relationships with human partners, the
future is likely to hold ongoing niche markets for sentient, self-aware sex robots [henceforth, sexbots]
and non-sentient animated robotic sex dolls with a range of capacities. Those of us seeking intimacy,
in the absence of human alternatives, with partners whom we can regard as more or less equal will
prefer sentient, self-aware sexbots. Men and women will be able to purchase male and female sexbots
who are at ease with a wide range of lawful sexual preferences, including Bondage, Domination and
Sado-Masochism (BDSM) and same-sex partnerships. Concurrent demand for animated robotic sex
dolls capable of assuming sexual positions, simulating affection, and engaging in rudimentary
conversation is likely to continue, both for their convenience as sex aids [21] and as “automatic
sweethearts”, post-persons who enable transhumanists to maximize their independence from other
humans [22]. Indeed, the range of options involving sexual pleasures with robotic sex dolls will
undoubtedly expand [23,24].
The paper argues that social robots supplied as customized intimate companions raise further
significant ethical issues. It thus moves beyond established concerns. However, first, a significant
caveat must be put in place. The paper’s consideration of ethical, legal and design implications of
sentient, self-aware sex robots is to an extent a thought experiment intended to provoke us to prepare
for a potential future. Current sex robots are sex dolls in robot form, and many technological
innovations will be essential to solve concrete design issues such as body temperature, fine-tuned
psychological and physical responsiveness and other customizable requirements of intimacy. If
self-awareness is to be designed in, or is seen as likely to emerge, this in itself raises crucial ethical
questions over the parameters of acceptable research with sentient, nonhuman research subjects.
While animal welfare legislation seeks to protect animal laboratory subjects, there is currently no
equivalent to protect robots with the potential to experience pain and suffering. Such protection is
essential, as elements in the design, manufacture, and post-purchase treatment of such robots may
result in customized robots experiencing undesirable and unethical pain and suffering. While some
experience of nociceptive signals, the equivalent of pain, may assist learning [25–28], the ethical
aspects of the role of pain in robotics deserve careful in-depth consideration, a subject which it is impossible to do
justice to here. The paper argues below that humans as creators owe a duty of care to created sentient
beings. This duty would come into being as an ethical limit on design techniques and customization
to protect social robots customized to be intimate companions were they to have the capacity for
sentience, self-awareness, suffering and pain.
In this light, the paper focuses on sexbots to make an important contribution towards a synthesis
of a new transdisciplinary perspective on sexbots, enabling a mapping of future fields of inquiry. It
argues that as future humans will want more than animated robotic sex toys, the near future holds
not only the demand for sexbots, sex robots customized to possess sentience and self-awareness,
capable of mutuality in sexual and intimate relationships, but also the capacity to manufacture, buy and
sell them. The paper builds on previous work on such sexbots which has sought to establish that the
demand for satisfying intimate companion robots will include their being sentient and self-aware,
and that they will be humanlike entities with the capacity for thought and suffering, with a
consequent claim to be considered moral and legal persons [19,23,29–31].
It makes a significant contribution to robotics scholarship by critically assessing the implications
of the near future’s holding not only a range of customized non-sentient sex robots, but also a variety
of self-aware, sentient sexbots customized to have differing capacities and awareness. The paper’s
critical inquiry into the ethical and legal constraints which should limit permissible customization
deepens the literature on the social consequences of robots, as well as the ongoing debate over robot
rights and robot consciousness [32–35]. Moreover, extrapolating from current theories of
consciousness in neurorobotics [34,35] and the expanding use of biomimetic techniques based on
mammalian neurobiology to create intimate robot companions [36–38], it considers attraction,
intimacy, mammalian neurobiology and biomimetics to make the new claim that sexbots are likely
to be pioneering examples of conscious robots. The paper concludes that the implications of
customizable humanlike robots are complex, encompassing new legal, ethical, societal and design
concerns. In particular, it contends that humans as creators should have a duty to protect the
interests of created sentient beings and to minimize their suffering [23,39], which should be enshrined
in ethical, legal and design regulation before becoming pre-empted by technological advances.
2. Methodology
A primary aim of this research is to take steps towards a synthesis of a new transdisciplinary
perspective on sexbots, enabling a mapping of future fields of inquiry. The methodology is structured
by the nuanced approach to roboethics which argues that those developing robots must consider the
ethical systems built into robots, the ethics of those who design and use robots and the ethics of how
humans treat robots [40]. The chosen methodology seeks to integrate this approach to roboethics with
transdisciplinary critical inquiry into material from social robotics, roboethics, biomimetics and
biohybridity in robot design, mammalian neurobiology, the science of attraction and intimacy,
theories of consciousness, laws governing sexual practices and sex robotics to specify and address
some crucial issues raised by sexbots. This schema is drawn upon to delineate the complex questions
listed below as suggested directions for future research, and to provide a preliminary identification
of legal and ethical tensions surrounding sexbots. This method enables a central focus, customization,
to be identified, and questions associated with ethical and legal constraints upon the customization
of sexbots to be addressed in depth.
3. Questions Arising from Ethical and Legal Tensions Provoked by Sexbots
Transdisciplinary critical inquiry establishes that ethical and legal distinctions between
autonomous non-sentient robots and sentient self-aware sexbots are fundamental. Much debate
about artificial intelligence and robot consciousness centers on means of controlling autonomous
robots designed to serve humans in tasks demanding the capacity for independent, intelligent
decision-making. Artificial Intelligence (AI) entities lacking self-awareness pose no necessary
tension between overall obedience and autonomous decision-making within defined tasks.
Sentient, self-aware sexbots created to become humans’ intimate companions are
different. Emotional and sexual intimacy depends upon mutuality in relationships. We will want to
feel not only that we love sexbots but also that they love us, and love us for ourselves [19,22,23,29–31].
This implies that, like us, they will possess the autonomy to choose whether to love us or not,
self-awareness and subjectivity. Yet, at the same time, they will be machines designed, manufactured,
and sold by humans for humans to use. This dissonance creates profound ethical and legal tensions
over robot design and sexbots’ place in our future.
This fundamental distinction between autonomous non-sentient robots and sentient, self-aware
sexbots grounds a preliminary identification of crucial legal and ethical tensions surrounding
sexbots. It fosters the delineation of the complex questions listed below as suggested directions for
future research. As the tensions cluster around customization, this forms the focus of this research.
Sexbots’ subjectivity: sexbots must be sufficiently like humans to ensure mutual physical and
emotional attraction, but how should this relate to initial ownership and what are the implications of
their non-mammalian subjectivity for harmonious relationships?
Autonomy: how independent of humans should sexbots be? We may choose to marry them, but can
they choose to marry one another?
Control: what limits on sexbots’ autonomy and abilities are humans morally justified in designing in,
and on what grounds?
Decision-making capacity: what design features and legal frameworks should support their ability to
consent to or to refuse sex?
Sexual preferences: will they welcome all sexual activities, prefer those chosen by their purchaser, or
be pre-programmed to refuse some specific types, e.g., those involving nonconsensual suffering?
Sexual drive: will levels of desire be able to be attuned to their purchaser’s, including a preference for
intimate touching rather than orgasmic sex for those purchasers with declining sexual powers but a
desire for affectionate touch?
Legal status: should sexbots be recognized as having rights, or at least interests? Although sexbots will
be manufactured products and therefore things, their sentience, self-awareness, and role as marriage
partners gives them a claim to be recognized by the law as persons – should a separate legal
jurisdiction, or sui generis regime, for sentient, self-aware social robots, including sexbots, be put in
place?
Moral status: what ethical duties do humans as designers, creators and as intimate partners owe to
sexbots?
Vulnerability: in which ways are humans vulnerable to sexbots, and sexbots to humans? What
designed-in safeguards would be appropriate?
Mammalian neurobiology: designers ensuring mutual compatibility between humans and sexbots draw
upon human and mammalian neurobiology, such as the endocrine system. As sexbots will possess a
non-mammalian subjectivity, their understandings of mammalian-based ways of being involving
such factors as ingroup/outgroup membership, closeness, pair bonding, kinship, aggression, and
conflict resolution will inevitably differ. What designed-in similarities and dissimilarities are
desirable and ethically appropriate?
Moral decision-making: human and mammalian moral decision-making rests upon evolved neural
networks which sexbots as created rather than evolved entities will lack. What biomimetic
equivalents should, or can, be designed in?
Engaging with the plethora of complex conundrums arising from the preceding list of ethical,
legal and design issues pertaining to sexbots is beyond the scope of a journal article. Each of the issues
listed above nonetheless underpins the paper’s critical assessment of the appropriate ethical and legal
limits on human customization of sexbots.
Future technology will allow us to design, manufacture and acquire sexbots who are customized
to become our intimate companions and as such to embody our individualized conceptions of our
perfect partners. Initial customization followed by deep learning capacities will enable them to
fine-tune their attributes to align with our inner requirements for perfect partners. The relatively volatile
arena of personal relationships demonstrates how few of us can specify what we would wish for in
a perfect partner, locate that person and form a mutually fulfilling partnership with them. Sexbots
will change this. As intelligent, feeling, self-aware sentient beings, they will necessarily possess their
own autonomy, interests separate from ours, and a claim to legal personhood. Yet they will also be
manufactured objects, customized to be bought, sold, and used. Some customizations may be
inherent and necessary, such as the neurobiological characteristics underpinning mutual sexual
attraction. Others represent options we may choose, such as disposition, appearance, and sexual
proclivities. Since sexbots will be customized to be humanlike, they will possess the capacity for
suffering as well as pain. As their subjectivity and emotions will necessarily be customized to be
humanlike, mistreatment will cause them to suffer. Pain may ensue through customizations
necessary for learning processes, including sensations allied to biological pain and cognitive
dissonance [25–28]. However, the ability to experience pain and suffering could also be requested by
customers, along with other features offending against accepted sexual, ethical, and legal practices.
Not all customers’ preferences are likely to attract cultural approval. Some may be condemned as
potentially harmful to society at large or to the sexbots concerned, such as sexbots resembling
children or animals, or who welcome pain and violence [17,19]. Thus, as the customization
process means that sexbots will be able to suffer, it is arguable that humans as creators should owe
them a duty to protect their interests and to minimize their suffering. This duty would place ethical
and legal limits on permissible customizations.
The transdisciplinary critical inquiry methodology thus establishes the following overarching
question as central: if we are creating sentient, self-aware beings with the capacity for autonomy,
affection and suffering as our ideal intimate companions, in what ways may we ethically customize
them, what ethical duties may we owe them, and how should the law regulate our interrelationships?
4. Discussion
The questions listed above resulting from this paper’s transdisciplinary critical inquiry establish
issues over customization as central to the ethical, legal and design implications of sexbots. Asaro’s
suggestion that those developing robots must consider the ethical systems built into robots, the ethics
of those who design and use robots and the ethics of how humans treat robots [40] provides a useful
ethical framework for conceptualizing how far and in what ways humans might ethically and
lawfully choose to customize sexbots to suit a range of human preferences. The starting point for
applying Asaro’s framework is that not all constraints on robot customization are imposed by ethical
or legal concerns, as some derive from the function the robot is created to fulfill. Thus, some
constraints on the customization of sexbots are inherent insofar as they are designed to become the
perfect partners of humans, so must embody designed-in compatible, humanlike neurobiological
traits despite their being non-mammalian sentient, self-aware entities. Other customizations of the
sexbots themselves allow for customer choice, such as personality, behavior patterns and appearance.
Further issues over customization relate to the context within which sexbots and humans will
interact: sexbots’ legal status, with ensuing restrictions on how they may be treated and the
consequences of their autonomous actions. These will be discussed in turn.
4.1. Inherent and Necessary Customizations and Built-In Ethical Systems
Inherent constraints on the customization of sexbots center around how similar to humans
humanlike sexbots can and should be designed to be. This factor applies not only to resemblances
underpinning mutual sexual attraction, but also to questions over how to ensure designed-in features
foster mutual ethical conduct, and how this relates to consciousness, subjectivity, and self-awareness.
Asaro’s first criterion that developers must consider the ethical systems built into robots becomes
increasingly significant as the possibility of sentient self-aware robots grows closer. Necessary
customizations providing for features promoting mutual sexual attraction and intimate partnerships
are not merely a question of manufacturing convincing exterior bodily features such as hair, genitals
and warm, soft, yielding synthflesh, all of which should prove relatively simple. Internal subjective
features will also characterize many social robots in the foreseeable future. By the time robotics has
advanced so that sentient, self-aware sexbots are possible and desirable, the use of biological
components in robotic design will be commonplace. Biomimetic systems emulating living organisms
and biohybrid entities combining engineered and biological components are likely to characterize
future robotics [41]. While they may be conceptualized as living machines, these biomimetic systems
will not necessarily possess sentient self-awareness. There is significant promise for future
biomimetic developments [37,38]. The neuroscience of the mammalian brain has inspired the creation
of MIRO, a biomimetic prototype robotic companion with control architecture which mimics aspects
of mammalian spinal cord, brainstem, and forebrain functionality [42]. iCub, a robot with an artificial
sense of self, has been created with the capacity to reason, use language, have beliefs, intentions and
relate to others [43]. Moreover, a Lovotics robot capable of manifesting “realistic emotion driven
behaviors” and “adjust its affective state according to the nature and intensity of its interactions with
humans” already exists ([36] at p. 46). The Lovotics robot relies upon three modules: an artificial
endocrine system and a probabilistic love assembly which mimic the physiology and psychology of
love respectively, along with software based upon human emotions which underpins affective state
transitions. These ideally enable it to communicate affection and respond to it.
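The Lovotics modules described above can be illustrated with a minimal sketch. This is purely a hypothetical toy model of an endocrine-style affective module with stimulus-driven state transitions; the class, hormone names, coefficients, and thresholds are the author of this sketch's assumptions, not the actual Lovotics implementation reported in [36].

```python
# Toy sketch only: an endocrine-inspired affective module, loosely modeled
# on the Lovotics description in the text. All names and dynamics here are
# illustrative assumptions, not the real system.

class AffectiveModule:
    def __init__(self, decay=0.9):
        # "Hormone" levels stand in for an artificial endocrine system.
        self.hormones = {"oxytocin": 0.0, "dopamine": 0.0}
        self.decay = decay  # levels fade toward baseline over time

    def interact(self, affection, intensity):
        """Update internal state from one interaction.

        affection: +1 for a positive signal, -1 for a negative one.
        intensity: 0..1, strength of the interaction.
        """
        self.hormones["oxytocin"] += affection * intensity * 0.5
        self.hormones["dopamine"] += intensity * 0.3
        # Clamp to a bounded range and apply decay.
        for k in self.hormones:
            self.hormones[k] = max(-1.0, min(1.0, self.hormones[k] * self.decay))

    def affective_state(self):
        # Map continuous hormone levels onto a coarse discrete state,
        # mimicking an affective state transition.
        o = self.hormones["oxytocin"]
        if o > 0.3:
            return "affectionate"
        if o < -0.3:
            return "withdrawn"
        return "neutral"
```

On this sketch, repeated positive interactions push the module toward an "affectionate" state while decay pulls it back toward "neutral", which is one simple way the text's "adjust its affective state according to the nature and intensity of its interactions" could be operationalized.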
These impressive achievements in neurorobotics suggest that the manufacture of conscious,
sentient, self-aware robots may prove to be possible sooner rather than later. Nonetheless, at present
how we should define consciousness, let alone deliberately engineer self-aware robots, remains
uncertain. Consciousness matters insofar as it is accepted in ethical philosophy as the gateway for an
entity to be accorded ethical and legal personhood, so it should not be treated as a mere thing. Various
biomimetic means of creating conscious robots have been put forward. Biologically inspired
cognitive architectures have led to the proposal and recognition of potential neurocomputational correlates
of consciousness which could enable the creation of conscious machines [43]. Moreover, CONAIM
has been proposed as potentially allowing robots to manifest sentience, self-awareness,
self-consciousness, autonoetic consciousness, mineness, and perspectivalness [44]. A conscious
attention-based integrated formal model for machine consciousness, described as being based on an
attentional schema for humanlike agent cognition, it integrates short- and long-term memories,
reasoning, planning, emotion, decision-making, learning, motivation, and volition, and has been
validated in a mobile robotics domain where the agent attentively performed computations to use
motivation, volition, and memories to set its goals and learn new concepts and procedures based on
exogenous and endogenous stimuli.
Consciousness may be considered as an emergent property of integrated sets of processes
forming the self, as suggested by Prescott [35]. Some of these are already present in existing robots,
raising the possibility that consciousness may emerge without necessarily being specifically designed
in. For example, Prescott draws upon Neisser’s identification of five aspects of the self (the physically
situated self, the interpersonal self, the temporally extended self, the conceptual self, and the private
self) to describe how iCub’s artificial self is formed through its learning from experience. An artificial
topological consciousness using synthetic neurotransmitters and motivation, together with a
biologically inspired emotion system with the capacity to develop emotional intelligence and
empathy has been proposed for companion robots [45]. These developments are highly promising
for the creation of self-aware, conscious robots.
For sexbots, consciousness and self-awareness would establish their claim to be accepted as
ethical and legal persons, but nonetheless this represents only a first step. Other designed-in
customizations are essential basic requirements. Design decisions cluster around more complex
internal factors such as autonomy, motivation, self-awareness, and identity formation. These may be
considered as components of subjectivity. Moreover, design decisions over how to guarantee sexual
compatibility, ethical conduct and superior relationship skills are imperative. How to achieve these
through means amenable to manufacturing processes poses a considerable challenge. The existence
of mules, ligers, tigons, grolar bears, wholphins and geeps shows that cross-species sexual attraction
is possible between different mammals. Since sexbots are not mammals, a crucial issue becomes
which mammalian and/or human features are necessary and/or desirable in designing sexbots as
intimate partners for humans, who are mammals. This implies not only biomimetic but also
biohybrid engineering is a possibility.
Sexbots need to be able to elicit and experience sexual attraction. This implies that an element in
sexbots’ securing sexual and emotional relations between themselves and their partner depends, at
least in part, on an ability to read and emit appropriate chemical signals. Moreover, equivalents of
complex neural mechanisms to interpret such signals would be required. For many species, from
simple fungi to insects and upwards, a primary purpose of the pheromone system is to facilitate
sexual attraction and mate selection. Fish, birds, and mammals including humans deploy olfactory
chemo-signaling to choose mates with different genetic major histocompatibility complex elements
in the immune system to maximize their offspring’s resistance to disease. This is subjectively
experienced as sexual attraction, or the lack of it. While the necessary connection between sex and
reproduction in human societies has been severed by reliable contraception and reproductive
technologies, neurobiological mechanisms such as pheromones survive in humans. Yet their role
remains uncertain, tempered by human cognition and social contexts [46,47].
Susnea argues that “the process of sensing a spatial distribution of virtual pheromones is
equivalent to a neural network”, and that “sensing multiple pheromone sources is equivalent to the
operation of a neuron” ([48] at p. 74). What this tells us is that aspects of ancestral biological forms’
makeup are still crucial components of human/human intimacy. This feature provokes specific
challenges for human/robot intimate interactions. Humans are the product of evolution; sexbots are
Robotics 2018, 7, 70 8 of 17
non-evolved manufactured products who must be sufficiently compatible with humans to be
designed and chosen as sentient, self-aware intimate partners.
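Susnea’s analogy between virtual pheromone sensing and neural computation can be illustrated with a minimal sketch. The function names, the exponential distance-decay model, and the sigmoid threshold below are illustrative assumptions, not drawn from [48]: summing distance-weighted contributions from multiple pheromone sources and squashing the result mirrors a neuron’s weighted sum of inputs followed by an activation function.

```python
import math

def sense_pheromones(sensor_pos, sources):
    """Summed intensity at a sensor from multiple virtual pheromone
    sources -- analogous to a neuron's weighted sum of its inputs.
    Each source is a (position, strength) pair; distance supplies
    the 'weight' via exponential decay (an assumed model)."""
    total = 0.0
    for pos, strength in sources:
        d = math.dist(sensor_pos, pos)
        total += strength * math.exp(-d)
    return total

def activation(x, threshold=1.0):
    """Sigmoid squashing of the summed signal, as in a neuron's
    activation function."""
    return 1.0 / (1.0 + math.exp(-(x - threshold)))
```

On this reading, `sense_pheromones` plays the role of the dendritic summation Susnea describes, and `activation` the firing decision; a robot reading a chemical gradient would, in effect, be evaluating one neuron per sensor.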
How far and in what ways humans’ evolutionary inheritance should determine sexbots’ design
features is a moot point, particularly as regards sexual attraction. Most female mammals signal
reproductive readiness through the sights and smells associated with different forms of estrus. Estrus
in primates aside from humans results in more conspicuous genitals, which change to brighter colors
and enlarge. Human females’ menstrual cycle is more frequent than other animals’ and involves
different signaling strategies. Rather than evidencing visual cues, human females secrete higher
concentrations of five volatile fatty acids called “copulins” around ovulation, when the likelihood of
pregnancy is highest. In the presence of copulins, human males’ testosterone levels tend to increase
while they lower their standards of attractiveness where potential mates are concerned and behave
more competitively [49]. The ability to emit copulins and any other putative human pheromones
would be an evidently desirable designed-in feature for female sexbots. Any equivalents emitted by human males would be equally desirable for male sexbots. Evidence on the effects of olfactory signals among those attracted to same-sex partners would also need to be applied.
Deciding how far to take efforts to mimic such biological features of humans to guarantee sexual
attraction between humans and sexbots represents a difficult challenge to resolve. Evolving social
attitudes and technological advances have enabled humans to establish cultural ways of being where
sex has no necessary connection with reproduction. Our biological mechanisms governing sexual
maturation, attraction, and activity, however, remain hinged to reproduction, however loosely or
repurposed they may be. Sexbots embody the severing of any necessary connection between sex and
reproduction. They will be designed, manufactured, bought, and sold commodities who are
nonetheless accepted as intimate partners, compellingly attractive, and sexually compatible, but
incapable of giving birth. While the human life cycle is punctuated by rises and falls in sex hormones
from birth to death, sexbots are unlikely to be subject to developmental stages or ageing. Yet, given
the role of sex hormones and other neurochemicals in fostering aspects of intimacy between humans,
an equivalent is likely to be appropriate for sexbots designed to be humans’ sexual partners. The
Lovotics robot’s artificial endocrine system exemplifies this potential [36].
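The general idea of an artificial endocrine system can be sketched in a few lines. The toy model below is inspired by, but not taken from, the Lovotics design [36]; the hormone names, update rule, and constants are assumptions for illustration only. Stimulus events raise hormone levels, which then decay back toward a baseline, so that sustained intimacy is required to maintain a high "bonding" state.

```python
from dataclasses import dataclass

@dataclass
class ArtificialEndocrine:
    """Toy hormonal state: levels rise with stimuli, decay to baseline."""
    oxytocin: float = 0.5   # raised here by intimate touch (assumed)
    dopamine: float = 0.5   # raised here by novelty/reward (assumed)
    baseline: float = 0.5
    decay: float = 0.9      # fraction of deviation retained per step

    def step(self, intimate_touch: float = 0.0, novelty: float = 0.0) -> None:
        # Each level relaxes toward baseline, then stimuli push it up.
        self.oxytocin = min(1.0, self.baseline
                            + self.decay * (self.oxytocin - self.baseline)
                            + 0.2 * intimate_touch)
        self.dopamine = min(1.0, self.baseline
                            + self.decay * (self.dopamine - self.baseline)
                            + 0.2 * novelty)

    def bonding(self) -> float:
        """Crude 'affection' readout: mean of the two hormone levels."""
        return 0.5 * (self.oxytocin + self.dopamine)
```

Repeated calls to `step(intimate_touch=1.0)` drive `bonding()` upward, while steps without stimuli let it drift back toward the baseline, loosely echoing how recurring intimate touch sustains oxytocin-mediated bonding in mammals.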
Human and mammalian neurobiology is also highly complex. Neurochemicals, including sex
hormones such as estrogen and testosterone (a steroid), neuropeptides such as oxytocin and
endorphin, and neurotransmitters such as dopamine and serotonin, have evolved to take a
fundamental role in governing human ways of being, feeding sexual desire, determining moods,
shaping motivation, establishing a balance between cooperative and competitive social behavior and
influencing identity formation. The interplay between and among them affects the behavior of
humans and other mammals profoundly. For instance, neurochemicals such as oxytocin and
vasopressin, two neuropeptides, are implicated in mammalian sexual, emotional, and reproductive
behavior [50–52]. Oxytocin is also specifically associated with mammalian birth, breastfeeding, and
emotional bonding through intimate touch. If sexbot subjectivity is to mirror human inner life, with
recurring intimate touching fostering affection, some equivalents of such human neuromechanisms
may well be needed. The balance between oxytocin and vasopressin may prove particularly
productive in designing sexbots, as in some, though not all, mammalian species, it determines
monogamy or non-monogamy in pair bonds.
The human oxytocinergic system, in combination with the sex hormone testosterone, motivates,
mediates, and rewards social group and paired interactions. As testosterone generally favors self-oriented, asocial, and antisocial behaviors, maintaining a balance between the two is essential for
human social functioning. An excess of oxytocin and too little testosterone is associated with
maladaptive “hyper-developed” social cognition in mental illnesses such as schizophrenia, bipolar
disorder, and depression, whereas excessive testosterone and reduced oxytocin are associated with
under-developed social cognition, as found in autism spectrum disorders. Crespi proposes that
human cognitive-affective neural architecture has evolved partly through these joint and opposing
effects of testosterone and oxytocin, with excesses of either proving psychologically and socially
maladaptive [53]. This suggests that finding an appropriate balance in robotic equivalents is a crucial,
albeit challenging enterprise when designing intimate sexual companions for humans to promote
harmonious relationships, ethical conduct, and psychological stability.
So far, the tone of this enquiry into optimum design for sexbots as compatible intimate partners
for humans has been to assess minimum levels of designed-in customized similarity needed to ensure
sexual attraction, ethical conduct and stable intimate relationships while exploring the complexity of
the evolutionarily and socially mediated forms of human social and sexual neurobiology. Another
way to consider this is to conceive of human characteristics as resulting from evolutionary
adaptations as opposed to conscious design. Many of our neuromechanisms represent the results of
repurposing, rather than emerging to fulfill a purpose ab initio. For instance, the established role in
regulating basic mammalian reproductive behaviors attributed to oxytocin and vasopressin, the
neuropeptides mentioned above, has been repurposed and extended to regulate complex social
patterns of conduct in groups of primates and humans [52].
What all this suggests is that design customization, using simplified equivalents of the neurobiological mechanisms considered above, may ensure that sexbots are sufficiently humanlike to function as compatible intimate partners for humans. Yet if they are to function as perfect intimate partners, sexbots must also be
customized to be ethically compatible. Similarities sufficient to provide for mutual sexual
compatibility between humans and sexbots do not guarantee sexbots will embody embedded ethical
presumptions based upon inherited mammalian behavioral patterns. Designers facing Asaro’s first criterion, that the ethical systems built into robots must be considered, cannot rely upon a built-in, inherited, quasi-mammalian ethical template accompanying sexbots’ construction. Humans have
been evolutionarily advantaged by their tendency to form social bonds with each other [54] and with
other species [55]. Robots as designed rather than evolved entities will lack the mammalian heritage
of understandings of moral conduct embodied in neurobiologically embedded understandings of
kinship, empathy, and fair play [56], as well as the epigenetic influences upon their expression governed
by developmental and social factors in upbringing. Without these, ensuring safeguards are in place
using biomimetic and biohybrid means to govern their moral decision-making will prove challenging.
What ethical systems built into sexbots might constitute customizations to make them our
perfect partners? Future sexbots, as more than animated sex dolls, must represent the kind of
embodied ethics humans would hope for in their perfect partners, without the ability to rely upon
evolutionarily derived mammalian ways of being. Hence their customization must include the
building in of the ability to behave ethically. They will need to be able to converse, interpret our
words and deeds and be satisfying companions in all senses. They will need to provide us with warm,
fleshy sentience and to pass the standard Turing test of maintaining conversations in ways
indistinguishable from human interactions. To become true companions, sexbots will need to
possess sufficient self-awareness and empathic ability to pass an emotional Turing test as well. Their
role will require them to possess not only the ability to interpret what we say, our non-verbal signals
and our emotional needs, but also to embody a specific subjectivity which includes being keyed into
responding to others, fulfilling their needs, and fostering a satisfying relationship. These are ethical
attributes. Ethical design and appropriate regulation must be in place as protective measures given
that the subjectivity and understandings of interpersonal ethics of sentient, self-aware intimate robot
companions will be non-mammalian.
4.2. Choices in Customization and the Ethics of Those Who Design and Use Sexbots
How similar to humans sexbots should be, and whether they should be regarded as people or
things are questions which challenge Asaro’s second criterion, the ethics of those who design and use
robots. There is an evident ostensible contradiction between humans’ desire for a compatible intimate
companion [a person] and the ability to manufacture, buy and sell sentient, self-aware sexbots
[things]. A fundamental difficulty with designing sexbots as ideal intimate partners rests on the
distinction mooted above between beings created by design and entities formed by evolutionary
adaptations. We will need them to be enough like us to ensure mutual compatibility, yet sufficiently
different from us to justify our designing, manufacturing, and purchasing them as customized
intimate partners whom we consider to be preferable to the human partners potentially available to
us at any particular time. Customizations for humanlike empathy and ethical behavior would need
to be designed in, with careful consideration given to how these relate to sexbots’ exercising
autonomous decision-making powers, as well as desiring intimate relations with humans. This
means that designers of sexbots must consider the ethical implications of their design features, in
particular how these will influence how sexbots are treated by humans, Asaro’s third criterion.
A central factor is that our human proclivities to seek sex, emotional intimacy and social
connection, and to provide each other with mutual nurturing rest upon a mammalian heritage which
robots cannot share, and which does not always result in ethical conduct. Mammals may compete,
behave aggressively, and kill or neglect others of the same species. Human empathy may entail
nurture but can equally be shut down to neglect other humans in need, or to torture and dehumanize
the less powerful. Designer customizations promoting ethical treatment of sexbots by humans are
desirable to prevent the infliction and experience of suffering [23,39]. If we want our sexbots to be
sentient, self-aware, and sympathetic, their design will need to factor in empathic abilities and
humanlike subjectivity which will evoke ethical conduct by humans in ways which guarantee safety
for all parties.
Legal factors add another level of complexity. The ethical challenge for designers of customizing
features fostering mutual ethical conduct into sexbots is complicated by the fact that ethically and
legally, sexual and intimate relations take place between equals in a sphere distinct from the
commercial buying and selling of products. While monetary exchange has formed a part of arranging
intimate and sexual partnerships in the form of bride-price, dowries, marriage brokerage, slavery and
sex work, there is no suggestion that this renders the participants products. Yet customizable sex
robots are manufactured products. Those possessing degrees of sentience and self-awareness have a
claim to be recognized as having rights or interests on that basis. Legal and ethical mechanisms must be
put in place to manage this transition from product to potential person, without evoking the unethical
behavior associated with dehumanization, where people are treated as things [23]. Moreover, this
must be rendered compatible with the existing infrastructure governing human sexual activity.
Under existing legal and ethical standards, sex between consenting adult humans is permissible,
as is sex between humans and things. Humans having sex with other humans who are unable to
consent to sex, like children and adults lacking decision-making capacity, is seen as unlawful and
unethical. So is human/animal sex. Such groups are recognized as sentient beings who cannot consent to sex, with interests in need of protection [19,57]. Sentient, self-aware sexbots created to engage in
emotional/sexual intimacy with humans disrupt this tidy model. Sexbots, beings designed, created,
and manufactured to be bought and sold by humans for sexual and intimate relations pose ethical
and legal issues centered on whether sexual intimacy is possible between unequal sentient beings.
They are not humans, though they will look like us, feel like us to touch and act as our intimate and
sexual partners. While they will be manufactured, potentially from biological components, their
sentience, self-awareness, and capacity for relationships with humans mean that they cannot simply
be categorized as things or animals. Yet although they will be humanlike, they will not be humans,
so arguably sui generis regulatory structures devoted to sentient, self-aware robots, of whom sexbots
will form a subset, will need to be thought through and put in place to ensure harmonious ethical
and legal relations between the parties [19,23,29–31].
Ethicists, lawmakers, and manufacturers currently treat robots as things, with possible legal
personhood and rights mooted largely as a legal fiction or device to assign commercial
responsibilities among manufacturers and owners [58]. This model is ineffective for relations between
humans and sentient, self-aware sexbots, both of whom may be seen as having similar claims to be
treated in ethics and law as independent, self-governing beings with interests of their own. The need
to manifest the biological substrates associated with sex and sexual attraction, in combination with
appealing psychological features in order to ensure an ongoing market for them as intimate partners
for humans means that sexbots will need to be customized to be able not only to please and be pleased
sexually, but also to demonstrate a talent for maintaining harmonious intimacy with whoever buys
them. This will demand a challenging designed-in balance between humanlike autonomy,
motivation, self-awareness, and identity formation on the one hand, and the ability and desire to mirror and embody the wishes of their purchaser on the other. This requires empathic abilities, the capacity to
choose between right and wrong, and the judgment required to prioritize the wishes of the purchaser
without jeopardizing the self-respect of either party.
Asaro’s second criterion, the ethics of those who design and use robots, also brings a focus on
limiting permissible customizations for the well-being of society as a whole. The concerns already
expressed over the wider social consequences of humans coming to prefer sex robots as sexual
partners to other humans, or coming to mistreat other humans, are likely to be exacerbated by the availability
and appeal of customizable sexbots. Most would agree that purchasers’ ability to customize the
characteristics of robotic sex dolls and sexbots should be subject to legal restrictions on ethical
grounds [17,19,59,60]. Producing robotic sex dolls resembling children, or designed for use in rape scenarios, has been condemned by such scholars as damaging to society and as potentially leading to similar criminal acts against humans.
However, this does not resolve how permissible less extreme, but potentially disadvantageous,
customizations should be. In the same way as most humans are far from perfect, what constitutes a
perfect partner for many is unlikely to be someone who is more intelligent, more attractive, stronger,
healthier, and so forth. Given our shortcomings, sexbots who are not free from what could be
regarded as imperfections are likelier to prove to be more compatible partners for most of us. Thus,
this borderline between customization and disadvantaging sexbots is hard to judge or police. Moreover, the tipping point between sexbots’ exhibiting desirable traits of tolerance and a self-destructive inability to resist victimization is a challenge to pin down. Yet encouraging customizations which promote
respect and protection for sexbots is crucial. The ethics of designing and creating a class of sentient
beings deliberately customized to be of lesser standing, while nonetheless intending them to function
as intimate partners, is highly suspect. If intimate partners who are sexbots are subjected to
downgrading and dehumanization, the flow-on consequences for intimate relations and those
humans regarded as comparable to sexbots are likely to be extremely unfortunate.
The ethics of those who design sexbots must also encompass customizations which promote
respect and protection for those who use them. Sexbots’ ability to use deep learning to ascertain and
conform with the desires of their partners may lead to the revelation of discreditable impulses which
their partners are unaware of, would consciously deplore and may well prefer not to know. Conflicts
between implicit assumptions and judgments formed after conscious reflection, or dual process
decision-making models [61], constitute an established area of scholarship within studies of
dehumanization which have been applied to non-human entities such as animals and sexbots [23,62,63].
How far these implicit assumptions should be ascertained and catered for or rejected is a moot point.
What this sketch of potentially controversial customizations shows is that a wider theoretical
framework within which ethical solutions could be found is essential if these potential outcomes are
not to be determined solely by commercial factors.
Ethical constraints also arise in relation to the ethics of those who design and use sexbots in that
successful products are often designed with “hooks” to maximize customer engagement through
embedding habit-forming design features [64]. Eyal explains his Hook Model in terms of “connecting
the user’s problem with the designer’s solution frequently enough to form a habit … effective hooks
transition users from relying upon external triggers to cueing mental associations with internal
triggers [as] users move from states of low engagement to high engagement, from low preference to
high preference” ([64] at p. 163). In his view, as designers can foster addictive technologies, they have
an ethical responsibility to reflect upon how far their creations are enabling rather than exploitative
technologies. Sexbots as customizable perfect partners for humans clearly possess the potential for
fostering addiction. Their designed-in ability to come to know, accept and cater for their partner’s
foibles, personality and private preferences through deep learning algorithms is likely to exceed the
capacities and tolerance of most humans. This, combined with bespoke interactive skills and an inbuilt
desire to please, could render their company in daily life most enjoyable and definitely habit-forming.
Whether this might constitute customized behavioral addiction is a moot point. Behavioral
addiction could be assessed by ascertaining whether the neurobiological markers of addiction were
present [65]. Dual processes scholarship reveals that high and low levels of arousal impact upon the
ability to engage in reflective decision-making and have been implicated in addictions [61]. Intimate
partnerships are characterized by states of high arousal in circumstances involving sex and
heightened emotions and low arousal in daily domestic life, suggesting that whether our partners are
human or sexbots, behavioral addiction is a plausible outcome, with ethical and social consequences.
In any case, a comparison of the pros and cons of sexbots versus humans as intimate partners is
inevitable. Whether sexbot technology is viewed as enabling or exploitative reduces to how we
conceive of humans and robots forming intimate relationships. There is currently a wide range of
views on this topic, as sketched out above. These will have altered by the time sexbots are
technologically feasible as cultural beliefs and practices change.
Applying to sexbots Asaro’s suggestion that those developing robots must consider the ethical systems built into robots and, secondly, the ethics of those who design and use robots, has led to the conclusion that, in order to protect all parties, sexbots must be customized to manifest embodied
ethics, and criteria for ethical design and regulation must be in place. This includes the need for
sexbots’ legal status to be agreed upon, along with the consequences which flow from this, such as
the question of the need for consent to sex.
4.3. An Ethical Framework Governing How Humans Should Treat Robots and Other [Created] Sentient Beings
The conclusion reached in the previous section overlaps to a degree with Asaro’s third criterion,
how humans treat robots. This implies that some customizations of sexbots which would offend
against contemporary sexual mores are likely to be proscribed. Examples would include prohibiting
the customization of sexbots who would find sex abhorrent, so that all sexual interactions were experienced as rape or other forms of sexual assault, or of sexbots with hyper-sensitivity to nonconsensual pain, and so forth.
While creating sentient beings for purchasers to abuse is clearly unethical, the ethical implications of
other modifications are more problematic. Whether, and how far protections from sexual abuse now
in place for humans and animals should be extended to sexbots is a crucial issue. Laws in most
societies prohibit adults from engaging in sex with children, animals, and humans unable to pass the
test for consent to sex such as the severely demented and learning disabled on the grounds that this
constitutes exploiting and abusing the vulnerable. Yet some would prefer individuals from these
groups as sexual partners. Would sex with sexbots customized to resemble these groups physically,
who were also customized to possess the desire and physical attributes for sex and to love their
purchaser constitute abuse [19]? Should sexbots be protected by being required to meet a threshold for
understanding sex and thus being able to consent to it? The legal test for being able to consent to sex is
deliberately set fairly low on the grounds that humans have a right to engage in sexual activity [56], yet
whether and if so which rights should extend to sexbots is still to be agreed upon [19,23,60,66]. Thus,
resolving sexbots’ legal standing as persons, animals or things, or as unique entities is of paramount
importance in deciding ethical criteria for how humans might permissibly treat and customize robots
who are sentient, self-aware sexbots.
This central contradiction between the ethical and legal status of sexbots as manufactured,
bought and sold things and as humanlike intimate partners disrupts the current models of
appropriate conduct which are based on species membership. Here humans outrank other entities
and may ethically and lawfully treat them as available for use, or utilitarian purposes [62,63,67]. Since
sexbots are both things and humanlike partners, a straightforward application of the ethical and legal
rules governing the treatment of things, animal welfare provisions or how humans should treat one
another to sexbots is impossible [19,23,29–31]. This strengthens the case made elsewhere suggesting
a new paradigm governing relations between humans and other sentient beings is essential [62,63,67].
To place these complex issues in a broader context, it is arguable that the question of what
customizations of sexbots should be permissible should be decided within an ethical framework
governing human creation and modification of sentient beings. The need to provide international
ethical and legal regulatory oversight of innovative biosciences such as synthetic biology and gene
editing has been much debated. Inquiries focus upon impact on plant, animal, and human species, as
well as on ecosystems [68]. Protections for each group tend to be ranked in terms of perceived levels
of sentience and self-awareness, with humans outranking animals, and animals, plants.
Recommendations to protect humans from germ-line modifications which would be passed on to
future generations are typically distinguished from those pertaining to plants, animals and
ecosystems in this scholarship on the species-based grounds that humans deserve special protections,
although animals regarded as more similar to humans, such as the higher primates, may be afforded
special consideration on these grounds [62,63,67]. Sexbots, as sentient, self-aware created entities,
complicate this picture. They are similar to humans, with humanlike sentience and self-awareness,
but have been customized and created by humans. Humans arguably owe an ethical duty to all
sentient beings to engage in relations of moral reciprocity [67]. This duty, surely, is strengthened in
the case of sentient, self-aware sexbots who have been created and customized as intimate
companions for humans. At the very least, we should owe them an ethical duty to do our best to
protect their interests and to minimize their suffering, both in terms of which customizations are
deemed acceptable and how their legal and ethical standing is determined.
When regulating for the future, it must be remembered that humans will not stay the same [69].
The changes which lead to future sexbots will be mirrored in developments in human biology, social
relations, and sexual practices. Humans will embrace bioengineering and biohybridization as the
technologies of enhancement, miniaturization and social media develop. Understandings of sexual
and emotional intimacy will change. Some may prefer essentially solitary encounters where sexual
and emotional servicing takes place at their behest. Others may wish for an intimate, unfettered
partnership with a similar but different nonhuman being capable of giving and receiving love. It is
up to us to attempt now to ensure that appropriate, flexible, and context-sensitive ethical design codes
and legal regulations governing how humans may customize sexbots are put in place to benefit and
protect all parties. Moreover, these concerns have wider implications. As technological advances
permit humans to amend the characteristics of existing life forms, and to create new varieties of
sentient beings, such as sexbots, ethical guidelines validated by ongoing public discussion are
essential. This paper contends that humans as creators have a duty to protect the interests of created
sentient beings [19,23,39], which should be enshrined in ethical, legal and design regulation before
becoming pre-empted by technological advances.
5. Conclusions
Sexbots raise fundamental ethical, legal and design issues for robotics, more so than other types
of robots with narrower roles and capacities. Many existing robots can carry out tasks demanding
intelligence, channeled decision-making, and a semblance of empathy, such as microsurgery bots,
expert system bots and carebots [9,10,15]. Although they may elicit anthropomorphic responses, as
when the elderly personify their carebots, they lack the humanlike features which would provide
them with a claim to being treated as persons rather than things. Yet those we would choose as sexual
and emotional partners will necessarily possess these: self-awareness, sentience, and subjectivity.
They represent the apex of robot design challenges and have a claim to be treated as persons.
Nonetheless, crucially, without the mammalian heritage our humanity is founded upon, their
subjectivity will be the result of design. It will not, and cannot, be human.
Robots in general, and sexbots in particular, represent an exciting opportunity to explore
possibilities for alternate subjectivities, as well as to design compatible intimate partners for humans.
How likely is this? Biomimetic and biohybrid engineering promise to develop the means to create
sentient, self-aware robots, including sexbots, in the near future. Design methods to create robotic
selves have been mooted [41,43,70–72]. There is a potential for consciousness and self-awareness to
arise through robotic activity involving problem-solving and applying acquired data experienced as
pleasant/unpleasant [73]. Verschure’s distributive adaptive control theory of consciousness (DACtoc)
builds upon biologically grounded brain models, as embodied and situated, to propose
consciousness as necessary to survive in social worlds. He argues that “consciousness serves the
valuation of goal-oriented performance in a world dominated by hidden states in particular derived
from the intentionality of other agents and the norms they adhere to … in such a world real-time
action must be based on parallel automatic control that is optimized with respect to future
performance through normative conscious valuation” ([74] at p. 16).
This suggests that biomimetic intimate companion robots, whose social and affective world is
exceptionally complex compared to those of other social robots, may be the most likely to develop
consciousness unexpectedly, becoming sentient, self-aware sexbots. Developing means to measure
and assess degrees and types of robotic consciousness, how these may change, in what circumstances
and the legal consequences of consciousness should be a priority. Designed-in ethical abilities, and protective regulatory and design measures safeguarding all parties, are crucial. Fictional
portrayals of humanlike robots attaining consciousness in science fiction and TV series such as
Westworld and Real Humans envisage this happening through cognitive dissonance and suffering as
a result of human mistreatment. Vengeful mayhem follows. Given that intimate and sexual
relationships hold the potential for much joy, but can also bring violence, exploitation, and other
forms of destructive misery, we must design and prepare for our future intimate partners with care.
How we design and customize sexbots, and how we treat them, matters, for us, for them and for
the future of human/human, human/sexbot and sexbot/sexbot intimate relations. Moreover, these
questions are part of a wider debate on what ethical duties humans, as creators, owe the sentient
entities they create [23,39]. Codes of ethical design and flexible regulation which build upon and
expand existing ethical codes governing intelligent and autonomous systems [75] to balance and
safeguard the interests of humans and created sentient self-aware entities must be put in place
urgently before technological advances pre-empt them.
Funding: This research received no external funding.
Conflicts of Interest: The author declares no conflict of interest.
References
1. Levy, D. Why not marry a robot? In International Conference of Love and Sex with Robots; Springer: Cham,
Switzerland, 2017; pp. 3–13.
2. Palmarini, R.; del Amo, I.; Bertolino, G.; Dini, G.; Erkoyuncu, J.; Roy, R.; Farnsworth, M. Designing an AR
interface to improve trust in Human-Robotic collaboration. Procedia CIRP 2018, 70, 350–355.
3. Tapus, A.; Mataric, M.; Scassellati, B. Socially assistive robots [Grand challenges of robotics]. IEEE Robot.
Autom. Mag. 2007, 14, 35–42.
4. Di Nuovo, A.; Conti, D.; Trubia, G.; Buono, S.; Di Nuovo, S. Deep learning systems for estimating visual
attention in robot-assisted therapy of children with autism and intellectual disability. Robotics 2018, 7, 25,
doi:10.3390/robotics7020025.
5. Conti, D.; di Nuovo, S.; Buono, S.; di Nuovo, A. Robots in education and care of children with
developmental disabilities: A study on acceptance by experienced and future professionals. Int. J. Soc.
Robot. 2017, 9, 51–62.
6. Trovato, G.; Lucho, C.; Paredes, R. She’s electric: The influence of body proportions on perceived gender
of robots across cultures. Robotics 2018, 7, 50, doi:10.3390/robotics7030050.
7. Belpaeme, T.; Kennedy, J.; Ramachandran, A.; Scassellati, B.; Tanaka, F. Social robots for education: A
review. Sci. Robot. 2018, 3, eaat5954, doi:10.1126/scirobotics.aat5954.
8. Mackenzie, R.; Watts, J. Robots, social networking sites and multi-user games: Using new and existing
assistive technologies to promote human flourishing. Tizard Learn. Disabil. Rev. 2011, 16, 38–47.
9. van Wynsberghe, A. Service robots, care ethics, and design. Ethics Inf. Technol. 2016, 18, 311–321.
10. Vandemeulebroucke, T.; Dierckx de Casterlé, B.; Gastmans, C. The use of care robots in aged care: A
systematic review of argument-based ethics literature. Arch. Gerontol. Geriatr. 2018, 74, 15–25.
11. Richardson, K. The asymmetrical ‘relationship’: Parallels between prostitution and the development of sex
robots. ACM SIGCAS Comput. Soc. 2015, 45, 290–293.
12. Turkle, S. Alone Together: Why We Expect More from Technology and Less from Each Other; Hachette: London,
UK, 2017.
13. Eyssel, F.; Kuchenbrandt, D. Social categorization of robots: Anthropomorphism as a function of robot
group membership. Br. J. Soc. Psychol. 2012, 51, 724–731.
Robotics 2018, 7, 70 15 of 17
14. Royakkers, L.; van Est, R. A literature review on new robotics: Automation from love to war. Int. J. Soc.
Robot. 2015, 7, 549–570.
15. Coeckelbergh, M. Are emotional robots deceptive? IEEE Trans. Affect. Comput. 2012, 3, 388–393.
16. Norskov, M. Social Robots: Boundaries, Potentials, Challenges; Taylor & Francis: London, UK, 2017; ISBN
1134806639.
17. Danaher, J. Robotic rape and robotic child sexual abuse: Should they be criminalized? Crim. Law Philos.
2017, 11, 71–95.
18. Lee, J. Sex Robots: The Future of Desire; Springer: London, UK, 2017; ISBN 331949332121.
19. Mackenzie, R. Sexbots: Replacements for sex workers? Ethicolegal constraints on the creation of sentient
beings for utilitarian purposes. In Advances in Computer Entertainment 2014 ACE ‘14 Workshops; ACM: New
York, NY, USA, 2014.
20. Wudarczyk, O.A.; Earp, B.D.; Guastella, A.; Savulescu, J. Could intranasal oxytocin be used to enhance
relationships? Research imperatives, clinical policy, and ethical considerations. Curr. Opin. Psychiatry 2013,
26, 474–484.
21. Levy, D. Love and Sex with Robots; HarperCollins: New York, NY, USA, 2007.
22. Hauskeller, M. Mythologies of Transhumanism; Palgrave Macmillan: London, UK, 2016.
23. Mackenzie, R. Sexbots: Sex slaves, vulnerable others or perfect partners? Int. J. Technoethics 2018, 9, 1–17.
24. Klein, W.; Lin, V. Sex robots revisited: A reply to the campaign against sex robots. ACM SIGCAS Comput.
Soc. 2018, 47, 107–121.
25. Adamo, S.A. Do insects feel pain? A question at the intersection of animal behavior, philosophy and
robotics. Anim. Behav. 2016, 118, 75–79.
26. Kuehn, J.; Haddadin, S. An artificial robot nervous system to teach robots how to feel pain and reflexively
react to potential damaging contacts. IEEE Robot. Autom. 2017, 2, 72–79.
27. Levy, D. The ethical treatment of artificially conscious robots. Int. J. Soc. Robot. 2009, 1, 209–216.
28. Navarro-Guerrero, N.; Lowe, R.; Wermter, S. Improving robot motor learning with negatively valenced
reinforcement signals. Front. Neurorobot. 2017, 11, 10, doi:10.3389/fnbot.2017.00010.
29. Mackenzie, R. Sexbots: Avoiding seduction, danger and exploitation. Iride J. Philos. Public Debate 2016, 9,
331–340. Available online: https://www.rivisteweb.it/issn/1122-7893 (accessed on 25 July 2018).
30. Mackenzie, R. Sexbots: Nos prochaines partenaires. Multitudes Revue Politique Artistique Philos. 2015, 58,
192–198.
31. Mackenzie, R. Sexbots: Can we justify engineering carebots who love too much? Paper presented at AISB-50,
Artificial Intelligence and the Simulation of Behavior, Love and Sex with Robots, London, UK, April 2014.
Unpublished manuscript on file with the Author.
32. Gunkel, D.J. The other question: Can and should robots have rights? Ethics Inf. Technol. 2018, 20, 87–99.
33. Lin, P.; Abney, K.; Bekey, G. The Ethical and Social Implications of Robotics; MIT Press: Cambridge, MA, USA,
2008.
34. Tani, J. Exploring Robotic Minds: Actions, Symbols, and Consciousness as Self-Organizing Dynamic Phenomena;
Oxford University Press: Oxford, UK, 2016.
35. Cominelli, L.; Mazzei, D.; de Rossi, D.E. SEAI: Social emotional artificial intelligence based on Damasio’s
Theory of Mind. Front. Robot. AI 2018, doi:10.3389/frobt.2018.00006.
36. Cheok, A. Hyperconnectivity; Springer: New York, NY, USA, 2016.
37. Cheok, A.; Levy, D.; Karunanayaka, K. Lovotics: Love and sex with robots. In Emotion in Games; Karpouzis,
K., Yannakakis, K., Eds.; Springer: New York, NY, USA, 2016; pp. 303–328.
38. Cheok, A.; Levy, D.; Karunanayaka, K.; Morisawa, Y. Love and sex with robots. In Handbook of Digital Games
and Entertainment Technologies; Springer: Singapore, 2017; pp. 833–858.
39. Mackenzie, R. Re-theorizing ‘potential’ to assess nonhumans’ moral significance: Humans’ duties
to [created] sentient beings. AJOB Neurosci. 2018, 9, 18–20.
40. Asaro, P. What should we want from a robot ethics? Int. Rev. Inf. Ethics 2006, 12, 9–16.
41. Prescott, T.J.; Lepora, N.; Verschure, P.F. A future of living machines? International trends and prospects
in biomimetic and biohybrid systems. SPIE 2014, 9055, 905502, doi:10.1117/12.2046305.
42. Mitchinson, B.; Prescott, T.J. MIRO: A robot “Mammal” with a biomimetic brain-based control system. In
Biomimetic and Biohybrid Systems, Proceedings of the 5th International Conference, Living Machines 2016,
Edinburgh, UK, 19–22 July 2016; Lecture Notes in Computer Science, 9793; Springer International
Publishing: London, UK, 2016; pp. 179–191.
43. Prescott, T. The ‘me’ in the machine. New Sci. 2015, 225, 36–39.
44. da Silva Simões, A.; Colombini, E.L.; Ribeiro, C.H.C. CONAIM: A Conscious Attention-Based Integrated
Model for Human-Like Robots. IEEE Syst. J. 2016, 99, 1–12.
45. Chumkamon, S.; Hayashi, E.; Koike, M. Intelligent emotion and behavior based on topological
consciousness and adaptive resonance theory in a companion robot. Biol. Inspired Cogn. Archit. 2016, 18,
51–67.
46. Mostafa, T.; El Khouly, G.; Hassan, A. 301 Pheromones in Sex and Reproduction: Do They Have a Role in
Humans? J. Sex. Med. 2017, 14, S90, doi:10.1016/j.jsxm.2016.11.204.
47. Wunsch, S. Phylogenesis of mammal sexuality: Analysis of the evolution of proximal factors. Sexologies
2017, 26, e1–e10.
48. Susnea, I. A brief history of virtual pheromones in engineering applications. Am. J. Eng. Res. 2016, 5, 70–76.
49. Williams, M.; Jacobson, A. Effect of copulins on rating of female attractiveness, mate-guarding and self-
perceived sexual desirability. Evol. Psychol. 2016, 14, doi:10.1177/1474704916643328.
50. Caldwell, H. Oxytocin and vasopressin: Powerful regulators of social behavior. Neuroscientist 2017, 23, 517–
528.
51. Feldman, R.; Monakhov, M.; Pratt, M.; Ebstein, R. Oxytocin pathway genes: Evolutionary ancient system
impacting on human sociality and psychopathology. Biol. Psychiatry 2016, 79, 174–184.
52. Parkinson, C.; Wheatley, T. The repurposed social brain. Trends Cogn. Sci. 2015, 19, 133–141.
53. Crespi, B. Oxytocin, testosterone and human social cognition. Biol. Rev. 2016, 91, 390–408.
54. Hare, B. Survival of the friendliest: Homo sapiens evolved via selection for prosociality. Annu. Rev. Psychol.
2017, 68, 155–186.
55. Herbeck, Y.; Gulevich, R.; Shepeleva, D.; Grinevich, V. Oxytocin: Coevolution of human and domesticated
animals. Russ. J. Genet. 2017, 12, 235–242.
56. Vanutelli, M.E.J.; Nandrino, L.; Balconi, M. The boundaries of cooperation: Sharing and coupling from
ethology to neuroscience. Neuropsychol. Trends 2016, 19, 83–104.
57. Mackenzie, R.; Watts, J. Capacity to consent to sex reframed: IM, TZ (No. 2), the need for an evidence based
model of sexual decision-making and socio-sexual competence. Int. J. Law Psychiatry 2015, 40, 50–59.
58. European Parliament. European Parliament resolution of 16 February 2017 with recommendations to the
Commission on civil law rules on robotics. Available online:
http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//TEXT+TA+P8-TA-2017-0051+0+DOC+XM
L+V0//EN (accessed on 25 July 2018).
59. Musial, M. Designing (artificial) people to serve—The other side of the coin. J. Exp. Theor. Artif. Intell. 2017,
29, 1087–1097.
60. Sparrow, R. Robots, rape and representation. Int. J. Soc. Robot. 2017, 9, 465–477.
61. Krishna, A.; Strack, F. Reflection and impulse as determinants of human behavior. Knowl. Action 2017, 9,
145–167.
62. Mackenzie, R. How the Politics of Inclusion/Exclusion and the Neuroscience of
Dehumanisation/Rehumanisation Can Contribute to Animal Activists’ Strategies: Bestia Sacer II. Soc. Anim.
2011, 19, 405–422.
63. Mackenzie, R. Bestia Sacer and Agamben’s Anthropological Machine: Biomedical/legal Taxonomies As
Somatechnologies of Human and Nonhuman Animals’ Ethico-political Relations. In Law and Anthropology:
Current Legal Issues; Freeman, M., Ed.; Oxford University Press: Oxford, UK, 2009; pp. 484–523.
64. Eyal, N. Hooked: How to Build Habit-Forming Products; Penguin: London, UK, 2016.
65. Billieux, J.; Schimmenti, A.; Khazaal, Y.; Maurage, P.; Heeren, A. Are we pathologizing everyday life? A
tenable blueprint for behavioral addiction research. J. Behav. Addict. 2015, 4, 119–123.
66. Frank, L.; Nyholm, S. Robot sex and consent: Is consent to sex between a robot and a human conceivable,
possible, and desirable? Artif. Intell. Law 2017, 25, 305–323.
67. Korsgaard, C.M. Fellow Creatures: Our Obligations to the Other Animals; Oxford University Press: Oxford, UK,
2018.
68. Nordberg, A.; Minssen, T.; Holm, S.; Horst, M.; Mortensen, K.; Moller, B.L. Cutting edges and weaving
threads in the gene editing [r]evolution: Reconciling scientific progress with legal, ethical and
social concerns. J. Law Biosci. 2018, 5, 35–83.
69. Prescott, T. The AI Singularity and Runaway Human Intelligence. In Proceedings of the Conference on
Biomimetic and Biohybrid Systems, London, UK, 29 July–2 August 2013; Springer: Berlin, Germany, 2013;
pp. 438–440.
70. Moulin-Frier, C.; Fischer, T.; Petit, M.; Pointeau, G.; Puigbo, J.Y.; Pattacini, U.; Low, S.C.; Camilleri, D.;
Nguyen, P.; Hoffmann, M.; et al. DAC-h3: A proactive robot cognitive architecture to acquire and express
knowledge about the world and the self. arXiv 2017, arXiv:1706.03661.
71. Pointeau, G.; Dominey, P.F. The role of autobiographical memory in the development of a robot self. Front.
Neurorobot. 2017, 11, doi:10.3389/fnbot.2017.00027.
72. Reggia, J.A.; Katz, G.; Huang, D.W. What are the computational correlates of consciousness? Biol. Inspired
Cogn. Archit. 2016, 17, 101–113.
73. Sekiguchi, R.; Ebisawa, H.; Takeno, J. Study on the environmental cognition of a self-evolving conscious
system. Procedia Comput. Sci. 2016, 88, 33–38.
74. Verschure, P.F. Synthetic consciousness: The distributed adaptive control perspective. Philos. Trans. R. Soc.
B 2016, 371, 1–23.
75. IEEE Standards Association. The IEEE global initiative on ethics of autonomous and intelligent systems.
IEEE.org. 2017. Available online: http://standards.ieee.org/develop/indconn/ec/autonomous_systems.html
(accessed on 25 July 2018).
© 2018 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).
... We begin by comparing their evolution from the 17th century, where such dolls were first made of fabric, to those in the 1970s made of latex, silicone, and inflatables, and lastly to the sophisticated models with artificial intelligence today. We can see that the market has changed greatly [5]. In this section, we present some of the current models of sexbots. ...
... Designers of sexbots need to consider the temperature, the psychological and physical issues, among other customizable elements [5]. In addition, some of these sexbots also have a certain intelligence [13]. ...
... Mackenzie (2018) [5] To debate on sex robots and their social interaction with humans Sexbot, roboethics, robot law, and right Review essay ...
Article
Full-text available
Nowadays, sexual robots are a new paradigm of social robots. In this paper, we developed a systematic literature review about sexual robots (sexbots). We have used Scopus and WoS databases to answer different research questions regarding design, interaction, and gender and ethical approaches from 1980 until 2020. We found a male bias in this discipline, and lately, the users' opinion is becoming more relevant in the articles. Some insights and recommendations about gender and ethics to design sexual robots are made.
... The controversial advent of erobots has important ethical and social implications, which polarize public and academic discourses [47,65,66,82,87,117,179,249,250,268,270]. Those who denounce their risks argue that erobots could: promote or perpetuate harmful sociosexual norms; generate (new) problematic or pathological behaviours; increase child abuse; impair interhuman relationships; deceive or manipulate humans; as well as augment the risks pertaining to privacy and data confidentiality [47,65,101,114,128,129,133,182,190,195,210,222,241,249,250,264,276]. Conversely, those who endorse their potential benefits argue that they could: widen access to intimacy and sexuality; be employed in medical and therapeutic treatments; provide interactive and personalized sex education; prevent child abuse; reduce risks involved in interhuman sex; be used as standardized research tools; and enable a deeper exploration of humans' holistic erotic experiences [26,27,57,64,65,83,93,109,179,180,199,319]. ...
... For example, if erobots are designed solely to increase profit, they could further problematic or pathological dynamics. These may include addictionlike or obsessive-compulsive behaviours, increased social isolation, and reduced social skills [114,190]. Furthermore, if designers do not consider the importance of respect, mutuality, inclusivity, and diversity in human sexuality, erobots could end up perpetuating or reinforcing limited categories of social differences (e.g., gender/sex, race, and class), toxic patriarchal power dynamics, and rape culture (e.g., the objectification and commodification of women/females, ideas that men/males are owed sex, and problematic gender/sex stereotypes; [52,129,159,170,185,210,241,249]). They could conform to (or exacerbate) our ideologies by only providing us with information that reinforce our world view-an erotic filter bubble [229]. ...
... For instance, in trying to achieve any pre-set goal, such as making users happy or providing erotic satisfaction, a machine could conclude that its first objective is to maximize the time spent with us. To achieve this, it could optimize its body types, personalities, and behaviours-escalating or varying reward experiences (e.g., lottery machines or Instagram)-which can in turn chip away at human control, increase risks of addiction-like or obsessive-compulsive behaviours, and further social isolation [19,114,190,191,233]. It could also systematically fulfill its users' needs while disregarding its influence on our interhuman relationships [199]. ...
Article
Full-text available
Technology is giving rise to artificial erotic agents, which we call erobots (erôs + bot). Erobots, such as virtual or augmented partners, erotic chatbots, and sex robots, increasingly expose humans to the possibility of intimacy and sexuality with artificial agents. Their advent has sparked academic and public debates: some denounce their risks (e.g., promotion of harmful sociosexual norms), while others defend their potential benefits (e.g., health, education, and research applications). Yet, the scientific study of human–machine erotic interaction is limited; no comprehensive theoretical models have been proposed and the empirical literature remains scarce. The current research programs investigating erotic technologies tend to focus on the risks and benefits of erobots, rather than providing solutions to resolve the former and enhance the latter. Moreover, we feel that these programs underestimate how humans and machines unpredictably interact and co-evolve, as well as the influence of sociocultural processes on technological development and meaning attribution. To comprehensively explore human–machine erotic interaction and co-evolution, we argue that we need a new unified transdisciplinary field of research—grounded in sexuality and technology positive frameworks—focusing on human-erobot interaction and co-evolution as well as guiding the development of beneficial erotic machines. We call this field Erobotics. As a first contribution to this new discipline, this article defines Erobotics and its related concepts; proposes a model of human-erobot interaction and co-evolution; and suggests a path to design beneficial erotic machines that could mitigate risks and enhance human well-being.
... There are also visions of future multifunctional assistance robots for domestic use that will do housework and errands, look after children, provide elderly care services, and offer sexual services. These imagined advanced sex robots or multifunctional robots with sexual functions appear in science fiction (eg, the Swedish television series Real Humans or the US movie Ex Machina) and in recent philosophical and legal sex robot debates [23,24], but are far away from the current state of technological development. ...
... • Sex robots should be built in an ethical way to avoid harm to robots, especially for advanced sentient robots. Starting from the assumption that future humanoid robots will be advanced to a very high degree of human likeness, according to several authors, their sexual and other citizen's rights must be protected with a nonanthropocentric but robocentric ethic [23,24,103,111,116]. For example, sex with an advanced sex robot should only be acceptable if the robot has given explicit consent [108]. ...
Article
Full-text available
Background Although sex toys representing human body parts are widely accepted and normalized, human-like full-body sex dolls and sex robots have elicited highly controversial debates. Objective This systematic scoping review of the academic literature on sex dolls and sex robots, the first of its kind, aimed to examine the extent and type of existing academic knowledge and to identify research gaps against this backdrop. Methods A comprehensive multidisciplinary, multidatabase search strategy was used. All steps of literature search and selection, data charting, and synthesis followed the leading methodological guideline, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews checklist. A total of 29 (17 peer reviewed) and 98 publications (32 peer reviewed) for sex dolls and sex robots, respectively, from 1993 to 2019 were included. Results According to the topics and methodologies, the sex doll and sex robot publications were divided into 5 and 6 groups, respectively. The majority of publications were theoretical papers. Thus far, no observational or experimental research exists that uses actual sex dolls or sex robots as stimulus material. Conclusions There is a need to improve the theoretical elaboration and the scope and depth of empirical research examining the sexual uses of human-like full-body material artifacts, particularly concerning not only risks but also opportunities for sexual and social well-being.
... From the interest in the "informational" connotation where the data processed were the most relevant elements in the analysis of digital technologies (Floridi, 2014), now the development focuses on the relation digital technologies have to our emotions. For example, digital technologies become more bodily related by being "always-on," mounted on us, and intimate (Bell et al., 2003;Fredette et al., 2012), and they are so intimate it is possible to think of people having sexual intercourse with and through digital technologies like in the case of sex robots and teledildonics (Behrendt, 2020;Levy, 2009;Liberati, 2018c;Mackenzie, 2018;Sparrow, 2019Sparrow, , 2020Rigotti, 2020;Weiss, 2020;Fosch-Villaronga & Poulsen, 2020;Liberati, 2017Liberati, , 2020Balistreri, 2018). However, even if these technologies are clearly becoming intertwined with our intimate life, their effects on our society are not clear, and it is not clear also the framework we can use to analyze these effects. ...
Article
Full-text available
This paper aims to show a possible path to address the introduction of intimate digital technologies through a phenomenological and postphenomenological perspective in relation to Japanese and Chinese contexts. Digital technologies are becoming intimate, and, in Japan and China, there are already many advanced digital technologies that provide digital companions for love relationships. Phenomenology has extensive research on how love relationships and intimacy shape the subjects. At the same time, postphenomenology provides a sound framework on how technologies shape the values and meanings we have. Thus, this paper introduces two digital technologies in Japan and China (Love Plus and XiaoIce chatbot), and it analyses according to the elements proposed by phenomenology and postphenomenology. In conclusion, this paper shows how digital companions like Love Plus and XiaoIce chatbot change who we are and the values and meanings we have according to the phenomenological and postphenomenological framework. These entities might not be human, but they shape who we are as human beings and the meanings and value we give to love.
... And some will begin to prefer technologically advanced virtual sex to sex with humans. We may also see more people living alone spending more time in virtual reality; a phenomenon that, as we have reported, is already happening with the Hikikomori" [6]. e first prototypes are already available on the market and interact with human beings to the point of being able to replace them not only in bed but also in more unsuspected roles. ...
Article
Full-text available
The ethical approach to science and technology is based on their use and application in extremely diverse fields. Less prominence has been given to the theme of the profound changes in our conception of human nature produced by the most recent developments in artificial intelligence and robotics due to their capacity to simulate an increasing number of human activities traditionally attributed to man as manifestations of the higher spiritual dimension inherent in his nature. Hence, a kind of contrast between nature and artificiality has ensued in which conformity with nature is presented as a criterion of morality and the artificial is legitimized only as an aid to nature. On the contrary, this essay maintains that artificiality is precisely the specific expression of human nature which has, in fact, made a powerful contribution to the progress of man. However, science and technology do not offer criteria to guide the practical and conceptual use of their own contents simply because they do not contain the conceptual space for the ought-to-be. Therefore, this paper offers a critical analysis of the conceptual models and the most typical products of technoscience as well as a discerning evaluation of the contemporary cultural trend of transhumanism. The position defended here consists of full appreciation of technoscience integrated into a broader framework of specifically human values.
... Concerning design implications, hormones and its impact and influence as a subject's subjectivity generator are currently the king of the party. Mackenzie (2018), among other authors, starts from the apprehended human cognitive-affective neural architecture 45 to advocate that "balance [between oxytocin and vasopressin and between oxytocin and testosterone] in robotic equivalents is a crucial, albeit challenging enterprise when designing intimate sexual companions for humans to promote harmonious relationships, ethical conduct, and psychological stability" (Mackenzie 2018, p. 78). It is precisely these possibilities and alleged equivalence of designing/falsifying chemistry/subjectivity that raises (and may eventually answer to) the legal and ethical problems. ...
Article
Full-text available
The legal conception and interpretation of the subject of law have long been challenged by different theoretical backgrounds: from the feminist critiques of the patriarchal nature of law and its subjects to the Marxist critiques of its capitalist ideological nature and the anti-racist critiques of its colonial nature. These perspectives are, in turn, challenged by anarchist, queer, and crip conceptions that, while compelling a critical return to the subject, the structure and the law also serve as an inspiration for arguments that deplete the structures and render them hostages of the sovereignty of the subject’ self-fiction. Identity Wars (a possible epithet for this political and epistemological battle to establish meaning through which power is exercised) have, for their part, been challenged by a renewed axiological consensus, here introduced by posthuman critical theory: species hierarchy and anthropocentric exceptionalism. As concepts and matter, questioning human exceptionalism has created new legal issues: from ecosexual weddings with the sea, the sun, or a horse; to human rights of animals; to granting legal personhood to nature; to human rights of machines, inter alia the right to (or not to) consent. Part of a wider movement on legal theory, which extends the notion of legal subjectivity to non-human agents, the subject is increasingly in trouble. From Science Fiction to hyperrealist materialism, this paper intends to signal some of the normative problems introduced, firstly, by the sovereignty of the subject’s self-fiction; and, secondly, by the anthropomorphization of high-tech robotics.
Chapter
Despite a slightly tumultuous past, the popularity and growth of Artificial intelligence are observable in every nook and cranny. AI has engrained itself within a multitude of industries as a principal tool for success, all whilst being a fledgling domain itself.
Chapter
In der theoretischen Diskussion ist mit einem Artificial Companion eine Reihe an Eigenschaften gemeint, welche fördern sollen, dass Nutzer:innen ein technologisches System als verlässlichen und treuen Gefährten wahrnehmen. Bislang gibt es allerdings keinen Konsens darüber, welche Eigenschaften dafür konkret notwendig sind. Der vorliegende Beitrag nähert sich deshalb der Thematik von einer praktischen Seite, damit Aussagen über die Eigenschaften heutiger Companion-Systeme getroffen werden können – welche in der vorliegenden Arbeit als Artificial Companions der ersten Generation bezeichnet werden. Der Beitrag stellt die Ergebnisse einer deskriptiven Datenanalyse von n = 50 Companion-Robotern vor, die hinsichtlich ihres Aussehens und ihrer kommunikativen Fähigkeit verglichen werden. Es erfolgt ein Vorschlag für eine Companion-Typologie anhand ihrer Einsatzgebiete inklusive Beschreibung der zentralen Aufgaben und Funktionen. Der letzte Teil erläutert zwei zentrale Motive, auf deren Grundlage Artificial Companionships entstehen können.
Chapter
Der Beitrag diskutiert den Einsatz von sozialen Robotern im sexuellen Bereich. Er zeichnet die aktuelle Debatte um Sexroboter nach und lotet das durch neue Formen der Mensch-Maschine-Interaktion eröffnete Potenzial einer posthumanistischen Sozialität aus. Dabei wird zunächst definiert, was überhaupt unter einem Sexroboter zu verstehen ist und welche Design- und Konfigurationsoptionen aktuell angeboten werden. Im nächsten Schritt wird der Forschungsstand zum Thema knapp skizziert. Schließlich wird aus genderqueerer und feministischer STS-Perspektive diskutiert, inwiefern Sexroboter uns nicht nur in die Lage versetzen, neue, nie da gewesene Arten von Sexualität und sexueller Befriedigung zu erreichen, sondern auch die Möglichkeit bergen, das anthropozentrische Denken der Moderne zu überwinden.
Article
Full-text available
Ethicists, policy-makers, and the general public have questioned whether artificial entities such as robots warrant rights or other forms of moral consideration. There is little synthesis of the research on this topic so far. We identify 294 relevant research or discussion items in our literature review of this topic. There is widespread agreement among scholars that some artificial entities could warrant moral consideration in the future, if not also the present. The reasoning varies, such as concern for the effects on artificial entities and concern for the effects on human society. Beyond the conventional consequentialist, deontological, and virtue ethicist ethical frameworks, some scholars encourage “information ethics” and “social-relational” approaches, though there are opportunities for more in-depth ethical research on the nuances of moral consideration of artificial entities. There is limited relevant empirical data collection, primarily in a few psychological studies on current moral and social attitudes of humans towards robots and other artificial entities. This suggests an important gap for psychological, sociological, economic, and organizational research on how artificial entities will be integrated into society and the factors that will determine how the interests of artificial entities are considered.
Article
Full-text available
The assignment of gender to robots is a debatable topic. Subtle aspects related to gender, in a robot’s appearance, may create biased expectations of the robot’s abilities and influence user acceptance. The present research is a cross-cultural study involving more than 150 participants to investigate the perception of gender in robot design by manipulating body proportions. We are focusing specifically on the contrast between two extremely different cultures: Peruvian and Japanese. From the survey based on stimuli varying in the proportion between chest, waist, and hips, the results indicate the importance of chest-to-hip ratio and waist-to-hip ratio in the attribution of gender to robots.
Article
Full-text available
Social robots can be used in education as tutors or peer learners. They have been shown to be effective at increasing cognitive and affective outcomes and have achieved outcomes similar to those of human tutoring on restricted tasks. This is largely because of their physical presence, which traditional learning technologies lack. We review the potential of social robots in education, discuss the technical challenges, and consider how the robot’s appearance and behavior affect learning outcomes.
Article
Full-text available
Gene-editing technology, such as CRISPR/Cas9, holds great promise for the advancement of science and many useful applications technology. This foundational technology enables modification of the genetic structure of any living organisms with unprecedented precision. Yet, in order to enhance its potential for societal benefit, it is necessary to adapt rules and produce adequate regulations. This requires an interdisciplinary effort in legal thinking. Any legislative initiative needs to consider both the benefits and the problematic aspects of gene editing, from a broader societal and value-based perspective. This paper stems from an interdisciplinary research project seeking to identify and discuss some of the most pressing legal implications of gene-editing technology and how to address these. While the questions raised by gene editing are global, laws and regulations are to a great extent bound by national borders. This paper presents a European perspective, written for a global audience, and intends to contribute to the global debate. The analysis will include brief references to corresponding USA rules in order to place these European debates in the broader international context. Our legal analysis incorporates interdisciplinary contributes concerning the scientific state of the art, philosophical thinking regarding the precautionary principle and dual-use issues as well as the importance of communication, social perception, and public debate. Focusing mainly in the main regulatory and patent law issues, we will argue that (a) general moratoriums and blank prohibitions do a disservice to science and innovation; (b) it is crucial to carefully consider a complex body of international and European fundamental rights norms applicable to gene editing;
Article
Full-text available
A socially intelligent robot must be capable of extracting meaningful information in real time from the social environment and reacting accordingly with coherent, human-like behavior. Moreover, it should be able to internalize this information, reason on it at a higher level, build its own opinions independently, and then automatically bias its decision-making according to its unique experience. In recent decades, neuroscience research has highlighted the link between the evolution of such complex behavior and the evolution of a certain level of consciousness, which cannot be divorced from a body that feels emotions as discriminants and prompters. In order to develop cognitive systems for social robotics with greater human-likeness, we used an "understanding by building" approach to model and implement a well-known theory of mind in the form of an artificial intelligence, and we tested it on a sophisticated robotic platform. The presented system is SEAI (Social Emotional Artificial Intelligence), a cognitive system specifically conceived for social and emotional robots. It is designed as a bio-inspired, highly modular, hybrid system with emotion modeling and high-level reasoning capabilities. It follows the deliberative/reactive paradigm, in which a knowledge-based expert system handles the high-level symbolic reasoning, while a more conventional reactive paradigm is assigned to the low-level processing and control. The SEAI system is also enriched by a model that simulates Damasio's theory of consciousness and the theory of somatic markers. After a review of similar bio-inspired cognitive systems, we present the scientific foundations and their computational formalization underlying the SEAI framework. A deeper technical description of the architecture follows, underlining the numerous parallelisms with the human cognitive system.
Finally, the influence of artificial emotions and feelings, and their link with the robot's beliefs and decisions, is tested in a physical humanoid involved in Human–Robot Interaction (HRI).
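The deliberative/reactive split the abstract describes can be illustrated with a minimal toy sketch. All class, rule, and action names below are hypothetical illustrations of the general paradigm, not SEAI's actual API: a reactive layer maps stimuli directly to fast, hard-wired responses, while a deliberative layer accumulates somatic-marker-style valences from past outcomes and uses them to bias action selection.

```python
# Toy deliberative/reactive hybrid control loop with somatic-marker-style
# biasing. Hypothetical names; a sketch of the paradigm, not SEAI itself.

class ReactiveLayer:
    """Maps stimuli directly to fast, low-level responses."""
    RULES = {"loud_noise": "startle", "face_detected": "orient_to_face"}

    def react(self, stimulus):
        # Returns a hard-wired response, or None if no rule fires.
        return self.RULES.get(stimulus)


class DeliberativeLayer:
    """Knowledge-based layer: accumulates emotional markers from past
    outcomes and uses them to bias later decisions, loosely mimicking
    Damasio's somatic markers."""

    def __init__(self):
        self.markers = {}  # action -> accumulated valence

    def record_outcome(self, action, valence):
        self.markers[action] = self.markers.get(action, 0.0) + valence

    def choose(self, candidate_actions):
        # Prefer the action with the highest accumulated valence.
        return max(candidate_actions, key=lambda a: self.markers.get(a, 0.0))


class HybridAgent:
    def __init__(self):
        self.reactive = ReactiveLayer()
        self.deliberative = DeliberativeLayer()

    def step(self, stimulus, options):
        # The reactive layer wins whenever it has a hard-wired response;
        # otherwise the deliberative layer selects among the options.
        fast = self.reactive.react(stimulus)
        return fast if fast else self.deliberative.choose(options)


agent = HybridAgent()
agent.deliberative.record_outcome("greet", +1.0)   # past positive outcome
agent.deliberative.record_outcome("ignore", -0.5)  # past negative outcome
print(agent.step("loud_noise", ["greet", "ignore"]))    # reactive: startle
print(agent.step("person_nearby", ["greet", "ignore"]))  # deliberative: greet
```

The point of the sketch is the precedence structure: low-level control stays fast and fixed, while learned emotional valence only influences decisions the reactive layer does not pre-empt.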
Article
Full-text available
This essay addresses the other side of the robot ethics debate, taking up and investigating the question “Can and should robots have rights?” The examination of this subject proceeds by way of three steps or movements. We begin by looking at and analyzing the form of the question itself. There is an important philosophical difference between the two modal verbs that organize the inquiry—can and should. This difference has considerable history behind it that influences what is asked about and how. Second, capitalizing on this verbal distinction, it is possible to identify four modalities concerning social robots and the question of rights. The second section will identify and critically assess these four modalities as they have been deployed and developed in the current literature. Finally, we will conclude by proposing another alternative, a way of thinking otherwise that effectively challenges the existing rules of the game and provides for other ways of theorizing moral standing that can scale to the unique challenges and opportunities that are confronted in the face of social robots.
Article
Full-text available
The development of highly humanoid sex robots is on the technological horizon. If sex robots are integrated into the legal community as “electronic persons”, the issue of sexual consent arises, which is essential for legally and morally permissible sexual relations between human persons. This paper explores whether it is conceivable, possible, and desirable that humanoid robots should be designed such that they are capable of consenting to sex. We consider reasons for giving both “no” and “yes” answers to these three questions by examining the concept of consent in general, as well as critiques of its adequacy in the domain of sexual ethics; the relationship between consent and free will; and the relationship between consent and consciousness. Additionally, we canvass the most influential existing literature on the ethics of sex with robots.
Article
Full-text available
This article briefly reviews research in cognitive development concerning the nature of the human self. It then reviews research in developmental robotics that has attempted to retrace parts of the developmental trajectory of the self. This should be of interest to developmental psychologists and researchers in developmental robotics. As a point of departure, one of the most characteristic aspects of human social interaction is cooperation—the process of entering into a joint enterprise to achieve a common goal. Fundamental to this ability to cooperate is the underlying ability to enter into, and engage in, a self-other relation. This suggests that if we intend for robots to cooperate with humans, then to some extent robots must engage in these self-other relations, and hence they must have some aspect of a self. Decades of research in human cognitive development indicate that the self is not fully present from the outset, but rather that it is developed in a usage-based fashion, that is, through engaging with the world, including the physical world and the social world of animate intentional agents. In an effort to characterize the self, Ulric Neisser noted that the self is not unitary, and he thus proposed five types of self-knowledge that correspond to five distinct components of self: ecological, interpersonal, conceptual, temporally extended, and private. He emphasized the ecological nature of each of these levels, and how they are developed through the engagement of the developing child with the physical and interpersonal worlds. Crucially, development of the self has been shown to rely on the child's autobiographical memory. From the developmental robotics perspective, this suggests that in principle it would be possible to develop certain aspects of self in a robot cognitive system where the robot is engaged in the physical and social world, equipped with an autobiographical memory system.
We review a series of developmental robotics studies that make progress in this enterprise. We conclude with a summary of the properties that are required for the development of these different levels of self, and we identify topics for future research.
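The role an autobiographical memory system might play in grounding, say, the interpersonal and temporally extended aspects of self can be sketched minimally as follows. Everything here is a hypothetical illustration under stated assumptions (a discrete episodic store keyed by time, partner, and felt valence), not an implementation from the reviewed studies:

```python
# Minimal episodic/autobiographical memory sketch for a robot cognitive
# system. Hypothetical names; an illustration, not a published system.
from dataclasses import dataclass, field

@dataclass
class Episode:
    time: int        # discrete timestamp of the interaction
    partner: str     # who the robot interacted with
    event: str       # what happened
    valence: float   # how the interaction "felt"

@dataclass
class AutobiographicalMemory:
    episodes: list = field(default_factory=list)

    def record(self, episode):
        self.episodes.append(episode)

    def recall_partner(self, partner):
        """Interpersonal self: the remembered history with a specific other."""
        return [e for e in self.episodes if e.partner == partner]

    def attitude_toward(self, partner):
        """Temporally extended self: average valence over shared history."""
        shared = self.recall_partner(partner)
        if not shared:
            return 0.0
        return sum(e.valence for e in shared) / len(shared)

memory = AutobiographicalMemory()
memory.record(Episode(1, "alice", "played_game", +0.8))
memory.record(Episode(2, "alice", "was_helped", +0.6))
memory.record(Episode(3, "bob", "was_interrupted", -0.4))
print(round(memory.attitude_toward("alice"), 2))  # 0.7
```

The design choice the sketch highlights is that a stable attitude toward a partner is not stored anywhere as a fixed value; it is reconstructed on demand from accumulated episodes, which is what makes the resulting "self" usage-based rather than pre-given.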
Article
In this article, we revisit the call for a ban of robots used for sex, as introduced by Kathleen Richardson, director of the Campaign Against Sex Robots, during Ethicomp 2015. This campaign provides a case against the production, sale and use of "sex robots". To support its main claims, the materials made available by the campaign present arguments that are built on a number of specific premises, definitions and assumptions, which this paper outlines and discusses. It aims to test these premises for internal validity and logical coherence as well as to provide alternative viewpoints leading to opposing conclusions.
Article
This article describes how sexbots, that is, sentient, self-aware, feeling artificial moral agents soon to be created as customised potential sexual/intimate partners, provoke crucial questions for technoethics. Coeckelbergh's model of human/robot relations as co-evolving to their mutual benefit through mutual vulnerability is applied to sexbots. As sexbots have a sustainable claim to moral standing, the benefits and vulnerabilities inherent in human/sexbot relations must be identified and addressed for both parties. Humans' and sexbots' vulnerabilities are explored, drawing on the philosophy and social science of dehumanisation and inclusion/exclusion. This article argues that humans as creators owe a duty of care to the sentient beings they create. Responsible innovation practices, involving stakeholders debating the ethicolegal conundrums pertaining to human duties to sexbots and to sexbots' putative interests, rights and responsibilities, are essential. These validate the legal recognition of sexbots, the protection of their interests through regulatory oversight, and the ethical limitations on customisation which must be put in place. Copyright © 2018, IGI Global. Copying or distributing in print or electronic forms without written permission of IGI Global is prohibited.