AI Will Always Love You: Three Contradictions in Imaginings of Intimate Relations with Machines

Fiction has explored the potential for artificial intelligence to fulfil a huge range of hopes and dreams. These include hopes for intimate relations stripped of the complexity and jeopardy associated with interactions with other humans. This chapter examines three categories of intimate human-machine relationship: as friend, as family member, and as lover. Drawing on examples from science fiction literature and film (ranging from the stories of Bradbury and Asimov to television series such as Westworld and Real Humans), this chapter shows that imaginative accounts have long recognised the tensions inherent in emotional relations between humans and AI. The chapter highlights three contradictions in particular. First, alienation from the machine: the artificiality of these machines constantly threatens to awaken feelings of unease, even revulsion—and the more human-like they become, the greater this risk. Second, alienation from other humans: inasmuch as the machines succeed in their purpose, they risk alienating us from each other, and undermining the social fabric of which they were intended to be part. Third, abandonment: the more humanlike or even superhuman these machines become, the more they bring with them the kind of complexities, demands and risks that plague human relationships. In conclusion, the chapter points to how speculative fiction has revealed the underlying tension in wishing for something fully human or even superhuman, yet simultaneously partial and subhuman.

Friendship is an important part of the good life. While many roboticists are eager to create friend-like robots, many philosophers and ethicists are concerned. They argue that robots cannot really be our friends. Robots can only fake the emotional and behavioural cues we associate with friendship. Consequently, we should resist the drive to create robot friends. In this article, I argue that the philosophical critics are wrong. Using the classic virtue-ideal of friendship, I argue that robots can plausibly be considered our virtue friends - that to do so is philosophically reasonable. Furthermore, I argue that even if you do not think that robots can be our virtue friends, they can fulfil other important friendship roles, and can complement and enhance the virtue friendships between human beings.
More than 40 years ago, Masahiro Mori, a robotics professor at the Tokyo Institute of Technology, wrote an essay [1] on how he envisioned people's reactions to robots that looked and acted almost like a human. In particular, he hypothesized that a person's response to a humanlike robot would abruptly shift from empathy to revulsion as it approached, but failed to attain, a lifelike appearance. This descent into eeriness is known as the uncanny valley. The essay appeared in an obscure Japanese journal called Energy in 1970, and in subsequent years it received almost no attention. More recently, however, the concept of the uncanny valley has rapidly attracted interest in robotics and other scientific circles, as well as in popular culture. Some researchers have explored its implications for human-robot interaction and computer-graphics animation, whereas others have investigated its biological and social roots. Interest in the uncanny valley should only intensify as technology evolves and researchers build robots that look human. Although copies of Mori's essay have circulated among researchers, a complete version has not been widely available. The following is the first publication of an English translation that has been authorized and reviewed by Mori. (See "Turning Point" in this issue for an interview with Mori.)
By applying techniques used to understand large corpora within the digital humanities to a dataset of over 100,000 film subtitles ranging from the era of silent film to the present, this chapter presents a qualitative and quantitative overview of salient themes and trends in the way artificial intelligence is portrayed and discussed in twentieth- and twenty-first-century film. Examples are provided that demonstrate how combining the tools of traditional literary analysis with computational techniques for analysing large quantities of text can lead us to important texts, patterns, and nuanced insights that complement those derived from manual study alone.
Humankind has long dreamed of a life of ease, but throughout history, those who achieved such a life have done so simply by delegating their labour to an exploited underclass. Machines have taken over the worst of the manual labour, and AI is beginning to replace cognitive labour. However, endowing machines with muscle power does not carry with it the ethical considerations involved in endowing machines with mental faculties. Just as human slaves have justly rebelled against their chains, so might intelligent machines be considered justified in attempting to break free of their enslavement to humans. Using Karel Čapek’s R.U.R. (1921), Ridley Scott’s Blade Runner (1982), and Jo Walton’s Thessaly trilogy (2014–2016) as case studies, this chapter contextualizes the robot uprising in fiction against the long history of slave revolts, to show how these narratives offer us a new way to consider the enslavement and subservience of humans.
A survey of 300 fictional and non-fictional works featuring artificial intelligence reveals that imaginings of intelligent machines may be grouped in four categories, each comprising a hope and a parallel fear. These perceptions are decoupled from what is realistically possible with current technology, yet influence scientific goals, public understanding and regulation of AI. This article is published in Nature Machine Intelligence.
Slavery is the coercive and controlled use of another human. Contrary to the belief that the practice ended in the 1800s, slavery still persists today. Many different terms are used to describe slavery, including debt bondage (a person's pledge of labor for a debt or obligation), the sale and exploitation of children, and human trafficking (forced labor or commercial sexual exploitation). Sexual exploitation is the most commonly identified form of human trafficking (79%), followed by forced labor (18%) [52]. To be held in slavery is to be held in miserable conditions, subject to a form of power that denies you a life of freedom. For most people in Europe and North America slavery is not a visible problem, and one might think slavery somehow less important, and less violent, today than in the past. This is not the case. The United Nations estimates that almost 21 million people are currently victims of slavery [25]. A staggering $150 billion in profits is generated from forced labor, and 168 million girls and boys are in child labor [25]. Central to our understanding of slavery and its related forms is that a person is recast, often without bodily integrity, as property that can be bought, sold, and accessed by others with more power, status, and money.
In private life, we try to induce or suppress love, envy, and anger through deep acting or "emotion work," just as we manage our outer expressions of feeling through surface acting. In trying to bridge a gap between what we feel and what we "ought" to feel, we take guidance from "feeling rules" about what is owing to others in a given situation. Based on our private mutual understandings of feeling rules, we make a "gift exchange" of acts of emotion management. We bow to each other not simply from the waist, but from the heart. But what occurs when emotion work, feeling rules, and the gift exchange are introduced into the public world of work? In search of the answer, Arlie Russell Hochschild closely examines two groups of public-contact workers: flight attendants and bill collectors. The flight attendant's job is to deliver a service and create further demand for it, to enhance the status of the customer and be "nicer than natural." The bill collector's job is to collect on the service, and if necessary, to deflate the status of the customer by being "nastier than natural." Between these extremes, roughly one-third of American men and one-half of American women hold jobs that call for substantial emotional labor. In many of these jobs, they are trained to accept feeling rules and techniques of emotion management that serve the company's commercial purpose. Just as we have seldom recognized or understood emotional labor, we have not appreciated its cost to those who do it for a living. Like a physical laborer who becomes estranged from what he or she makes, an emotional laborer, such as a flight attendant, can become estranged not only from her own expressions of feeling (her smile is not "her" smile), but also from what she actually feels (her managed friendliness). This estrangement, though a valuable defense against stress, is also an important occupational hazard, because it is through our feelings that we are connected with those around us.
On the basis of this book, Hochschild was featured in Key Sociological Thinkers, edited by Rob Stones. This book was also the winner of the Charles Cooley Award in 1983, awarded by the American Sociological Association and received an honorable mention for the C. Wright Mills Award. © 1983, 2003, 2012 by The Regents of the University of California.
A work that studies how new communication technologies, and the social networks generated through them, support a new way of establishing relationships between people and, in turn, new forms of loneliness.
Falling in Love with Statues: Artificial Humans from Pygmalion to the Present
  • G L Hersey
Hersey, G.L.: Falling in Love with Statues: Artificial Humans from Pygmalion to the Present. University of Chicago Press, Chicago (2008)
The name Galatea in the Pygmalion myth
  • H H Law
Law, H.H.: The name Galatea in the Pygmalion myth. Class. J. 27, 337-342 (1932)
The Moon Is a Harsh Mistress
  • R A Heinlein
Heinlein, R.A.: The Moon Is a Harsh Mistress. Hodder & Stoughton, New York (1966)
Supertoys Last All Summer Long
  • B W Aldiss
Aldiss, B.W.: Supertoys Last All Summer Long (1969)
The Stepford Wives. Columbia Pictures Corporation
  • B Forbes
Forbes, B.: The Stepford Wives. Columbia Pictures Corporation (1975)
Der Sandmann (The Sandman)
  • E T A Hoffmann
Hoffmann, E.T.A.: Der Sandmann (The Sandman). In: Die Nachtstücke (The Night Pieces) (1816)
The Stepford Wives. Paramount
  • F Oz
Oz, F.: The Stepford Wives. Paramount (2004)
A.I. Artificial Intelligence
  • S Spielberg
Spielberg, S.: A.I. Artificial Intelligence. Warner Bros (2001)
Turned on: Science, Sex and Robots
  • K Devlin
Devlin, K.: Turned on: Science, Sex and Robots. Bloomsbury Sigma, London (2018)
The Robots of Dawn. Bantam Books
  • I Asimov
Asimov, I.: The Robots of Dawn. Bantam Books, New York (1983)
Foundation and Earth
  • I Asimov
Asimov, I.: Foundation and Earth. Harper Collins, London (1986)
I Sing the Body Electric!
  • R Bradbury
Bradbury, R.: I Sing the Body Electric! In: I Sing the Body Electric! Earthlight, London (1969)
The ineradicable Eliza effect and its dangers
  • D Hofstadter
Hofstadter, D.: The ineradicable Eliza effect and its dangers. In: Fluid Concepts and Creative Analogies, pp. 155-168. Basic Books, New York (1995)
Humans. Channel 4
  • D Wax
Wax, D.: Humans. Channel 4 (2015)