Intermediaries: Reflections on virtual humans, gender, and the Uncanny Valley

Abstract

Embodied interface agents are designed to ease the use of technology. Furthermore, they present one possible solution for future interaction scenarios beyond the desktop metaphor. Trust and believability play an important role in the relationship between user and the virtual counterpart. In order to reach this goal, a high degree of anthropomorphism in appearance and behavior of the artifact is pursued. According to the notion of the Uncanny Valley, however, this actually may have quite the opposite effect. This article provides an analysis of the Uncanny Valley effect from a cultural and gender studies perspective. It invites readers to take a closer look at the narratives that influence the production of anthropomorphic artifacts. The article starts with a short introduction of the idea of the Uncanny Valley and gives a brief overview of current artifacts. Following this, a semiotic view on computer science is proposed, which in a further step serves as an epistemological grounding for a gender-critical rereading of the Turing test. This perspective will be supported by analyzing a classic story of user and artifact—E.T.A. Hoffmann’s narration of Olimpia. Finally, the special character of anthropomorphic artifacts is discussed by taking Freud’s concept of “Das Unheimliche”, as well as theories of identity formation into consideration, closing with a plea for a more diverse artifact production.
ORIGINAL ARTICLE
Claude Draude
Received: 8 December 2009 / Accepted: 1 December 2010
Springer-Verlag London Limited 2010
Keywords Uncanny Valley · Turing test · Anthropomorphism · Gender · Psychoanalysis · Human–computer interaction · Theory · Cultural studies
1 Valleys and gaps
1.1 Between life and death
In 1970, roboticist Masahiro Mori published a theory on
how humans react emotionally to artificial beings (Mori
1970). According to Mori, the role model of robotics is the
human. In a graphic, he links the trustworthiness of the
artifact to its human resemblance.
As Fig. 1 shows, human likeness evokes trust only up to a certain point. If the robot comes very close to appearing human, but of course is not quite the real thing, minor lapses will produce irritation. On its way to the peak of humanness, the robot falls into the depths of the Uncanny Valley. The starting point for Mori's considerations is industrial robots, which simulate certain human actions but not human appearance. In addition, he differentiates between mobile and immobile objects. The ability to move autonomously especially contributes to the lifelikeness of the artifact, but it also adds to its potential creepiness. Most interestingly, the Uncanny Valley addresses matters of life and death. Even scarier than those who actually are dead are the beings situated between the two discrete states: in the abyss, zombies and other undead creatures lurk; the deepest point of the valley is inhabited by those who are neither dead nor alive. Mori illustrates this ambiguity with the example of a prosthetic hand. If the artifact looks like a healthy human hand but feels cold and alien when touched, it may be experienced as slightly disturbing at the least and as horrifying at the worst. The prosthetic hand can be unsettling precisely because it invokes an encounter with the living dead. Accordingly, the uncanny is triggered by the discrepancy between looking at and touching the object. A further-reaching discussion of the Uncanny Valley effect is lacking in Mori's paper. Still, he wonders: "Why do we humans have such a feeling of strangeness? Is this necessary?" (ibid.). As a roboticist, his perspective is application-oriented. For the design process of anthropomorphic robots, he advises aiming for the first peak shown in the graphic, but no further. This means that the design of the artificial being accepts a cutback in lifelikeness but avoids stumbling into the Uncanny Valley.

C. Draude, Socio-technical system design & gender, Department of Computer Science, University of Bremen, Bremen, Germany. e-mail: cdraude@tzi.de

AI & Soc. DOI 10.1007/s00146-010-0312-4
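Mori's graph relates human likeness to felt affinity: a climb to a first peak, a sharp dip (the valley) near high likeness, and a recovery toward the healthy-person level. As a purely illustrative sketch of that shape (the functional form and every parameter below are invented for illustration; this is not Mori's data), the relation can be written down numerically:

```python
import numpy as np

def affinity(likeness):
    """Illustrative shape of the Uncanny Valley curve.

    'likeness' runs from 0.0 (clearly mechanical) to 1.0
    (healthy person). The three terms are invented for
    illustration only: a gentle climb to a first peak, a sharp
    dip near high likeness, and a recovery at full likeness.
    """
    likeness = np.asarray(likeness, dtype=float)
    rise = np.sin(likeness * np.pi * 0.55)
    valley = -2.2 * np.exp(-((likeness - 0.85) ** 2) / 0.004)
    recovery = 1.6 * np.clip(likeness - 0.92, 0.0, None) / 0.08
    return rise + valley + recovery

# Moderate likeness sits on the first peak; near-full likeness
# falls into the valley; full likeness recovers.
print(affinity(0.7), affinity(0.85), affinity(1.0))
```

Mori's design advice, in these terms, is to settle for the first peak (moderate likeness, positive affinity) rather than approach full likeness and risk the dip.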
1.2 Material-semiotic embodiments: closing the gap?
1.2.1 Robots and virtual humans
Mori's concept has been discussed controversially; it has been considered non-scientific (Ferber 2003) and questionable (Bartneck 2007), or has served as inspiration (MacDorman 2005). But even if not addressed explicitly, the Uncanny (Freud 1963) plays a role when it comes to the design of artificial beings. It serves as a nodal point for the acceptance and the overall impact of artificial beings, such as humanoid robots, embodied interface agents, computer game figures or avatars. Human characters in animation films, for example, are often considered to fall into the Uncanny Valley when they are designed to achieve a very realistic appearance.[1] Successful movies, in contrast, tend to employ features that are more cartoon-like in order to avoid the effect.[2] Instead of aiming at a copy of the real world, an original aesthetic gets created. In social robotics, there exists a variety of possible embodiment forms. When it comes to the Uncanny Valley, especially the Actroids,[3] lifelike humanoid robots that are designed to explore and challenge the effect, are worth mentioning. With their silicon body and respiratory sounds, they try to achieve what is considered the healthy person status in Mori's picture. The doppelgänger status adds to their uncanniness and provokes ethical questions on the cloning of humans. Other roboticists, aware of it or not, follow Mori's dictum. The MIT humanoid robotics group does not build artifacts that mirror human appearance,[4] and Honda's humanoids are covered by space suits.[5] Interestingly, an attribution to one gender is very obvious in the case of the Actroids, whereas the MIT group seeks to avoid a gendering of the artifact.[6]

When it comes to the design of embodied interface agents, no such diversity can be found. These Virtual Humans[7] (Magnenat-Thalmann 2004) aspire toward the healthy person status as well. Here, the simulation of lifelike human behavior and appearance is the goal. Just like social robots, Virtual Humans should possess a high degree of autonomy, they should be proactive, and they should display emotional artificial intelligence. All this is considered to lead to behavior that is verbally as well as non-verbally convincing. Scenarios that employ Virtual Humans favor the concept of a shared space, a mixed reality (Kopp 2003). Just as in the story of Alice in "Through the Looking-Glass" (Carroll 1992), where the mirror serves as an interface that opens up a spatial dimension as well as an imaginary place, the conceptualization of Virtual Humans is driven by narratives that interweave human and non-human actors in a collective environment. Virtual reality technologies mean that the mirror or screen becomes unnoticeable and eventually disappears completely. New technological scenarios go far beyond the idea of the personal computer and the desktop metaphor.[8]

Fig. 1 The Uncanny Valley. Adapted from: http://en.wikipedia.org/wiki/File:Mori_Uncanny_Valley.svg

[1] Cf. discussions on: 'The Polar Express' (Robert Zemeckis, USA, 2004). http://wardomatic.blogspot.com/2004/12/polar-express-virtual-train-wreck_18.html. Accessed 1 May 2010.
[2] For example 'Shrek' (Andrew Adamson, Vicky Jenson, USA, 2001).
[3] http://www.ed.ams.eng.osaka-u.ac.jp/index.en.html. http://www.ed.ams.eng.osaka-u.ac.jp/research/0007/. Accessed 1 May 2010.
[4] http://www.ai.mit.edu/projects/humanoid-robotics-group. Accessed 30 Jan 2010.
[5] http://www.honda-robots.com/english/html/p3/frameset2.html. Accessed 13 Jan 2010.
[6] Cf. MIT Humanoid Robotics Group. FAQs http://www.ai.mit.edu/projects/humanoid-robotics-group. Accessed 13 Jan 2010.
[7] I use Virtual Human as a collective term for embodied conversational agents, personal service assistants, digital substitutes etc. I find the term Virtual Human especially challenging and interesting because it links the virtual and the human sphere rather than separating them.
[8] Cf. the "Universal Fan Fest" project, which is part of Japan's bid submitted to football's world governing body FIFA. http://www.google.com/hostednews/afp/article/ALeqM5gNVZsxBSbgXx268O16flQfqXOs_w. Accessed 26 May 2010.
[9] Translation: "Beings of light".

Software agents literally are Lichtgestalten.[9] In contrast to robots, they cannot move through physical space. It is
precisely their on-screen or projected visual form of embodiment that seems to free them from the constraints that come with having a material body. As stated above, Mori names the discrepancy between looking at and touching the artifact as one major source of irritation. With interface agents, touching is impossible: a human cannot shake a Virtual Human's hand.[10] That these artifacts nevertheless are viewed as valid interaction partners can be regarded as a shift in the relation between the visual and the haptic senses. And this may be read against the background of a broader sociocultural re-conceptualization, where new technology and media practices turn the material body into a "visual medium" (Balsamo 2003). Nowadays, Mori's example of the prosthetic hand falls short when it comes to explaining the potential uncanny effect of Virtual Humans. And because it is not this gap between look and feel alone that produces disturbing artifacts, further considerations have to be taken into account.
1.2.2 Semiotics: interface design as a process of sign mediation
In a way, Virtual Humans are like ghosts. The term avatar, with its religious origin,[11] highlights the transcendent nature of the virtual doppelgänger. The adoption of the term to denominate programmers or user representatives in forums, games, and online worlds is not a coincidence. Just like the religious concept, the virtual counterpart presents a dematerialized form of embodiment. The figuration avatar speaks of the wish to overcome the restraints of the physical world; it exemplifies the desire to leave and beat the meat, as it is called in cyberpunk fiction. Of course, the goal of embodied interface agent research is not to construct metaphysical devices, but to make computer usage easier. And anthropomorphism is used to reach a broader bandwidth in the interaction: the human body, or here its simulation, serves as a medium that is able to produce a direct and more intuitive form of information exchange.

The special character of Virtual Humans becomes clearer when their origin is reconsidered. As an intermediary between the human and the abstract levels of computing technology, an interface agent needs to address both worlds. And because organic life and computers do not operate on the same basis, there need to be modes of translation or transformation. These modes follow the logic of "the translation of the world into a problem of coding" (Haraway 1991), which is due to the character of the computer as a "semiotic machine" (Nadin 2007).
This is not just of importance when it comes to designing interfaces. The shift is constitutive for the whole area of the techno- and life-sciences and must be viewed as a general sociotechnical transformation. The noteworthy thing about computers is that they do not process material objects as other machines might do; they process semiotic representations: descriptions of objects, bodies, environments etc. I simplify this point here and do not cover the full cycle of abstraction and re-modeling that needs to take place, comprising steps like formalization, standardization, and executability (cf. Nake 1993). Nevertheless, I want to stress the special character of the "algorithmic sign" (Nake 2001), a sign which simultaneously gets interpreted by the computer and by the user.
The computer and the human participate in an ongoing process of sign/signal exchange and interpretation/processing. In current interface scenarios, the computer screen, mouse, and keyboard play the central role. The sign or symbol on the screen is to be interpreted by the user; likewise, the user manipulates computational objects following the executive character of the algorithmic sign. Interface design in this sense means organizing the process of sign mediation in such a way that the interpretative activity of the user corresponds with the functioning principle of the computer. In the cycle of abstraction mentioned above, the algorithmic sign is stripped of its context in order to become computable. But, on the other hand, the sign has to resurface in a way that is comprehensible for the human. This double nature is the challenge software designers have to face. The computer's language in this picture appears to be precise, rule-oriented, and non-ambiguous, and that of the human as quite the opposite. For the human, a sign is relational and complex; for the computer, the signal is a state. Following the development of interface solutions throughout the years, the crucial point seems to have been either to "move the system closer to the user" or to "move the user closer to the system" (Norman and Draper 1986). Simply put, from a semiotic point of view, the question is whether the signs of and on the interface are organized in a way that the user experiences them as being further away from, or closer to, the computational basis. The computational basis in this discourse is set as abstract and difficult to understand; it presents an area for experts, not the everyday user. And that is precisely because signal processing appears to be context-free, which also is a question of its materiality: "[…] an electronic signal does not have a singular identity—a particular state qualitatively different from all other possible states. […] In contrast to a material object, the electronic signal is essentially mutable" (Manovich 2001).

[10] Even though this experience, too, might change. Cf. the tangible hologram projector, which was presented at SIGGRAPH 2009. http://www.nextnature.net/2009/08/tangible-hologram-projector/. Accessed 26 May 2010.
[11] Avatar is taken from Hinduism; in its religious context, 'avatar' is used for describing the human or animal form of embodiment of a god, or a godly quality, after descending from heaven to earth. She or he may emerge in different places at the same time. Therefore, the avatar describes a form of representation that is not bound to the rules of physical reality. Instead, the avatar belongs to a meta-reality, where death and pain have no meaning.

According to Donna Haraway,
artificial intelligence research is noted for an epistemological shift, a shift where "organisms and machines alike were repositioned on the same ontological level, where attention was riveted on semiosis, or the process by which something functioned as a sign" (Haraway 1997). This comes as no surprise given the underlying principles of computer science. Here, it is essential to note that reduction is only one way of characterizing processes of abstraction. Simultaneously, a kind of doubling effect takes place, inherent to the procedure of semioticizing. Simply put, you create a new world with language. Thus, the very principles of computer science encourage procreation (Nake 1993). Especially with artificial intelligence technologies, the constructive character of language becomes visible. The algorithmic sign obtains a circulating and relational character; it is derived from the world, but it has formative effects as well. From this perspective, the Virtual Human almost literally re-sensualizes abstract technology by providing it with a "Zeichenhaut": in order to become computable, all matters must grow a skin (German: Haut) of signs (German: Zeichen) (ibid.). Anthropomorphic agents present a very interesting solution for the mediation process that takes place at the interface. With their body made out of signs, they are constructed to close the gap between humans and machines. Or, put differently, they should heal the split that the hyphen in human–computer interaction symbolizes. This oscillating, in-between status is reminiscent of ghosts, in the way that a dematerialized body emerges that lives in both worlds: that of the living humans and that of the dead (machines).

The relation between the sign and the body is of rather far-reaching importance, as the following rereading of the experimental setting of the Turing test proposes.
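The double nature of the algorithmic sign described above can be made concrete with a deliberately tiny toy example (my own illustration, not drawn from the semiotics literature cited here): one and the same entity exists as a determinate machine state and as a humanly interpretable sign, and interface design organizes the passage between the two.

```python
# One entity, two readings: for the computer a determinate
# signal/state, for the human a relational, interpretable sign.
state = 0b01000001            # machine side: a bit pattern, a state
sign = chr(state)             # human side: the letter 'A'

def render(states):
    """One direction of mediation: computational states resurface
    as signs the user can interpret."""
    return "".join(chr(s) for s in states)

def interpret(text):
    """The other direction: the user's signs are stripped of
    context and become computable states again."""
    return [ord(c) for c in text]

# The mediation cycle round-trips without loss on the signal side,
# though the human-side meaning never enters the machine.
assert interpret(render([72, 105])) == [72, 105]
```

The round-trip only preserves the signal; everything the human reads into the sign remains outside the computation, which is the asymmetry the paragraph above describes.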
2 Reconsidering the Turing test
In the research field of embodied interface agents, a successful scenario is characterized by the Virtual Human's ability to pass as a believable interaction partner. With the simulation of human appearance and behavior, the sociocultural world finds its way into interface design.[12] The fact that humans need to trust their virtual counterpart and feel comfortable with it (Ruttkay 2004) brings up the topic of the Uncanny Valley effect, even if most of the time it is not discussed explicitly. Under no circumstances should the artifact arouse uncanny or unsettling feelings. In the research field, a high level of trust and believability is interlinked with the goal of designing the Virtual Human as lifelike as possible. Even if not addressed as such, the technological mirroring of the human always invokes and recites a web of identity-establishing categories like gender and interdependent markers (ethnicity, cultural background, age, sexual orientation etc.). Not surprisingly, a believable virtual doppelgänger is linked to a believable performance of gender. For the construction of anthropomorphic interface agents, it has even been stated that transgressing the human–machine boundary seems less threatening than transgressing the cultural order of gender (Bath 2002; Lübke 2005). Or, put differently, it seems more acceptable to mix artificial and real life than to question heteronormative[13] gender relations. In the following, I want to discuss this finding and explain how it relates to the Uncanny Valley effect.
2.1 The gender imitation game
Interestingly, it is one of the most classic papers of artificial intelligence research that interweaves the human–machine boundary with the cultural order of the two genders. The Turing test, proposed in 1950, challenges the ability of a computer to engage in human-like conversation. While various critics analyze the notion of machine and intelligence Turing develops (Searle 1984; Weizenbaum 1983), others (Hayles 1999a, b) have stated that the gender relevance of this "founding narrative of artificial intelligence and cybernetics" (Bath 2002, p. 85)[14] mostly gets neglected when the test is mentioned today. In the first version of the paper "Computing machinery and intelligence", Alan Turing starts with inventing the "Imitation Game", which "is played with three people, a man (A), a woman (B), and an interrogator (C) who may be of either sex. The interrogator stays in a room apart from the other two. The object of the game for the interrogator is to determine which of the other two is the man and which is the woman. He knows them by labels X and Y, and at the end of the game, he says either 'X is A and Y is B' or 'X is B and Y is A'" (Turing 1950). Thus, before Turing develops a scenario for human–machine interaction, he invents a gender imitation game, in which different roles are attributed to each gender. The role of the woman is to be of assistance to the interrogator, and Turing suggests that she should do that by being truthful. At the same time, it becomes clear that this as well may cause confusion, because the man might equally claim to be the woman. So, in the course of the game, in fact both players try to convince the interrogator that they are the woman. Turing then suggests replacing the original question, "Can machines think?", with the question, "What will happen when a machine takes the part of A in this game?". According to this, the imitation of the woman by the man may be replaced by the imitation of the woman by the machine. By doing so, the test produces a gender-biased scenario, but it also introduces the notion of "doing gender" (Butler 2004), of gender as a performance rather than a fixed, given state.[15] First, Turing suggests that a man may transgress his original gender attribution, before, in a second step, he links this to the overcoming of the human–machine boundary. In order to understand the impact of the test, it is important to consider how Turing arrives at this intersection of gender/machine performance. Here, the character of the computer as a semiotic machine, and the relation between materiality and the (algorithmic) sign, plays a crucial role.

[12] Cf. for example Creating bonds with humanoids. AAMAS 2005 Workshop. http://www.iut.univ-paris8.fr/~pelachaud/AAMAS05. Accessed 13 Jan 2010.
[13] The term heteronormativity questions heterosexuality as a dominant normative setting, which excludes other sexual orientations, lifestyles and identity concepts.
[14] Translation CD.
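The setup Turing describes can be put schematically (the class and function names below are my own, purely illustrative sketch, not Turing's notation): what matters is that the interrogator only ever handles labels and typed text, never bodies.

```python
from dataclasses import dataclass

@dataclass
class Player:
    label: str      # 'X' or 'Y': all the interrogator ever sees
    identity: str   # 'man', 'woman' or 'machine': hidden from C
    claim: str      # what the player asserts over the typewriter

def imitation_game(a_identity):
    """Schematic sketch of Turing's 1950 setup. In the original
    game A (a man) and B (a woman) both claim to be the woman;
    Turing's substitution puts a machine in A's place. Answers
    reach the interrogator only as typed text under labels X and
    Y, so the body cannot give the identity away."""
    a = Player("X", a_identity, "I am the woman")
    b = Player("Y", "woman", "I am the woman")
    # The interrogator's view: labels and typed claims only.
    return {p.label: p.claim for p in (a, b)}

print(imitation_game("man"))      # the gender imitation game
print(imitation_game("machine"))  # Turing's substitution
```

From the interrogator's side the two runs are indistinguishable, which is exactly the decoupling of sign and body the next paragraph discusses.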
With the gender imitation game, Turing suggests a split between the human body and the sign. He describes an experiment in which references to the human body should be eliminated as far as possible. The answers in the game must be delivered via typewriter, because handwriting is too close to the human body and might be a giveaway. The gendered coding of the human voice would equally pose a threat to the success of the game. In the test setting, it is the corporeality of the embodiment that threatens to reveal which player is human and which is the machine, just as it reveals, in the original imitation game, which player is the woman and which is the man. In other words, according to the Turing test, the sign, as in the typewritten language, is treated as freed from the connotations, restraints, and limits an embodied existence brings along. In the course of the test, embodiment can mean either the physical materiality of the machine or the human body. It is this decoupling of the sign and the human body which makes it possible to attribute a rather radical, subversive potential to the 1950s Turing test. As I have stated above, the test is gender-biased, and from a historico-cultural perspective, it is no coincidence that the female embodiment and the machine performance superimpose.[16] Nevertheless, the test does introduce a certain form of gender queering acted out by the man.[17] Following this, the test suggests that the heteronormative gender order is a symbolic order all along.

Or, put differently: "This construction necessarily makes the subject into a cyborg, for the enacted and represented bodies are brought into conjunction through the technology that connects them. If you distinguish correctly which is the man and which the woman, you in effect reunite the enacted and the represented bodies into a single gender identity. The very existence of the test, however, implies that you may also make the wrong choice […] What the Turing test 'proves' is that the overlay between the enacted and the represented bodies is no longer a natural inevitability but a contingent production, mediated by a technology that has become so entwined with the production of identity that it can no longer meaningfully be separated from the human subject" (Hayles 1999a, b, p. xiii). In early cyberfeminist discourse (cf. Stone 1991), exactly this potential of new technology, namely the potential to subvert common gender codes by disarranging naturalized assumptions about bodies and identities, has been welcomed. The deconstructive possibilities that the virtual mirror provides, on the other hand, may be experienced as disturbing and thus allow a deeper insight into what is happening at the borders of the Uncanny Valley.
2.2 A face-to-face Turing test
To sum up, the possibly uncanny artifacts of artificial intelligence research point at a provocative connection between the gender order and computer science's basic principles. At first glance, the situation seems paradoxical: the logic of computing translates the human body into a construct, and this move could serve as an entry point for the deconstruction of stereotypical identity concepts. With end products like the Virtual Human, however, the idea of a lifelike human copy gets favored. Just as in Mori's graph, anthropomorphic artificial beings seek to gain the status of a healthy person. And in effect, this goal leads to an idealized, overconformed image of the human rather than to the construction of diverse, flexible forms of virtual embodiment. With the Virtual Human, a mostly unquestioned state of naturalness is pursued. And precisely this naturalizing effect of the artifact is used to mask the working modes of the underlying technological device (von Braun 2001, p. 103). It is important to keep in mind that the setting of the Turing test gets established in reference to the cultural gender order, but that it still introduces gender as a performance and therefore disrupts the nature–culture dichotomy. Turing reaches this point by freeing the scenario from the constraints of embodiment. Turing made it clear that "no engineer or chemist claims to be able to produce a material that is indistinguishable from the human skin. It is possible that at some time this might be done, but even supposing this invention available, we should feel there was little point in trying to make a thinking machine more human by dressing it up in such artificial flesh" (Turing 1950).
[15] This does not mean that one's gendered existence is a simple performative act. The concept describes the rather complex productive power of discursive reality.
[16] I cannot exemplify this point of the female embodiment here. For a thorough analysis see Kormann (2006).
[17] This is even more interesting against the background of Turing's life, his sexual orientation and the sufferings he had to endure because of it. Cf. Hodges (1992).
With social robots and Virtual Humans, however, the goal is exactly to bring embodiment back into the picture. For this, the original setting of the Turing test changes into a face-to-face situation, and thus an important epistemological shift takes place. Effectively, now not only the output but the body itself should be able to trick the audience. The "artificial flesh" in which the Virtual Humans are "dressed up" is, in appearance and behavior, always gendered artificial flesh.

And precisely at this point, the uncanny re-enters the stage, as the following remark by Justine Cassell shows: "One way to think about the problem[18] that we face is to imagine that we succeed beyond our wildest dreams in building a computer that can carry on a face-to-face Turing test. That is, imagine a panel of judges challenged to determine which socialite was a real live young woman and which was an automaton (as in Hoffmann's 'The Sandman'). Or, rather, perhaps to judge which screen was a part of a video conferencing setup, displaying an autonomous embodied conversational agent running on a computer. In order to win at the Turing test, what underlying models of human conversation would we need to implement, and what surface behaviors would our embodied conversational agent need to display?" (Cassell 2000, p. 2). In this new, adapted version of the Turing test, the uncanny emerges in reference to E.T.A. Hoffmann's famous story "Der Sandmann" (Hoffmann 1994/1817). In contrast to more common virtual forms of embodiment like computer game characters or avatars, the Virtual Human is conceptualized as an autonomous interaction partner, and the artifacts are currently not integrated into everyday environments. To illustrate their potential, examples from films and literature can often be found in the research material. Using "Der Sandmann" as a vision not only draws attention to the gendered implications of the human–machine boundary, but also points at the possible uncanniness of the artificial being. Most of the time, the dystopian threads of science fiction are neglected when it is used as an example. "Der Sandmann", especially, produces a picture that is not a very uplifting one. Why is it, then, that Cassell recites this romantic story, in which Nathanael, the user of the machine Olimpia, dies in the end and the artifact gets dismantled? What can be learned from this for a broader conception of human–humanoid interaction?
3 Virtually gender trouble
3.1 The case of Olimpia
Translated into the area of computer science, ‘‘Der Sand-
mann’’ tells a story of user and artifact. In the narration, the
male protagonist Nathanael gets frustrated with his fiancee
Clara, mainly because she rejects the flow of his ongoing
poetic recitations. He encounters the artificial being
Olimpia and falls in love with her. In the course of the
novel, the roles of the real live young woman Clara and that
of the automaton Olimpia transpose. Exactly as it is
envisioned in the Turing test, the woman gets replaced by
the machine. Subsequently, Nathanael experiences Olimpia
as warm and caring, whereas the character of Clara, for
him, reverses. But this only happens for Nathanael. Olim-
pia, who in the perspective of all the others in the story
remains cold and machine-like, serves as a projection space
for him. She truly represents a desiring machine.
19
When it
comes to the encounter between Nathanael and Olimpia, it
is his agency that animates the object. The fact that his lips
spread warmth to hers, that the spark of his eyes activate
hers, is noteworthy for the field of human–computer
interaction: the ability of the user to construct a meaningful
scenario should not be underestimated. For a short
moment, even Nathanael experiences the uncanny effect
that Mori describes for the prosthetic hand, but quickly he
manages to overcome the Uncanny Valley: ‘‘Olympia’s
hand was as cold as ice; he felt a horrible deathly chill
thrilling through him. He looked into her eyes, which
beamed back full of love and desire, and at the same time,
it seemed as though her pulse began to beat and her life’s
blood to flow into her cold hand’’.
20
According to this, the
human–machine interaction in this story gets established
and stabilized via acting out a heterosexual relationship.
Olimpia’s passing of the Turing test depends on whether her gender performance is convincing enough to superimpose the machine character. As stated above, Olimpia passes only in relation to Nathanael; all the others experience her as uncanny. Siegmund, Nathanael’s friend, is extremely worried and voices his concerns about Olimpia: ‘‘Nevertheless, it is strange that many of us think much the same about Olympia. To us—pray do not take it ill, brother—she appears singularly stiff and soulless. Her shape is well proportioned—so is her face—that is true! She might pass for beautiful if her glance were not so utterly without a ray of life—without the power of vision. Her pace is strangely regular; every movement seems to depend on some wound-up clockwork. Her playing and her singing keep the same unpleasantly correct and spiritless time as a musical box, and the same may be said of her dancing. We find your Olympia quite uncanny and prefer to have
nothing to do with her. She seems to act like a living being and yet has some strange peculiarity of her own’’.[21] Hence, Olimpia’s computability and rule-orientation do not simply make her boring and predictable; she falls into the Uncanny Valley. Olimpia is accused of merely pretending to be a lifelike being, which here equals: she merely pretends to be a woman.

[18] She is referring to the problem of human–computer interaction.
[19] Cf. the cyberpunk novel ‘Idoru’, in which the virtual being Rei Toei is an ‘‘aggregate of subjective desire’’ (Gibson 1996).
[20] Hoffmann 1994/1817, p. 37. Here: English translation by John Oxenford. http://www.fln.vcu.edu/hoffmann/sand_e.html. Accessed 15 Feb 2010.

AI & Soc
123
In Sherry Turkle’s work on the computer as a ‘‘Second Self’’, it is the machine origin in particular that renders the anthropomorphic artifact uncanny. She states: ‘‘A being that is not born of a mother, that does not feel the vulnerability of childhood, a being that does not know sexuality or anticipate death, this being is alien’’ (Turkle 1984, p. 311). And indeed, science fiction narratives are full of lost beings that search for some kind of belonging, which most of the time results in a quest for proof of their own genealogical identity.[22] Now, I am far from making this point in order to support oppositions of natural origins in contrast to artificial ones. I am making it to stress that the entrance into the human world is interwoven with the cultural order of the two genders, and to point at the way the boundary between human and artifact usually gets drawn. On the one side, there are organic heterosexual reproduction, vulnerability, fear of death, the finiteness of life, which define humanity. Beings like Olimpia, on the other side, hold the power to transgress this ‘‘life cycle’’ (ibid.), but they pay for this with the risk of appearing non-human, uncanny and alien.
3.2 The Uncanny revisited
In order to unravel the entanglement that the Uncanny
Valley effect, the cultural order of gender and the human-
artifact relation produce, several threads can be taken up.
For example, in an article on Digital Beauties, Karin Esders states that it is the virtual being’s lack of being traceable back to a material body that makes it uncanny (Esders 2007). For her, the mostly overconforming appearance of Virtual Humans derives from the missing ‘‘material reference and bodily distinctiveness’’ (ibid., p. 101)[23] that real human subjects inevitably possess—and which holds a potential to induce moments of resistance and leads to the variety of embodiment forms and identity concepts the real world comprises. This again links the uncanny to the origin of the artifact, and Esders’ findings also raise questions about the role of the material and the semiotic.
According to Mori, humanoids fall into the Uncanny
Valley if they reach a high degree of human likeness but
still produce minor lapses. They are somehow not quite
there yet. Hence, ‘‘virtual beings embody a state of ‘as well as’ and of ‘neither–nor’’’ (Esders 2007, p. 111)[24], and this not only points at the potential to recode and transgress what is considered human, but also at disturbances in the realm of gender.
In his classic essay on the uncanny effects of ‘‘Der Sandmann’’, Sigmund Freud defines ‘‘the uncanny’’ as ‘‘that class of the frightening which leads back to what is known of old and long familiar’’ (Freud 1963, p. 46).[25] The uncanny in this view is something which has been repressed and then re-enters the stage. Now, in Mori’s overview, the undead appear to be even more frightening than the dead corpse. One cannot help but wonder why that is. It is understandable that humans fear death, but why is the state between life and death so scary?
Against the Freudian background, the question arises as to what it actually is that comes back to haunt the human in the form of Olimpia—or the Virtual Human. In a rereading of Freud’s article, Hélène Cixous points out that Freud actually marginalizes the meaning of Olimpia and rather focuses on Nathanael. According to her, however, the key to understanding the uncanny lies in Olimpia’s role as a hybrid and intermediary: ‘‘It is the between that is tainted with strangeness. Everything remains to be said on the subject of the Ghost and the ambiguity of the Return, for what renders it intolerable is not so much that it is an announcement of death nor even the proof that death exists, since this Ghost announces and proves nothing more than his return. What is intolerable is that the Ghost erases the limit which exists between two states, neither alive nor dead; passing through, the dead man returns in the manner of the Repressed. […] In the end, death is never anything more than the disturbance of the limits. […] Olympia is not inanimate. The strange power of death moves in the realm of life as the Unheimliche in the Heimliche, as the void fills up the lack’’ (Cixous 1976, p. 534).
The positioning of technological artifacts between two states, their being ‘‘neither flesh nor fowl’’ (Akrich 1991), adds to the ghost-like quality that characterizes their uncanniness. In Mori’s valley, the (un)dead are gathering. In a broadened conception, this ‘‘immense system of death’’ represents the abject, the outcast, the monstrous—in short, that which threatens human identity while being on its way to gaining a valid identity status itself. Following Judith Butler (and I have noted this point earlier in reference to science fiction narratives), obtaining an intelligible form of subjectivity goes hand in hand with the heteronormative ordering system (Butler 1990). For the production of the uncanniness of Virtual Humans, especially the
[21] Ibid., p. 40.
[22] For example, ‘‘A.I.—Artificial Intelligence’’ (Steven Spielberg, USA, 2001).
[23] Translation by CD.
[24] Translation by CD.
[25] Translation: http://www.rae.com.pt/Freud1.pdf. Accessed 13 Jan 2010.
interconnection of gender and melancholia is of interest (Butler 1997). According to Butler, it is crucial to note that when it comes to the formation of the gendered self, the taboo against homosexuality is the founding prohibition.[26] In short, the construction of a heteronormative gender identity is always based on the primary loss of the homosexual object of desire. This repressed lost other, which cannot live and also cannot be mourned, gets incorporated as a part of the self. To grasp the concept of the doppelgänger,[27] Steve Garlick suggests linking Freud’s concept of the uncanny with Butler’s theory of identity formation. When Butler’s powerful concept of identity formation is taken seriously, the gendered body itself can be considered a haunted house because it incorporates the lost other (Garlick 2002, p. 870). Even if these last thoughts prove too challenging, it is remarkable that the German term ‘‘unheimlich’’ literally means ‘‘not being at home’’.
4 Conclusion
It is exactly the erasing of limits that Cixous describes, and the transference of this transgression onto the realm of the symbolic gender order, that the Virtual Human caters to. With its intermediate character, it not only questions the boundaries of life and death, but also opens up the possibility of transgender options.
The potential uncanniness of the Virtual Human makes sense once it is accepted that identity formation is a process, and not simply a fixed state or a natural inevitability. Rather, the forming of a self must be viewed as an ongoing performative act in which the subject recites intelligible norms. The notion of gender as an activity, the way of doing gender, also leaves some space for breaches and lapses of gender regulations. What the Turing test does is exemplify the deconstructive potential of computer science by introducing gender and machineness as valid players in the game. No matter what is ascribed to you, in the test you may perform drag. Against this background, the Virtual Human does not just fill the void between the human and the computer—it is also the representation of the space between man and woman. And this may be experienced as uncanny and even threatening, given how intelligible identity concepts are gained.
Earlier, I stated that the Virtual Human interface—and we can also see this in the case of Hoffmann’s Olimpia—is likely to produce a paradoxical situation. On the one hand, the very existence of cyborg beings threatens the nature–culture dichotomy. On the other hand, this blurring of strict boundaries nourishes the need to stabilize the symbolic gender order rather than to dissolve it. It is the strict following of this ordering system that holds the promise for the artifact to reach the status of a human subject.
The Virtual Human is defined as a hybrid—it cannot take additional risks by transgressing a norm so central to our culture. This seems all the more true since the artifact already lives with a secret: following the goal of the research field, it has to hide its machine-like origin. As Olimpia illustrates, this agenda is likely to produce lapses and errors: not because the artifact is designed badly, but because the underlying working modes of computerization will always shine through. The case of Olimpia, among other things, tells a story of how the idea of being human gets re-established in the face-to-face Turing test. Computability, standardized behavior, predictability, and formalization are traits that have been outsourced to the machine. The gendered artificial flesh is then used to coat these characteristics.
According to Freud, the uncanny (German: unheimlich) oscillates between the home (dem Heim) and the strange (un-heimisch = not home). The relation between private and public places has a history of gendered connotations. As late as the 1950s, for example, the family home was considered to be the area for women, and those who stepped out of this ordering system were regarded as threatening (cf. Esders 2007). In the research area of Virtual Humans, the simulation of human appearance and behavior stands for ease of use and trust. Their embodiment can be seen as a housing that transforms abstract computing modes into something comfortable and makes the user feel more at home. It comes as no surprise that so many artificial beings, in fiction and in science, are conceptualized as female. But this artificial housing also transports dimensions that are unintended by the designers. Earlier, I posed the question of why ‘‘Der Sandmann’’, a story that is rather disturbing from a technological point of view, is recited at all—it ends with the user dying and the artifact being destroyed. One answer may be that the reference to this narration speaks of the desire to overcome the between that is tainted with strangeness and, once and for all, put a different ending to the story.
What can definitely be learned from the story of ‘‘Der Sandmann’’ is that human–humanoid interaction always invokes a network of meanings and relations. The development of the scenario in its complexity is rather hard to foresee when one mainly focuses on the design engineering of the artifact. The humanoid itself can only be pre-scripted up to a certain point. However, a different scripting of the whole setting of the technological narration
[26] Freud characterizes the formation of the ego as a melancholic structure. The child has to give up the desire for its parents because of the incest taboo. Butler, however, argues that the taboo against homosexuality precedes the incest taboo (Butler 1990, p. 64).
[27] In reference to Jacques Derrida, he introduces the doppelgänger as the revenant, as something which comes back.
will eventually result in the production of more realistic, less idealized artifacts. For a significantly different outcome of the story, the analysis of the depicted user–artifact scenario, with all its complicated implications, must be taken seriously. If Turing in the 1950s was able to introduce an interaction scenario that oscillates between the dissolution and the fixation of identity norms, now is the time to reclaim that in-between space and aim for more diverse forms of virtual embodiment.
Acknowledgments This research is based upon work supported by
the Deutsche Forschungsgemeinschaft (DFG), postgraduate school
number GRK 1014.
References
Akrich M (1991) The description of technical objects. In: Bijker WE,
Law J (eds) Shaping technology/building society. Studies in
sociotechnical change. MIT Press, Cambridge, pp 205–240
Balsamo A (2003) On the cutting edge: cosmetic surgery and the
technological production of the gendered body. In: Mirzoeff N
(ed) The visual culture reader. London, Routledge, p 223
Bartneck C (2007) Is the Uncanny Valley an Uncanny Cliff? In: Proceedings of the 16th IEEE international symposium on robot and human interactive communication (RO-MAN 2007), Jeju
Bath C (2002) Was können uns Turing-Tests von Avataren sagen? Performative Aspekte virtueller Verkörperungen im Zeitalter der Technoscience. In: Epp A et al (eds) Technik und Identität. Bielefeld, pp 79–99
von Braun C (2001) Versuch über den Schwindel. Pendo, Zürich
Butler J (1990) Gender trouble: feminism and the subversion of
identity. Routledge, New York
Butler J (1997) The psychic life of power: theories in subjection.
Stanford University Press, Stanford
Butler J (2004) Undoing gender. Taylor & Francis, London
Carroll L (1992/1872) Through the looking-glass and what Alice found there. Ware
Cassell J (2000) Nudge Nudge Wink Wink: elements of face-to-face
conversation for embodied conversational agents. In: Cassell J
(ed) Embodied conversational agents. MIT Press, Cambridge
Cixous H (1976) Fiction and its phantoms: a reading of Freud’s Das Unheimliche (The ‘‘uncanny’’). New Literary History 7(3):525–548
Esders K (2007) Trapped in the Uncanny Valley: Von der unheimlichen Schönheit künstlicher Körper. In: Paul H, Ganser A (eds) Screening gender. Geschlechterszenarien in der gegenwärtigen US-amerikanischen Populärkultur. Lit Verlag, Berlin, pp 97–115
Ferber D (2003) The man who mistook his girlfriend for a robot.
http://iiae.utdallas.edu/news/pop_science.html. Accessed 23 Jan
2010
Freud S (1963) Das Unheimliche. Aufsätze zur Literatur. Fischer Verlag, Frankfurt am Main
Garlick S (2002) Melancholic secrets: gender ambivalence and the Unheimlich. Psychoanalytic Review, pp 861–876
Gibson W (1996) Idoru. New York, London
Haraway DJ (1991) A cyborg manifesto: science, technology, and
socialist-feminism in the late twentieth century. In: Haraway DJ
(ed) Simians, cyborgs and women. The reinvention of nature.
Routledge, New York, p 164
Haraway DJ (1997) Modest_Witness@Second_Millennium.FemaleMan_Meets_OncoMouse™. Feminism and technoscience. Routledge, New York, p 128
Hayles NK (1999) How we became posthuman. Virtual bodies in cybernetics, literature, and informatics. University of Chicago Press, Chicago
Hodges A (1992) Alan Turing: the enigma. Random House, London
Hoffmann ETA (1994/1817) Der Sandmann. In: Hoffmann ETA (ed)
Gesammelte Werke in Einzelausgaben, Bd. 3. Aufbau-Verlag,
Berlin und Weimar
Kopp S et al (2003) Max—a multimodal assistant in virtual reality construction. Künstliche Intelligenz, pp 11–17
Kormann E (2006) Textmaschinenkörper. Genderorientierte Lektüren des Androiden. Rodopi, Amsterdam
Lübke V (2005) Cybergender. Geschlecht und Körper im Internet. Helmer, Ulm
MacDorman KF (2005) Androids as an experimental apparatus: why
is there an uncanny valley and can we exploit it? In: CogSci-
2005 workshop: toward social mechanisms of android science.
pp 106–118
Magnenat-Thalmann N (2004) Handbook of virtual humans. Wiley,
Chichester
Manovich L (2001) The language of new media. MIT Press
(Leonardo), Massachusetts, p 132
Mori M (1970) Bukimi no tani. Translated as: The Uncanny Valley. http://www.androidscience.com/theuncannyvalley/proceedings2005/uncannyvalley.html. Accessed 20 Dec 2009
Nadin M (2007) Semiotic machines. Public J Semiot 1(1):85–114. Online version: http://www.nadin.ws/archives/760. Accessed 15 May 2010
Nake F (1993) Von der Interaktion. Über den instrumentalen und den medialen Charakter des Computers. In: Nake F (ed) Die erträgliche Leichtigkeit der Zeichen. Ästhetik, Semiotik, Informatik. AGIS-Verlag, Baden-Baden (Internationale Reihe Kybernetik und Information, 18), pp 165–189
Nake F (2001) Das algorithmische Zeichen. In: Bauknecht W et al (eds) Informatik 2001. Tagungsband der GI/OCG-Jahrestagung 2001, pp 736–742
Norman DA, Draper SW (1986) User centered system design. New
perspectives on human-computer interaction. Lawrence Erlbaum
Associates, Hillsdale, p 43
Ruttkay Z (ed) (2004) From brows to trust. Evaluating embodied
conversational agents. Kluwer, Dordrecht
Searle JR (1984) Minds, brains and science. Harvard University Press,
Cambridge
Stone S (1991) Will the real body please stand up. In: Benedikt M (ed) Cyberspace: first steps. MIT Press, Cambridge
Turing AM (1950) Computing machinery and intelligence. Mind
59:433–460
Turkle S (1984) The second self. Computers and the human spirit.
Simon & Schuster, New York
Weizenbaum J (1983) ELIZA: a computer program for the study of
natural language communication between man and machine.
Commun ACM 25th Anniv Issue 26(1):23–27
... Regarding the scientific community of CG, there is much evidence of gender bias, skin color, etc [Dodik et al. 2022;]. In [Draude 2011], authors claimed that simulated human likenesscould utilize human self-reliability, turning something abstract into comfortable. In this sense, the work of [Araujo et al. 2021]showed that women feel more comfortable with realistic female CG characters than with male characters, while men feel comfortable similarly with female and male characters. ...
... The characters were visibly male and female but would these results be similar if the virtual human had no gender identification? Still in relation to gender categorization [Draude 2011], a studied question is: Has Computer Science the potential to deconstruct the gender binary with virtual humans? ...
... It can be related to gender identification [Scott 2007], for example, women identify more with female virtual humans. This indicates that it is possible to deconstruct gender stereotypes through virtual humans, as mentioned by [Draude 2011]. In our opinion, this could mean the industry does not need to create stereotyped animated virtual humans to convey gender. ...
Conference Paper
Full-text available
Animations have become increasingly realistic with the evolution of Computer Graphics (CG). In particular, human models and behaviors have been represented through animated virtual humans. Gender is a characteristic related to human identification, so virtual humans assigned to a specific gender have, in general, stereotyped representations through movements, clothes, hair, and colors in order to be understood by users as desired by designers. An important area of study is determining whether participants’ perceptions change depending on how a virtual human is visually presented. Findings in this area can help the industry guide the modeling and animation of virtual humans to deliver the expected impact to the public. In this paper, we reproduce using an animated CG baby, a previous perceptual study conducted in real life aimed to assess gender bias about a baby. Our research indicates that simply textually reporting a virtual human’s gender may be sufficient to create a perception of gender that affects the participant’s emotional response so that stereotyped behaviors can be avoided.
... Therefore, the uncanny valley is triggered due to the inconsistency between "look" and the touch" of the object. The examples such as an industrial robot, a humanoid robot, a zombie, a prosthetic hand, a bunraku puppet, and finally, a healthy person denote different levels of the familiarity curve (Draude 2011). ...
... This effect is commonly used to describe the method of differentiating between humans and robots, i.e., different levels of anthropomorphism are revealed in uncanny valley theory (Mori 1970;Nissen and Jahn 2021). The insights of the uncanny valley suggest that the uncanny valley is observed when anthropomorphism reaches a certain level of human likeness (Draude 2011). This results in negative reactions towards the said entity. ...
... This happens on the way to gaining the peak of human likeness. Therefore, different levels of anthropomorphism tend to deliver different levels of uncanniness to users (Draude 2011). According to the theory of the uncanny valley, when anthropomorphism of social robots is between the low and medium levels, there will be an increase in the perceived uncanniness, whereas a very high level of anthropomorphism might lower the perceived uncanniness among users, and the hypotheses for this study are proposed as: H1a. ...
Conference Paper
Full-text available
Social robots have been increasingly popular during the past decade. Anthropomorphism is identified as a critical factor affecting social robots’ acceptance. A conceptual model is proposed to examine social robots’ user acceptance based on the theory of the uncanny valley. User perception of the anthropomorphism level (low, medium, and high) of social robots is proposed to affect users’ perceived uncanniness and humanness of social robots, which influences user trust in social robots, and user trust will lead to users’ intention to use social robots. The theoretical model will be empirically tested through an experiment conducted online targeting the hospitality industry. The study will contribute to social robotic acceptance literature by explaining how anthropomorphism affects user trust in social robots via perceived uncanniness and humanness, as well as how user trust influences user acceptance of social robots.
... Gender is a significant factor in VHs perception, with studies showing gender differences in tasks involving facial detection and emotion discrimination [7]- [9]. Despite efforts to increase female VH representation in the media 12 , male VHs still dominate 3 , raising concerns about gender imbalance and its impact on perceptions of comfort and realism [10], [11]. Indeed, people tend to anthropomorphize technologies, attributing human characteristics such as gender to VHs [10], [11], even if they do not have any visual cue. ...
... Despite efforts to increase female VH representation in the media 12 , male VHs still dominate 3 , raising concerns about gender imbalance and its impact on perceptions of comfort and realism [10], [11]. Indeed, people tend to anthropomorphize technologies, attributing human characteristics such as gender to VHs [10], [11], even if they do not have any visual cue. Skin color is another critical factor, with dark colored skin VHs historically being stereotyped in the media 4 . ...
Conference Paper
Full-text available
Recent advancements in Computer Graphics (CG) have significantly enhanced the realism of animations and characters in various media. However, the Uncanny Valley (UV) theory suggests that as Virtual Humans (VHs) become more realistic, they may evoke discomfort. This phenomenon challenges industry professionals and researchers to study human perception, considering diverse characteristics such as gender and skin color. This work investigates human perception and sensations when playing or watching VHs, aiming to answer many questions regarding their visual characteristics. For example, one question examines human perception concerning the relationship between the character’s gender and the participant’s gender. The results showed in-group advantages for participants regarding VHs with binary genders, both in gender attribution and emotion recognition. Additionally, this work explores solutions for deconstructing the gender binary using a genderless Virtual Baby (VB) and an adult VH model. It also discusses the UV effect on VHs with different skin colors, highlighting potential biases in skin color algorithms.
... Furthermore, it is believed that the gender of virtual agents could have some pedagogical value in the process of gender acceptance and debinarization (Armando et al., 2022). To date, the classic binary role still persists in IT, since female voices in machines are attributed with "female tasks" and "female information," as it is also the case vice versa (Draude, 2011). This is also visible in the female voices of Cortana, Alexa, Siri, and others. ...
... This is also visible in the female voices of Cortana, Alexa, Siri, and others. Draude (2011) holds that these voices should provide a sense of tranquility, akin to the female role of mothers and soothers. As seen above (cf. ...
Article
Full-text available
The present review is the first of its kind to form a conceptual discussion about a novel field, here referred to as digital psychology . The result is a conceptual impact model of digital psychology (in short: CIMDP) highlighting the bidirectional relationship between human psychology (consisting of affect, cognition, and behavior) and digital transformation (driven by datafication, algorithmization, and platformization). The findings of the CIMDP are applied to a relevant field in economy and business development, namely, to the digital future of work, which appears to be mediated by organizational behavior and governed by managerial decisions. The resulting model may be used to provide orientation in a new research domain and to guide future studies in psychology, cognitive science, digital transformation, human–computer interactions, organizational behavior, and business management.
... According to Mori, the UV is a phenomenon where a human response may fall into a valley of discomfort/uncanny depending on how much the VH resembles a real human. While Mori's theory was initially applied to robots, various authors have explored the impact of the UV on CG characters [6,12,34]. According to Katsyri et al. [30], the feeling of discomfort is associated with human identification, wherein individuals observing or interacting with VH seek human characteristics in them. ...
Preprint
Full-text available
Virtual humans (VH) have been used in Computer Graphics (CG) for many years, and perception studies have been applied to understand how people perceive them. Some studies have already examined how realism impacts the comfort of viewers. In some cases, the user's comfort is related to human identification. For example, people from a specific group may look positively at others from the same group. Gender is one of those characteristics that have in-group advantages. For example, in terms of VHs, studies have shown that female humans are more likely to recognize emotions in female VHs than in male VHs. However, there are many other variables that can impact the user perception. To aid this discussion, we conducted a study on how people perceive comfort and realism in relation to interactive VHs with different genders and expressing negative, neutral, or positive emotions in groups. We created a virtual environment for participants to interact with groups of VHs, which are interactive and should evolve in real-time, using a popular game engine. To animate the characters, we opted for cartoon figures that are animated by tracking the facial expressions of actors, using available game engine platforms to conduct the driven animation. Our results indicate that the emotion of the VH group impacts both comfort and realism perception, even by using simple cartoon characters in an interactive environment. Furthermore, the findings suggest that individuals reported feeling better with a positive emotion compared to a negative emotion, and that negative emotion recognition is impacted by the gender of the VHs group. Additionally, although we used simple characters, the results are consistent with the perception obtained when analysing realistic the state-of-the-art virtual humans, which positive emotions tend to be more correctly recognized than negative ones.
... While acknowledging their economic contribution, it is important to consider the ethical concerns that arise from their use. The dominance of "Asian or Caucasian features" in humanoid robots or "hyper-sexualization or over feminization of female robots" pose ethical questions in terms of gender, race, and ethnicity (Draude, 2011;Manthiou et al., 2021;Riek & Howard, 2014;Robertson, 2010). ...
Article
Full-text available
Following COVID-19, there has been an increase in digitization and use of Artificial Intelligence (AI) across all spheres of life, which presents both opportunities and challenges. This commentary will explore the landscape of the gendered impact of AI at the intersections of Science and Technology Studies, feminist studies (socialist feminism), and computing. The Global Dialogue on Gender Equality and Artificial Intelligence (2020) organized by UNESCO highlighted the inadequacy of AI normative instruments or principles which focus on gender equality as a “standalone” issue. Past research has underscored the gender biases within AI algorithms that reinforce gender stereotypes and potentially perpetuate gender inequities and discrimination against women. Gender biases in AI manifest either during the algorithm’s development, the training of datasets, or via AI-generated decision-making. Further, structural and gender imbalances in the AI workforce and the gender divide in digital and STEM skills have direct implications for the design and implementation of AI applications. Using a feminist lens and the concept of affective labor, this commentary will highlight these issues through the lenses of AI in virtual assistants, and robotics and make recommendations for greater accountability within the public, private and nonprofit sectors and offer examples of positive applications of AI in challenging gender stereotypes.
Article
Full-text available
The Uncanny Valley, hypothesized by Masahiro Mori in 1970, theorizes about a sensation of discomfort evoked in humans on exposure to anthropomorphic artificial bodies that preserve some mechanical features. The aims of this paper are multiple. Firstly, to confirm the truth of this hypothesis, which has often been the subject of controversy but which has been amply demonstrated in experimental settings and anecdotic evidence. Secondly, to focus the study of the Uncanny Valley on the face, considered as central in the manifestation of the phenomenon. Thirdly, to provide a description of this occurrence as not exclusively a neuroscientific or psychological matter, but also as an issue of extreme semiotic relevance, through the meta-analysis of recent experiments and the treatment of some cinematographic cases known to have generated the Uncanny Valley experience in the audience.
Article
Full-text available
This study is devoted to the representation of automata in Romanticism, with particular attention to the concepts of the ‘uncanny’ and ‘ambivalence’. The reflection includes the themes of the mirror, the ideal image of the self (Ideal-Ich), and the female figure, in a perspective that takes into account the traces of the Pygmalion myth and the Promethean myth in the Romantic imagination. The first section briefly retraces the forms of the automaton over time, beginning with the classical age; this is followed by a short outline of the dominant historical-philosophical categories of Romanticism, the identification of the fantastic as a literary mode, and a comparative analysis of two case studies: E.T.A. Hoffmann’s The Sandman (1816) and Auguste Villiers de L’Isle-Adam’s L’Ève future (1886). The comparison aims to illuminate the constants and variants of the meanings embodied by the automaton in two works spanning the period from the ascending phase of Romanticism with Hoffmann to its late expression with Villiers.
Article
Full-text available
A semiotic machine, no matter how it is embodied or expressed, has to reflect the various understandings of what the knowledge domain of semiotics is. It also has to reflect what methods and means support further acquiring knowledge of semiotics. Moreover, it has to express ways in which knowledge of semiotics is tested, improved, and evaluated. Given the scope of the endeavor of defining the semiotic machine, the methodological approach must be anchored in the living experience of semiotics. Accordingly, the cultural-historic perspective, which is the backbone of any encyclopedic endeavor, is very much like a geological survey for a foundation conceived from a dynamic perspective. The various layers could shed light on a simple aspect of the subject: At which moment in the evolution of semiotics does it make sense to make the association (in whatever form) to tools and to what would become the notion of a machine? Reciprocally, we would have to explain how the various understandings of the notions tool and machine are pertinent to whatever was the practice of semiotics at a certain juncture. Yet another reference cannot be ignored: the reductionist-deterministic view, celebrated in what is known as the Cartesian Revolution. Since that particular juncture in our understanding of the world, the reduction of semiotic processes to machine descriptions is no longer a matter of associations (literal or figurative), but a normative dimension implicitly or explicitly expressed in semiotic theories. Given this very intricate relation, we will have to systematize the variety of angles from which various understandings of the compound expression semiotic machine can be defined. Today, such understandings cover a multitude of aspects, ranging from the desire to build machines that can perform particular semiotic operations to a new understanding of the living, in view of our acquired knowledge of genetics, molecular biology, and information biology.
That the computer—a particular form of machine—as an underlying element of a civilization defined primarily as one of information processing, could be and has been considered a semiotic machine deserves further consideration.