Transitional wearable companions: a novel concept
of soft interactive social robots to improve social
skills in children with autism spectrum disorder
Beste Özcan · Daniele Caligiore · Valerio Sperati · Tania Moretta · Gianluca Baldassarre
Received: date / Accepted: date
Abstract We present a novel concept of interactive devices, called “transi-
tional wearable companions” (TWCs), usable to support therapy and foster
social skill development in children with autism spectrum disorder (ASD).
TWCs have two distinctive features. First, they are soft interactive devices, which look like tender animals, able to arouse attachment emotions and provide continuous, reassuring physical contact. Second, TWCs are embedded social robots that respond to the child’s manipulations by emitting lights, sounds, or vibrations usable for multiple purposes, for example to enhance the child’s
engagement. TWCs can have additional important features. First, the input-
output rules with which they respond to the child’s actions can be changed
by the therapist/caregiver, for example through a tablet, thus opening a large
number of possibilities to foster social interaction. Second, TWCs can have
biosensors gathering information on the child’s physiological and emotional
state, thus offering multiple ways to support the interaction with the child
during therapy and daily life. The paper presents the principles underlying
TWC design, their possible future enhancements, a first prototype (+me) of
social TWC, and possible empirical procedures to test the effectiveness of TWCs in controlled experiments. Thanks to their multifaceted and flexible features, TWCs might become an important tool to enhance the social abilities of children with ASD in ecological and therapeutic contexts.
Keywords Autism · social interaction · therapy · interactivity · wearable · biosensors · emotional state
Beste Özcan · Daniele Caligiore · Valerio Sperati · Tania Moretta · Gianluca Baldassarre
Laboratory of Computational Embodied Neuroscience,
Institute of Cognitive Sciences and Technologies,
Italian National Research Council (LOCEN-ISTC-CNR),
Via S. Martino della Battaglia 44, 00185 Rome, Italy
Tel.: +39-06-44595230
Fax: +39-06-44595243
E-mail: {daniele.caligiore, valerio.sperati, gianluca.baldassarre}@istc.cnr.it, {bestesi, taniamoretta89}@gmail.com
1 Introduction
Autism is increasingly considered a pervasive neurodevelopmental disorder
[1,2]. It is characterised by a great variation in both quality and gravity of
symptoms, overall grouped under the name of Autism Spectrum Disorder -
ASD. In this work, we focus on the possible treatment of autism involving
individuals in the early developmental phases, namely children younger than
ten years old. Autistic children generally exhibit a significant impairment in
the social-communicative domain. They rarely initiate social interaction [3–
5], often refuse human contact [6], tend to focus on restricted interests and
activities, and are inclined to display repetitive behavioural patterns which
isolate them from the outer world [7]. Attempts by caregivers to interfere
with these stereotyped routines generally provoke a stressful situation for the
autistic child. As a main consequence, it can be very difficult to establish a
social interaction with the child, with negative repercussions on her/his mental
development because social interaction and communication play a key role in
children’s development [8].
A possible psychological interpretation for the pathological behaviours of
autistic individuals concerns the expectations and judgements involved in so-
cial interactions and contexts. The possible unexpected events involved in
these situations, usually manageable by healthy people, might appear unsafe
or threatening to children with autism who consequently tend to shy away from
them [4]. This is consistent with reports where autistic children are described
as particularly attracted by highly predictable activities [9,10].
In recent years, this latter observation has been exploited by some thera-
pists who started to use robots, computers, and electronic gadgets during ther-
apeutic sessions because, unlike people, they may be programmed to exhibit
highly predictable behaviours [2,11]. For example, computer-based products may be set so as not to react to atypical behaviours shown by autistic children, such as rocking or screaming, as a human would [12]. In this way,
the stress and unpredictability caused by social interaction are largely removed
during the interaction with a computer [13], a robot, or a mechatronic device.
These technological tools may thus represent powerful attractors or mediators that therapists and researchers can exploit to establish a connection with children with ASD [11]. Children with autism seem to show a preference for establishing a relationship with these artificial agents [14]
and often improve their skills after a therapeutic session based on their use.
For example, interactive toys may provide predictability by relying on constant cause-and-effect functions that reassure children and support them in daily behaviours (e.g., cleaning teeth, travelling in a car [9]). During thera-
peutic sessions using robots or other artificial agents (e.g., computer simulated
avatars), children with autism show a reduction of stereotypical and repetitive
behaviours and an improvement of language skills [15,16]. Importantly, pre-
dictable robots and objects might mediate social interactions and improve
social skills [11, 17].
Capitalising on these experiences, we propose here a novel class of inter-
action devices, called transitional wearable companions – TWCs, usable to
improve social skills and engagement of children with ASD. The paper is or-
ganised as follows. Sec. 2 illustrates the key features of TWCs. Sec. 3 reviews
existing systems related to TWCs. Sec. 4 illustrates a first prototype of TWC
and how it could be enhanced in the future. Sec. 5 outlines the experimental
protocol of an empirical test directed to test the acceptability and utility of
TWCs for children with ASD. Sec. 6 draws the conclusions of the paper.
2 The key features of TWCs
TWCs are soft interactive mechatronic devices – social robots – which outwardly look like a tender animal or a security blanket. TWCs have two core defining features, and some additional features, now illustrated in some detail
(see Figure 1). The first core feature of TWCs is that they have the power-
ful reassuring features of what in psychology are called “transitional objects”.
Transitional objects are puppets, blankets, or similar objects (e.g., the blanket
of Linus, the child character of Schulz’s comics) with a soft touch and a ten-
der look that the child can carry along when she/he independently navigates
and explores the environment without the reassuring support of a caregiver
(e.g., the mother) [18]. Since expectations and judgements involved in social
contexts might appear threatening for children with autism, making social in-
teractions problematic [4], many children with ASD develop an attachment to
a transitional object, for example a teddy bear. As transitional objects [19],
with their stable presence and features, wearable companions can represent
a reliable source of soothing and confidence when the autistic child explores
novel environments “far” from parents, caregivers and familiar places.
The second core feature of TWCs, also shared with wearable computers
(also simply called wearables), is that they contain an embedded mechatronic
device that allows them to react to the actions of the child. In this respect,
a wearable companion can be considered a social robot having actuators that
can emit lights, sounds, and vibrations in response to the child acting on its
touch/interactive sensors, and these responses are controlled by an on-board
computer (e.g., an Arduino board [20]). While typically developing children
can be motivated to engage with inanimate objects such as transitional objects, children with autism may benefit from additional degrees of “animacy and interactivity” to elicit their engagement [21]. The regular cause-effect nature of such
type of interaction would give the child a higher sense of control and hence
mitigate fearful and avoidance reactions [22]. In this respect, and importantly,
since the causal link (contingency) between the actions of the child on the
wearable companion’s sensors and the responses of the wearable actuators is governed by an on-board computer, it can be modified via software. This
allows a fine regulation of the wearable companion’s reactions to the child’s actions so as to tailor them to the child’s personal features, level of cognitive
development, and emotional structure. Moreover, thanks to their richness and programmable nature, the contingencies could be made progressively more sophisticated (at a pace tuned to the level of development of the child’s cognition and emo-
tions) to foster the child’s exploration of novel features and the development
of divergent behaviours leading to “accommodate” to those novel experiences
[23].
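To make the idea of software-defined contingencies concrete, the following minimal sketch (not the actual +me firmware; pin assignments and response values are purely illustrative assumptions) shows how an Arduino-style controller could map a touch on each paw to a configurable light and sound response stored in an ordinary table, which is what makes the cause-effect rules easy to reshape at run time.

```cpp
// Minimal illustrative sketch (not the actual +me firmware): each paw touch
// triggers a response drawn from a contingency table that can be edited at run time.

const int N_PAWS = 4;

// Hypothetical pin assignments, for illustration only.
const int TOUCH_PIN[N_PAWS] = {2, 3, 4, 5};    // digital touch inputs
const int LED_PIN[N_PAWS]   = {6, 9, 10, 11};  // PWM-capable LED outputs
const int BUZZER_PIN        = 8;

// One contingency per paw: LED brightness and tone frequency (0 = silent).
struct Contingency {
  int ledBrightness;   // 0..255
  int toneHz;          // 0 = no sound
  int durationMs;
};

Contingency rules[N_PAWS] = {
  {255, 440, 300},   // paw 0: bright light + A4 tone
  {128, 660, 300},   // paw 1: dim light + E5 tone
  {255,   0, 300},   // paw 2: light only
  {  0, 880, 300},   // paw 3: sound only
};

void setup() {
  for (int i = 0; i < N_PAWS; i++) {
    pinMode(TOUCH_PIN[i], INPUT);
    pinMode(LED_PIN[i], OUTPUT);
  }
  pinMode(BUZZER_PIN, OUTPUT);
}

void loop() {
  for (int i = 0; i < N_PAWS; i++) {
    if (digitalRead(TOUCH_PIN[i]) == HIGH) {   // the child touches paw i
      analogWrite(LED_PIN[i], rules[i].ledBrightness);
      if (rules[i].toneHz > 0) tone(BUZZER_PIN, rules[i].toneHz, rules[i].durationMs);
      delay(rules[i].durationMs);
      analogWrite(LED_PIN[i], 0);              // response ends, light off
    }
  }
}
```

Because the rules live in a data structure rather than being hard-wired, tailoring the companion to a particular child amounts to editing a few table entries, which is the property the therapist-controlled contingencies rely on.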
Alongside the two core defining elements, TWCs can have other additional features that expand their possible uses with children with autism. A first empowering feature is that TWCs can be endowed with wireless
communication components (e.g., based on Bluetooth or TCP/IP technology) allowing them to exchange information with other mechatronic devices such as external computers, tablets, smartphones, or other TWCs. Importantly,
these additional devices might be under the control of another human agent,
such as a therapist, a parent, or another child, thus making the TWCs social
transitional wearable companions (STWCs). The features of STWCs might
open important new means to support therapy in unstructured contexts and
for the development of social skills in daily life – for example by fostering and
enriching the interactions with parents and friends [9]. For example, the care-
givers could manipulate the stimuli emitted by the wearable companion (e.g.,
the type and/or rate of colour, sound, and vibration) to motivate the child’s
interaction with them [24]. In particular, the child can be progressively led
to understand that the pleasurable interactions with the wearable companion
depend on the caregivers’ intervention (e.g., via the tablet) and this will be
a strong motivation for the child to increase the level and quality of social
engagement with them.
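The paper does not fix a command protocol between the tablet and the companion; as a hedged sketch, assuming an HC-05-style Bluetooth module exposed as a serial port and the hypothetical contingency table of the previous sketch, the caregiver’s application could send simple text lines such as “SET 2 128 660 300” that the companion parses to overwrite one entry of the table on the fly.

```cpp
// Illustrative command handler (assumed protocol, not the actual +me one): the
// caregiver's tablet sends "SET <paw> <brightness> <toneHz> <durationMs>" over
// Bluetooth (HC-05 on a SoftwareSerial port) to rewrite one contingency entry.
#include <SoftwareSerial.h>

SoftwareSerial bt(12, 13);   // hypothetical RX/TX pins for the HC-05 module

struct Contingency { int ledBrightness; int toneHz; int durationMs; };
Contingency rules[4];        // same table as in the previous sketch

// Read the next space-separated integer from 'line', starting at 'pos'.
int nextInt(String &line, int &pos) {
  int space = line.indexOf(' ', pos);
  String tok = (space == -1) ? line.substring(pos) : line.substring(pos, space);
  pos = (space == -1) ? (int)line.length() : space + 1;
  return tok.toInt();
}

void handleCaregiverCommands() {
  if (!bt.available()) return;
  String line = bt.readStringUntil('\n');   // e.g. "SET 2 128 660 300"
  if (!line.startsWith("SET ")) return;
  int pos = 4;                              // skip "SET "
  int paw    = nextInt(line, pos);
  int bright = nextInt(line, pos);
  int hz     = nextInt(line, pos);
  int ms     = nextInt(line, pos);
  if (paw >= 0 && paw < 4) {
    rules[paw] = {bright, hz, ms};          // contingency updated live
    bt.println("OK");                       // acknowledge to the tablet
  }
}

void setup() { bt.begin(9600); }

void loop() {
  handleCaregiverCommands();
  // ... the touch-response loop sketched above would run here ...
}
```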
A second empowering feature of TWCs consists in endowing them with
physiological sensors alongside other “external” interactive sensors (e.g., touch sensors), thus leading to physiological social transitional wearable companions (PSTWCs). By physiological sensors we refer, for example,
to biosensors embedded in a wristband connected to the TWCs or embedded
in the wearable companion’s main body that can return information on the
internal physiological state of the child, for example in relation to internal
temperature, heart rate, level of stress (e.g., via skin conductance). Informa-
tion on the physiological state is very important as the physiological state of
the body is strongly related to the emotional state of the child. Indeed,
the information on the child’s physiological state so gathered could be auto-
matically processed based on pattern-recognition and other machine-learning
algorithms to infer the emotional state of the child. Thus, the collection and
suitable elaboration of information on the child’s physiological state could fur-
nish valuable real-time knowledge on the current emotional state of the child
and how she/he is reacting to current experiences [25–27].
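The inference algorithm is deliberately left open here; as a minimal sketch, assuming an EDA (skin conductance) sensor on an analogue pin and a hypothetical readHeartRate() source on the wristband, an embedded controller could compute a rough arousal index by comparing smoothed signals with a per-child resting baseline. A real PSTWC would rather rely on the pattern-recognition and machine-learning methods cited above; the fragment only illustrates the idea of turning raw physiological signals into a usable emotional indicator.

```cpp
// Minimal, hypothetical arousal estimator: compares smoothed skin conductance
// and heart rate with a resting baseline. For illustration only.

const int EDA_PIN = A0;            // assumed analogue pin for the EDA sensor

float edaBaseline = 0, hrBaseline = 0;
float edaSmooth = 0, hrSmooth = 0;
const float ALPHA = 0.05;          // exponential smoothing factor

// Placeholder: a real device would read heart rate from the wristband sensor.
float readHeartRate() { return 80.0; }

void setup() {
  Serial.begin(9600);
  // Calibrate a resting baseline over ~5 seconds while the child is calm.
  for (int i = 0; i < 100; i++) {
    edaBaseline += analogRead(EDA_PIN);
    hrBaseline  += readHeartRate();
    delay(50);
  }
  edaBaseline /= 100; hrBaseline /= 100;
  edaSmooth = edaBaseline; hrSmooth = hrBaseline;
}

// Returns a 0..1 arousal index: 0 = at baseline, 1 = strongly above it.
float arousalIndex() {
  edaSmooth = (1 - ALPHA) * edaSmooth + ALPHA * analogRead(EDA_PIN);
  hrSmooth  = (1 - ALPHA) * hrSmooth  + ALPHA * readHeartRate();
  float edaRise = (edaSmooth - edaBaseline) / max(edaBaseline, 1.0f);
  float hrRise  = (hrSmooth  - hrBaseline)  / max(hrBaseline, 1.0f);
  float index = 0.5 * edaRise + 0.5 * hrRise;   // simple weighted combination
  return constrain(index, 0.0, 1.0);
}

void loop() {
  Serial.println(arousalIndex());   // could instead be sent to the tablet
  delay(100);
}
```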
The information returned by the biosensors, used locally by the
wearable companion or communicated to external devices, might be employed
in several different ways. First, the wearable companion might react to the
internal state of the child and not only to her/his overt actions, and thus its
contingencies might be modified on the basis of such additional information.
For example, the reactions of the TWC might be tuned down in intensity and variety when the child’s stress level rises. Second, the
information on the emotional state of the child might be broadcast to other
agents through the actuators of the wearable companion. For example, colour
lights might be used to visually render the current level of stress of the child,
so allowing therapists to suitably tune the therapeutic actions, or parents to
regulate their behaviour [28]. The rendering of some aspect of the emotional
state of the child through sensorial means easily interpretable by the child
might also be used to support the development of her/his skills in understand-
ing her/his own emotional states. Third, thanks to the prolonged interaction of the
child with the wearable companion, the biosensors might support a prolonged
recording of the child’s physiological and emotional states in correspondence
to different experiences, thus allowing a continuous monitoring of the child’s development and supporting therapeutic decision making.
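As an illustration of the first two uses, the fragment below (reusing the hypothetical arousalIndex() of the previous sketch through a stub, with assumed LED pins) attenuates the intensity of the companion’s reactions as the child’s stress rises and renders the stress level as a green-to-red colour, so that a nearby caregiver can read the child’s state at a glance.

```cpp
// Illustration only: responses are attenuated when arousal is high, and the
// arousal level is broadcast as a colour (green = calm, red = stressed).

const int RED_PIN = 5, GREEN_PIN = 6;    // assumed PWM pins of an RGB LED

// Stub standing in for the biosensor-based estimator sketched above.
float arousalIndex() { return 0.3; }

// Scale a nominal response intensity (0..255) down as stress rises.
int scaledIntensity(int nominal, float stress) {
  return (int)(nominal * (1.0 - 0.7 * stress));   // keep at least 30% at maximum stress
}

// Map stress in [0,1] to a green-to-red colour readable by the caregiver.
void showStressColour(float stress) {
  analogWrite(RED_PIN,   (int)(255 * stress));
  analogWrite(GREEN_PIN, (int)(255 * (1.0 - stress)));
}

void setup() {
  pinMode(RED_PIN, OUTPUT);
  pinMode(GREEN_PIN, OUTPUT);
}

void loop() {
  float stress = arousalIndex();
  showStressColour(stress);                        // second use: broadcast the state
  int brightness = scaledIntensity(255, stress);   // first use: soften the reactions
  (void)brightness;                                // would drive the touch responses
  delay(200);
}
```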
3 Related works
The literature offers various options for the treatment of autism, for example
reward-based training of skills, training of language abilities to build social
relationships, improvement of motor skills, use of games as a means to improve
social skills [29, 30]. Alongside these “traditional” approaches, the development
of new technologies is contributing to the emergence of new techniques based
on the use of artificial agents (e.g., robots, computer based products, electronic
gadget) to improve emotional and social abilities [15,31, 30]. The goal of the
interactions between the artificial agent and the children might be to stimulate
joint attention or to involve in the child-artificial agent relationship a third
agent, such as a caregiver, for example to encourage imitative or other resonant
behaviours. The artificial agent can also act as a teacher, as a game, or as a
means through which the child with autism can express emotions and goals
[32]. Combining some benefits of traditional techniques [29] with the use of new
technologies may allow the exploration of new (and maybe more effective)
alternative routes for the treatment of ASD [33, 30]. In this respect, several
traditional treatments for social impairments in ASD are based on the training
of emotion recognition [34]. The aim of such treatments is to enable ASD
individuals to interpret intentions and meanings of people and to anticipate
their emotional reactions to typical situations they may encounter in daily
life.
One of the main limitations of these approaches is their limited generalization of the experience to real-life situations. The learning process
deriving from these treatments indeed uses a limited repertoire of predefined
scenarios (e.g. realised by drawings, videos, photographs), and is based on the
memorization and interpretation of a scene within a therapeutic setting. Recently,
several studies have shown that making the therapeutic setting closer to the
real-life scenarios (e.g. using robots and electronic gadgets to evoke elementary
emotional states instead of simple photographs) could increase the effective-
ness of these treatments [35, 32]. However, even in these enriched setups the
Fig. 1: The schema shows the main features of a transitional wearable compan-
ion (TWC), in this case a physiological social transitional wearable companion
(PSTWC). A TWC can be carried along by the child and has engaging affective
properties typical of transitional objects, for example the wearable resembles a
tender animal and is made of soft materials. These features and the physical
contact of the wearable give the child a sense of protection and stability thus
supporting her/him in the engagement with and exploration of unknown phys-
ical and social settings. A TWC contains a mechatronic device with sensors
and actuators, i.e. an embedded robot, that can suitably react to the actions
of the child. For example, when the child presses a paw of the animal, the
wearable responds with the production of various visual and auditory stimuli.
A TWC can be a social TWC (STWC) if it can connect (e.g., via bluetooth)
with external electronic or mechatronic devices, such as a computer, a tablet
(as in this case), or another TWC, and these devices are under the control of
another human agent, for example the caregiver or another child. These de-
vices allow these other agents to interfere with, modulate, and modify the ways in which
the TWC reacts to the actions of the child, thus giving an important social
dimension to the TWC. A TWC can be a physiological STWC (PSTWC) if
its mechatronic component contains physiological sensors (e.g., to detect the
current skin conductance and heart rate of the child). These sensors can be
used to allow the TWC to react to the child’s internal state, for example to
manifest to the child herself her affective state through its actuators (e.g., a
light pulsing with the child’s heart rate), or to give real-time information on
the child’s emotional state to caregivers, for example to a therapist or a parent
via light intensity or on a remote tablet.
children interact poorly with the artificial agent to elicit emotional states and create social situations. This interaction might instead be very important
as it actively engages the children in emotion recognition and resonance pro-
cesses enhancing social skills [34,36]. The use of robotic agents could greatly
enrich the immersiveness of the interaction. For example, the child could ca-
ress the robot’s face (or smile at the robot) and observe the effect of the action on the robot (e.g. the robot could emit a sound). The match between the expected effect and the actual one could reinforce the learning of emotional understanding.
Another important drawback of traditional treatments for ASD is the lack
of on-line monitoring of the benefits of the treatments. This
is important as it could support the tuning of stimuli and more in general of
the therapeutic intervention [24].
TWCs overcome these limitations by exploiting their transitional and inter-
active features and also by incorporating some elements of traditional thera-
peutic approaches (e.g., reward-based learning, improvement of motor skills, games). Through TWCs, the emotion detection abilities of the child could be
trained by a two-way interaction. First, the child might interact with the body
of the TWC that, as a result, might react with signals and actions affecting the
child (e.g., emitting lights, sounds, and vibrations). Second, using a PSTWC
the caregiver could receive data on the emotional states and activity of the child through the biosensors and other sensors and accordingly she/he could
manipulate the stimuli emitted by the companion, for example during a game
(e.g., type and/or rate of colour, sound, and vibration) to further motivate
the interaction (cf. “LEGO® therapy” [37]). The child could also “feel” what
happens as a consequence of the actions of the companion or the caregiver if
these are transferred into suitable signals transmitted to the child by the com-
panion. This enrichment of the interaction could train and substantially increase the children’s ability to generalise the recognition of emotions and other social signals to ecological conditions, and hence improve their ability to cope with more complex real-life social contexts.
Detecting the emotional state of autistic children is not trivial [24] and
most studies done in the past in this field were restricted to measurements in
laboratories (e.g., [38, 39]). Recently, it has been shown that significant emotion-related information can be recognized through physiological activity [25]. In this respect, it has been found that in the majority of autism
cases there is a dysregulation of the electrophysiological parameters at the
basis of the autonomic nervous system (ANS) [40–43], with hyperarousal of the sympathetic system and dampened parasympathetic vagal tone [44, 45]. Smeekens
et al. [42] studied the differences in the ANS activity between ASD young
males and young males without ASD during social interaction. The results
show an increase in heart rate (HR) and in heart rate variability (HRV) in
the ASD group. The vagal “brake”, at the basis of HR modulation, enables rapid en-
gagement and disengagement with objects and people thus promoting social
interaction [46]. Abnormalities in HRV, in HR reactivity and in electrodermal
activity (EDA) were found in different studies on ASD [47, 26,27, 48, 49]. In
Fig. 2: Current prototype of +me, showing a sequence of possible light combi-
nations. Light and sound behaviours depend both on the child, who touches
the TWC, and on the therapist/caregiver who manages the input-output func-
tions of the device.
this line, some recent research has started to use these observations to collect data outside the laboratory through the use of biosensors¹.
Building on these recent approaches, TWCs can deal with the need for on-
line monitoring of the effects of therapeutic actions by allowing the collection
and integration of information from biosensors, for example used to monitor
the emotional state of children (e.g., changes in HR [46]). The use of wearables
with biosensors, as in PSTWCs, contributes to meeting the increasing need for eco-
logical monitoring of physiological variables to support medical interventions
and therapies outside the clinical setting [50, 51]. Information collected with
accelerometers embedded in a wristband might also be used to monitor the
effects of treatments [52–54], for example to measure the success of therapies
aiming to decrease repetitive movements.
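A minimal sketch of this kind of monitoring is given below: assuming the acceleration magnitude of a wristband accelerometer is available to the controller (here through a hypothetical readAccelMagnitude() placeholder), counting how often the signal crosses its recent mean gives a crude proxy for rhythmic, repetitive movement over a time window; the cited studies [52–54] use considerably richer classifiers.

```cpp
// Crude repetitive-movement monitor: counts mean-crossings of the acceleration
// magnitude over a sliding window. A high, steady crossing rate suggests
// rhythmic motion (e.g. hand flapping). For illustration only.

const int WINDOW = 200;          // samples per window (~10 s at 20 Hz)
float samples[WINDOW];
int idx = 0;

// Placeholder for a real wristband accelerometer reading (in g).
float readAccelMagnitude() { return 1.0; }

void setup() {
  Serial.begin(9600);
}

void loop() {
  samples[idx++] = readAccelMagnitude();
  if (idx == WINDOW) {
    float mean = 0;                              // mean of the window
    for (int i = 0; i < WINDOW; i++) mean += samples[i];
    mean /= WINDOW;

    int crossings = 0;                           // crossings of the mean
    for (int i = 1; i < WINDOW; i++) {
      bool above = samples[i] > mean, wasAbove = samples[i - 1] > mean;
      if (above != wasAbove) crossings++;
    }
    Serial.print("crossings per window: ");
    Serial.println(crossings);                   // could be logged to the tablet
    idx = 0;
  }
  delay(50);                                     // ~20 Hz sampling
}
```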
4 TWCs: architecture and functioning
4.1 Current prototype: the +me device
At our laboratory² we have developed a first concrete example of TWC [20].
This is still a partial implementation of the general idea, as it includes the
features of a STWC but currently it lacks the integration with biosensors. The
prototype is called +me³ (see figs. 2 and 3), and outwardly looks like a soft,
animal-shaped pillow (see fig. 4). The four paws form a collar so that the child
can wear the +me around the neck (see fig. 5).
An electronic device is embedded within the pillow padding (see fig. 6).
The device is composed of several commercial electronic components, partially
hosted on a customised printed circuit board (PCB). The various components,
¹ Commercial products for physiological data recording are now becoming available at relatively low prices, e.g. see www.empatica.com
² http://www.istc.cnr.it/group/locen
³ www.plusme.it
Fig. 3: An early prototype of +me showing different light responses based on
the location of the hand contact. The touch sensors embedded in the fabric
detect even soft caresses. The colours of the emitted lights and the sounds can be
remotely regulated via a tablet application (in the background).
Fig. 4: A +me prototype was presented at Maker Faire 2015, European Edition. The appearance of the device, characterised by an animal shape and softness, was clearly appealing to children.
described in fig. 7, manage the +me inputs and outputs. Four capacitive sensors are arranged under the cotton fabric cover of the pillow in correspondence with the paws and detect the child’s touch. Four 20-cm-long RGB LED strips are placed within the paws and can light the animal limbs with different colours. Two speakers are positioned in correspondence with the animal head, so that they are close to the child’s ears when the pillow is worn. The sound card component can play mp3 files which, in the current experimental setup, reproduce brief sounds or music. All activities are supported by two Arduino boards. The first board is the main controller of the +me and coordinates the software operations (input reading, output management, onboard computations). The second board is a slave controller employed for audio management.
Fig. 5: The +me worn around the neck.
Fig. 6: The PCB embedded in the +me padding.
The whole device is powered by a 12V LiPo rechargeable battery, ensuring
several hours of life and a safe low voltage. The software mastering the cause-
effect contingencies (e.g., how the lights and sounds are produced in response to
the child’s touch of the animal paws) can be modified in real time through an application running on a tablet (Android operating system). This application, coupled to the +me via a Bluetooth connection, can be controlled by therapists
and caregivers [28].
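To give a sense of how the listed components fit together in firmware, the following hedged sketch reads four capacitive channels through Adafruit’s MPR121 library and lights the matching TLC5940 channel, asking the slave audio board (over a serial link) to play a sound; pin assignments, the one-channel-per-paw mapping, and the “PLAY” command to the audio board are assumptions for illustration, not the actual +me code.

```cpp
// Illustrative firmware fragment for the main controller (not the actual +me code):
// read four MPR121 touch channels, light the matching TLC5940 LED channel, and
// ask the slave audio board (over Serial) to play a sound.
#include <Wire.h>
#include <Adafruit_MPR121.h>
#include <Tlc5940.h>

Adafruit_MPR121 touch = Adafruit_MPR121();

const int N_PAWS = 4;                // only 4 of the 12 MPR121 channels are used

void setup() {
  Serial.begin(9600);                // assumed link to the slave audio controller
  touch.begin(0x5A);                 // default I2C address of the MPR121 board
  Tlc.init();                        // start the TLC5940 LED driver
}

void loop() {
  uint16_t touched = touch.touched();        // bitmask of currently touched channels
  for (int paw = 0; paw < N_PAWS; paw++) {
    bool pressed = touched & (1 << paw);
    Tlc.set(paw, pressed ? 4095 : 0);        // light the paw's channel while touched
    if (pressed) {
      Serial.print("PLAY ");                 // hypothetical command to the audio board
      Serial.println(paw);
    }
  }
  Tlc.update();                              // push the new LED values to the driver
  delay(50);
}
```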
Fig. 7: Schema of the organisation of the +me electronic components. A cus-
tomised printed circuit board (PCB) hosts commercial components: two Ar-
duino Nano controllers (the first performing the main operations of the sys-
tem, the second performing audio operations); a HC-05 bluetooth module; a
capacitive touch sensor board (MPR121 by Adafruit); a TLC5940 LED
driver chip; an audio board (VS105 by Adafruit); a stereo 3.7W audio ampli-
fier (MAX98306 by Adafruit). The PCB is connected to: a pair of 4 Ohm, 3 W stereo speakers for sound output; four 20-cm-long 12 V RGB LED strips for
visual output; four circular patches of transparent conductive copper fabric
(by Plug&Wear) for touch input detection.
4.2 Future enhancements of the system
The system we developed, currently under test (see sec. 5), could be improved
in several ways. Here, we briefly illustrate three possible directions of improve-
ment that might be implemented independently or in synergy.
Biosensors integration. We are developing an improved +me prototype that
will include all the critical features required for a PSTWC. This version of
the system will be integrated with a wristband with low cost, low power,
and non-intrusive biosensors and accelerometers usable for physiological and
movement data collection. These will support the detection of HR, HRV, EDA,
Skin Temperature (SKT), and movement [52,53, 27, 21,6]. Such data will be
sent to the tablet application to allow the therapist/caregiver, once suitably
elaborated, to evaluate in real-time the levels of activity, stress, engagement,
and other emotional states of the child. This information will help the thera-
pist/caregiver to have a deeper understanding of the child’s emotional state as it might complement the possibly poor information received from the child’s fa-
cial expressions, bodily signals, and verbal communication. This will facilitate
a real-time fine tuning of the therapeutic/daily life interactions [7]. Moreover,
it will also allow the caregiver to learn faster the best way to deal with the child, for example to minimise the stress level: indeed, the caregiver will have real-time feedback on the effects of her/his own behaviour on the
child’s state. Information on the child’s state might also be directly exploited
by the wearable companion, for example to adjust the stimuli sent to the child based on sounds, coloured lights, and vibrations (fig. 1), especially if such information is suitably processed by an intelligent controller.
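The data path from the wristband to the tablet is still to be defined; as a hedged sketch, the fragment below packages hypothetical heart rate, skin conductance and skin temperature readings into a simple comma-separated line sent once per second over the Bluetooth link, which the tablet application could then parse, smooth and display to the therapist/caregiver.

```cpp
// Hypothetical telemetry fragment: once per second, send the latest biosensor
// readings to the tablet as a comma-separated line, e.g. "BIO,82.00,4.10,33.20".
#include <SoftwareSerial.h>

SoftwareSerial bt(12, 13);   // assumed HC-05 pins, as in the earlier command sketch

// Placeholders: a real PSTWC would read these from the wristband sensors.
float readHeartRate()       { return 82.0; }   // beats per minute
float readSkinConductance() { return 4.1;  }   // microsiemens
float readSkinTemperature() { return 33.2; }   // degrees Celsius

unsigned long lastSend = 0;

void setup() {
  bt.begin(9600);
}

void loop() {
  if (millis() - lastSend >= 1000) {           // one packet per second
    lastSend = millis();
    bt.print("BIO,");
    bt.print(readHeartRate());       bt.print(",");
    bt.print(readSkinConductance()); bt.print(",");
    bt.println(readSkinTemperature());
  }
}
```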
Intelligent controller. Future versions of the +me might be provided with
more artificial intelligence. This intelligence might be implemented in the main
Arduino controller embedded in the wearable companion, or reside remotely
in the tablet connected with the rest of the system via Bluetooth. More in-
telligence might be used to accomplish two classes of useful functions. First,
information from biosensors and other sensors of the system might be suit-
ably processed to produce complex knowledge on the child’s emotional state
and on the level and quality of his activity. This knowledge might be directly
delivered to a therapist/caregiver via the tablet or via the actuators of the
wearable companion (e.g., lights), or it might be used to suggest possible courses of action to the caregiver [55, 56]. For example, the wearable companion might
understand that the child is in a stressed condition and so suggest that the caregiver use a less intrusive, empathic approach; or it might realise that the child is in a low, bored state, and so suggest that the caregiver intervene and engage with her/him. This would thus support a suitable regulation of
the caregiver’s behaviour so as to improve social interaction and engagement.
Also, it could allow the caregiver to learn permanent skills to better interpret
the child’s language, facial expressions, and body signals. Second, a more in-
telligent controller might allow the wearable companion itself to autonomously
regulate its own behaviour. For example, the companion might monitor the
number and frequency of repetitive actions performed by the child on specific
parts of its body and understand that the child has got trapped in a stereotyped
behaviour. On this basis, the companion might decide to interrupt or change
the light/sound contingencies to lead the child to engage in other activities
that might be proposed to him/her by the companion itself.
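A minimal version of this autonomous regulation is sketched below, under the assumption (hypothetical, not the +me implementation) that the controller counts recent touches per paw: if one paw accumulates many touches within a short window while the others stay almost untouched, the companion flags a possible stereotyped loop and switches to an alternative contingency set meant to redirect the child’s attention.

```cpp
// Illustration of autonomous stereotypy handling (hypothetical logic): if one paw
// is touched many times in a short window while the others are ignored, switch
// to an alternative set of light/sound rules to try to break the loop.

const int N_PAWS = 4;
const unsigned long WINDOW_MS = 30000;   // 30-second observation window
const int REPEAT_THRESHOLD = 20;         // touches on one paw that count as "stuck"

int touchCount[N_PAWS] = {0, 0, 0, 0};
unsigned long windowStart = 0;
bool alternativeRules = false;           // which contingency set is active

// Called by the touch-handling code whenever paw i is pressed.
void registerTouch(int i) {
  touchCount[i]++;
}

void checkForStereotypy() {
  if (millis() - windowStart < WINDOW_MS) return;

  int total = 0, maxCount = 0;
  for (int i = 0; i < N_PAWS; i++) {
    total += touchCount[i];
    if (touchCount[i] > maxCount) maxCount = touchCount[i];
    touchCount[i] = 0;                   // reset the counters for the next window
  }
  windowStart = millis();

  // Many touches, nearly all on one paw: likely a repetitive loop.
  if (maxCount >= REPEAT_THRESHOLD && maxCount > 0.8 * total) {
    alternativeRules = !alternativeRules;   // change contingencies to break the loop
  }
}

void setup() { windowStart = millis(); }

void loop() {
  // ... touch handling would call registerTouch(i) here ...
  checkForStereotypy();
  delay(100);
}
```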
Smart moving textiles. The interaction features of +me could be greatly ex-
tended if movement capabilities were added to it. Although standard motors
(servos) could be embedded into the TWC, this solution is probably not suit-
able as such actuators are relatively heavy and non-compliant. A promising direction lies in the employment of “smart” textiles. This type of solution appears well suited to the soft nature of TWCs and would add further “robotic capabilities” to the device. Smart textiles are a new class of fabrics which incorporate wires containing shape memory materials (SMM), based on alloys [57] or polymers [58]. These materials have the interesting property of reacting to external stimuli, such as changes in temperature, by modifying their shape and size. This technology is finding interesting applications in areas such as clothing [59–62]; garments based on smart moving textiles, capable of shrinking, creasing and rolling up, are already available on the market. These interesting properties could be naturally exploited in the wearable
companion. For example, a +me using a smart fabric under the control of the
inner electronic apparatus could gently hug the child in correspondence with stress
peaks so as to increase the child’s feeling of protection.
5 Experimental protocol
A first experimental test with the current prototype has been designed and is now being implemented. The test will involve a selected sample of autistic children. The main aim of the experiment is to verify the utility of the STWC features implemented in the +me. In particular, we expect that the +me will be accepted by children as a comforting, lovable companion more readily than other toys because of its wearable features, interactivity, and social engagement
potentialities.
In detail, the experimental protocol will compare the +me (which has interactive, companion-like, and wearable properties) with other objects normally used during therapy, in particular: a Chicco® toy (interactive, non “companion-like”, non wearable); a plush toy (non interactive, companion-like, wearable); and wooden cubes (non interactive, non companion-like, non wearable). We will measure how the +me compares to
the other toys with respect to its capacity to capture the attention of chil-
dren, encourage engagement, and support social interaction. To do this, each object will be compared through a number of quantitative indices measuring its acceptability during five activities carried out with the children.
The sample of participants will be selected within the “Istituto Neurotrauma-
tologico Italiano (INI), Villa Dante Division” and will include patients with
a diagnosis of ASD and chronological age between 2 and 6 years. The five
activities will be videotaped and organised as follows.
Observation. In this activity, the child is free to interact with a test object
placed on a table 30 cm away from him/her (thus easily reachable by the child).
During the activity the experimenter observes the child-object interaction.
The activity will involve all the tested objects separately. This activity aims at verifying how much each tested object arouses the child’s interest.
Wearability. During this activity the experimenter places a test object on the
shoulders of the child and observes him/her. This activity is carried out only with the +me and the plush toy to see which of the two objects is better accepted and enjoyed by the child.
Ability to adjust cause-effect loop. This is an activity in which the child is
free to interact with the test object. The activity is conducted only with the
+me and the Chicco® toy. During the activity, in the case of the +me, if the child exhibits a repetitive behaviour the experimenter intervenes to break the repetitions through the tablet. By contrast, in the case of the Chicco® toy the experimenter intervenes by moving the child’s hand to another point of the object. This activity is intended to check how the +me compares to the Chicco® toy when used to quench stereotyped behaviours and fixations, problematic symptoms of ASD.
Activation request. This is an activity during which the experimenter places
the object on a table so that the child cannot reach it. The experimenter observes the child and gives the test object to him/her. After 10 seconds, the experimenter puts the test object back on the table and observes the child
again. This activity is intended to evaluate which object induces a request
from the child towards the experimenter in order to get the test object, thus
stimulating social engagement.
Imitative behaviour. This activity is divided into three different phases depending on the features of the objects. During the activity the experimenter executes particular actions on every object in front of the child. Then the experimenter gives the object to the child and observes the child’s possible imitative behaviours. This activity is carried out to verify which of the test objects incentivises imitative behaviours.
The quantitative behavioural indices that we will record based on camera
recordings and direct observation are as follows:
– how long the child touches the object (in seconds);
– how many times the child looks at the object (frequency);
– how many times the child looks at the experimenter;
– how many times the child refuses the object or throws it away;
– how many times the child turns away;
– how many times the child smiles;
– how many times the child cries;
– how many times the child engages in a triangulation of looks between the object and the experimenter;
– how many times the child touches the object and the experimenter;
– how many times the child performs indicative gestures.
6 Conclusions
This paper has proposed the principles of a new concept of social robot, called
transitional wearable companion – TWC, usable to support social interactions
of children with Autism Spectrum Disorder (ASD) and to improve their so-
cial skills in both therapeutic and ecological contexts (e.g., home and school).
TWCs are soft interactive social robots that give the child a continuous reas-
suring physical contact and respond to the child’s manipulations by emitting
lights, sounds, vibrations and other actions. The responses of the TWC are
usable to enhance the child’s engagement and allow the wearable companion to
communicate with him/her and/or with caregivers and therapists of the child.
TWCs could also include interfaces, such as tablets or computers, communi-
cating with the main device through Bluetooth (social TWC – STWC). This
opens up innumerable possibilities for therapists and caregivers to “get in the
loop” of the child-wearable companion interactions, for example to remotely
monitor the child and intervene on the behaviour of the wearable compan-
ion. Finally, STWCs could also include biosensors to collect information on
some physiological parameters of the child (physiological STWC – PSTWC).
This information could be the basis to produce knowledge on the emotional
state and activity of the child. This knowledge might be used by caregivers,
therapists and also by the wearable companion itself. In the future, PSTWCs’
potential to support children and caregivers might be further enhanced by increasing their artificial intelligence and by endowing them
with soft actuators allowing the motion of their body. Thanks to these highly
flexible features, TWCs could meet the need for customised and per-
sonalised health care products for ASD [30, 32], and thus become a very useful
tool to support both therapy and daily life social interactions of children with
such disorder.
Acknowledgements This research has received funds from the European Commission un-
der the 7th Framework Programme (FP7/2007-2013), ICT Challenge 2 “Cognitive Systems
and Robotics”, project “IM-CLeVeR - Intrinsically Motivated Cumulative Learning Versa-
tile Robots”, grant agreement no. ICT-IP-231722. The authors would like to thank M.
Aliberti, S. Scaffaro, A. Medda from INI Institute, Villa Dante for the collaboration in de-
veloping the TWC general idea, and M. Cicorella from MakeInBo for his help in developing
the +me hardware.
References
1. H. van Rijn and P. J. Stappers, “The Puzzling Life of Autistic Toddlers: Design Guide-
lines from the LINKX Project,” Advances in Human-Computer Interaction, vol. 2008,
pp. 1–8, 2008.
2. S. Baron-Cohen, O. Golan, and E. Ashwin, “Can emotion recognition be taught to
children with autism spectrum conditions?” Philosophical Transactions of the Royal
Society B: Biological Sciences, vol. 364, no. 1535, pp. 3567–3574, 2009.
3. R. el Kaliouby and P. Robinson, “Therapeutic Versus Prosthetic Assistive Technologies:
The Case Of Autism,” Computer Laboratory, University of Cambridge, Tech. Rep.,
2003.
4. E. I. Konstantinidis, A. Luneski, C. a. Frantzidis, P. Costas, and P. D. Bamidis, “A
proposed framework of an interactive semi-virtual environment for enhanced education
of children with autism spectrum disorders,” in 22nd IEEE International Symposium
on Computer-Based Medical Systems. Albuquerque, NM, USA: IEEE, August 2009,
pp. 1–6.
5. A. Funahashi, A. Gruebler, T. Aoki, H. Kadone, and K. Suzuki, “Brief report: the
smiles of a child with autism spectrum disorder during an animal-assisted activity may
facilitate social positive behaviors–quantitative analysis with smile-detecting interface.”
Journal of Autism and Developmental Disorders, vol. 44, no. 3, pp. 685–693, 2014.
6. A. Jimenez, “Physiological Sensor,” Ph.D. dissertation, University of Louisville, 2013,
Department of Electrical and Computer Engineering.
7. K. Welch, “Physiological signals of autistic children can be useful,” Instrumentation &
Measurement Magazine, IEEE, vol. 15, no. 1, pp. 28–32, 2012.
8. Y. Takano and K. Suzuki, “Affective communication aid using wearable devices based
on biosignals,” Proceedings of the 2014 conference on Interaction design and children
- IDC ’14, pp. 213–216, 2014.
9. A. Dsouza, M. Barretto, and V. Raman, “Uncommon Sense: Interactive sensory toys
that encourage social interaction among children with autism,” Workshop paper pre-
sented at IDC (Vol. 12), 2010.
10. S. Baron-Cohen, Mindblindness: An essay on autism and theory of mind, Ba, Ed. MIT
press, Cambridge, MA, 1995.
11. P. Pennisi, A. Tonacci, G. Tartarisco, L. Billeci, L. Ruta, S. Gangemi, and G. Pioggia,
“Autism and social robotics: A systematic review,” Autism Research, vol. 9, no. 2, pp.
165–183, 2016.
12. S. Powell, “The use of computers in teaching people with autism,” in Autism on the
agenda: papers from a National Autistic Society Conference. London, 1996.
13. W. Farr, N. Yuill, and H. Raffle, “Social benefits of a tangible user interface for children
with Autistic Spectrum Conditions.” Autism : the international journal of research and
practice, vol. 14, no. 3, pp. 237–52, 2010.
14. R. Luyster, K. Gotham, W. Guthrie, M. Coffing, R. Petrak, K. Pierce, S. Bishop, A. Es-
ler, V. Hus, R. Oti et al., “The autism diagnostic observation schedule—toddler module:
A new module of a standardized diagnostic measure for autism spectrum disorders,”
Journal of autism and developmental disorders, vol. 39, no. 9, pp. 1305–1320, 2009.
15. G. Pioggia, R. Igliozzi, M. Ferro, A. Ahluwalia, F. Muratori, and D. De Rossi, “An
android for enhancing social skills and emotion recognition in people with autism,”
IEEE Transaction on Neural Systems and Rehabilitation Engineering, vol. 13, no. 4,
pp. 507–515, 2005.
16. B. Robins, K. Dautenhahn, R. T. Boekhorst, and A. Billard, “Robotic assistants in ther-
apy and education of children with autism: can a small humanoid robot help encourage
social interaction skills?” Universal Access in the Information Society, vol. 4, no. 2, pp.
105–120, 2005.
17. M. Trimingham, “’Objects in transition: the puppet and the autistic child’,” Journal of
Applied Arts in Health, vol. 1, no. 3, pp. 251–265, 2010.
18. D. W. Winnicott, “Transitional objects and transitional phenomena a study of the first
not-me possession,” International Journal of Psychoanalysis, vol. 34, pp. 89–97, 1953.
19. O. Stevenson, “The First Treasured Possession: A Study of The Part Played by Specially
Loved Objects And Toys In The Lives of Certain Children,” The Psychoanalytic Study
of the Child, vol. 9, pp. 199–217, 1954.
20. B. Özcan, “Motivating Children with Autism to Communicate and Interact Socially
Through the + me Wearable Device,” in Nea-Science: Giornale Italiano di Neuro-
scienze, Psicologia e Riabilitazione, F. Paglieri and F. Ferretti, Eds., Salerno, Italy,
2014, pp. 65–71.
21. J. Z. Elias, P. B. Morrow, J. Streater, S. Gallagher, and S. M. Fiore, “Towards tri-
adic interactions in autism and beyond: Transitional objects, joint attention, and social
robotics,” Proceedings of the Human Factors and Ergonomics Society, vol. 55, no. 1,
pp. 1486–1490, 2011.
22. J. C. J. Brok and E. I. Barakova, “Engaging autistic children in imitation and turn-
taking games with multiagent system of interactive lighting blocks,” in Entertainment
Computing ICEC 2010, ser. Lecture Notes in Computer Science, vol. 6243. Springer,
2010, pp. 115–126.
23. D. Caligiore, P. Tommasino, V. Sperati, and G. Baldassarre, “Modular and hierarchical
brain organization to understand assimilation, accommodation and their relation to
autism in reaching tasks: a developmental robotics hypothesis,” Adaptive Behavior,
vol. 22, no. 5, pp. 304–329, 2014.
24. J. A. Kientz, G. R. Hayes, T. L. Westeyn, T. Starner, and G. D. Abowd, “Pervasive com-
puting and autism: Assisting caregivers of children with special needs,” IEEE Pervasive
Computing, no. 1, pp. 28–35, 2007.
25. R. W. Picard, “Future affective technology for autism and emotion communication.”
Philosophical transactions of the Royal Society B: Biological Sciences, vol. 364, no.
1535, pp. 3575–84, 2009.
26. M. C. Chang, L. D. Parham, E. I. Blanche, A. Schell, C.-P. Chou, M. Dawson, and
F. Clark, “Autonomic and behavioral responses of children with autism to auditory
stimuli,” American Journal of Occupational Therapy, vol. 66, no. 5, pp. 567–576, 2012.
27. A. Kushki, E. Drumm, M. P. Mobarak, N. Tanel, A. Dupuis, T. Chau, and E. Anagnos-
tou, “Investigating the autonomic nervous system response to anxiety in children with
autism spectrum disorders,” PLoS one, vol. 8, no. 4, p. e59730, 2013.
28. R. R. Fletcher, M. Z. Poh, and H. Eydgahi, “Wearable sensors: Opportunities and
challenges for low-cost health care,” in Proceedings of 32nd International Conference
of the IEEE Engineering in Medicine and Biology Society (EMBC’10), Buenos Aires,
Argentina, September 2010, pp. 1763–1766.
29. M. Campbell, E. Schopler, J. E. Cueva, and A. Hallin, “Treatment of autistic disorder,”
Journal of the American Academy of Child & Adolescent Psychiatry, vol. 35, no. 2, pp.
134–143, 1996.
30. N. Lofthouse, R. Hendren, E. Hurt, L. E. Arnold, and E. Butter, “A review of comple-
mentary and alternative treatments for autism spectrum disorders,” Autism research
and treatment, vol. 2012, 2012, article ID 870391.
31. A. Harris, J. Rick, V. Bonnett, N. Yuill, R. Fleck, P. Marshall, and Y. Rogers, “Around
the table: Are multiple-touch surfaces better than single-touch for children’s collabo-
rative interactions?” in Proceedings of the 9th International Conference on Computer
Supported Collaborative Learning (CSCL09), vol. 1, 2009, pp. 335–344.
32. B. Scassellati, H. Admoni, and M. Matarić, “Robots for Use in Autism Research,”
Annual Review of Biomedical Engineering, vol. 14, no. 1, pp. 275–294, 2012.
33. J. C. McPartland, M. Coffman, and K. A. Pelphrey, “Recent advances in understanding
the neural bases of autism spectrum disorder,” Current opinion in pediatrics, vol. 23,
no. 6, pp. 628–632, 2011.
34. P. Winkielman, “Embodied and disembodied processing of emotional expressions: In-
sights from autism spectrum disorders,” Behavioral and Brain Sciences, vol. 33, no. 6,
pp. 463–464, 2010.
35. H. Kozima, C. Nakagawa, and Y. Yasuda, “Interactive robots for communication-care:
A case-study in autism therapy,” in International Workshop on Robot and Human
Interactive Communication, (ROMAN 2005). IEEE, 2005, pp. 341–346.
36. H. Andreae, P. Andreae, J. Low, and D. Brown, “A Study Of Auti: A Socially Assis-
tive Robotic Toy,” in Proceedings of the 2014 Conference on Interaction Design and
Children (IDC ’14), 2014, pp. 245–248.
37. G. Owens, Y. Granader, A. Humphrey, and S. Baron-Cohen, “Lego® therapy and the
social use of language programme: An evaluation of two social skills interventions for
children with high functioning autism and asperger syndrome,” Journal of Autism and
Developmental Disorders, vol. 38, no. 10, pp. 1944–1957, 2008.
38. S. Schachter and J. Singer, “Cognitive, social, and physiological determinants of emo-
tional state.” Psychological Review, vol. 69, no. 5, pp. 379–399, 1962.
39. C. Liu, K. Conn, N. Sarkar, and W. Stone, “Physiology-based affect recognition for
computer-assisted intervention of children with Autism Spectrum Disorder,” Interna-
tional Journal of Human-Computer Studies, vol. 66, no. 9, pp. 662–677, 2008.
40. X. Ming, J. M. Bain, D. Smith, M. Brimacombe, G. G. Von-Simson, and F. B. Axelrod,
“Assessing autonomic dysfunction symptoms in children: a pilot study,” Journal of
Child Neurology, vol. 26, no. 4, pp. 420–427, 2011.
41. R. C. Schaaf, T. W. Benevides, B. E. Leiby, and J. A. Sendecki, “Autonomic dysregu-
lation during sensory stimulation in children with autism spectrum disorder,” Journal
of Autism and Developmental Disorders, vol. 45, no. 2, pp. 461–472, 2013.
42. I. Smeekens, R. Didden, and E. W. M. Verhoeven, “Exploring the relationship of au-
tonomic and endocrine activity with social functioning in adults with autism spectrum
disorders,” Journal of Autism and Developmental Disorders, vol. 45, no. 2, pp. 495–505,
2013.
43. Y. Wang, M. K. Hensley, A. Tasman, L. Sears, M. F. Casanova, and E. M. Sokhadze,
“Heart Rate Variability and Skin Conductance During Repetitive TMS Course in Chil-
dren with Autism,” Applied Psychophysiology and Biofeedback, vol. 41, no. 1, pp. 47–60,
2016.
44. J. Klusek, J. E. Roberts, and M. Losh, “Cardiac autonomic regulation in autism and
Fragile X syndrome: A review.” Psychological Bulletin, vol. 141, no. 1, pp. 141–175,
2015.
45. M. A. Patriquin, J. Lorenzi, and A. Scarpa, “Relationship between respiratory sinus ar-
rhythmia, heart period, and caregiver-reported language and cognitive delays in children
with autism spectrum disorders,” Applied psychophysiology and biofeedback, vol. 38,
no. 3, pp. 203–207, 2013.
46. S. W. Porges, “The polyvagal theory: phylogenetic substrates of a social nervous sys-
tem,” International Journal of Psychophysiology, vol. 42, no. 2, pp. 123–146, 2001.
47. C. Hutt, S. J. Forrest, and J. Richer, “Cardiac arrhythmia and behaviour in autistic
children,” Acta Psychiatrica Scandinavica, vol. 51, no. 5, pp. 361–372, 1975.
48. W. Hirstein, P. Iversen, and V. S. Ramachandran, “Autonomic responses of autistic
children to people and objects.” in Proceedings of the Royal Society of London, ser. B:
Biological Sciences, vol. 268, 2001, pp. 1883–1888.
49. R. J. Palkovitz and A. R. Wiesenfeld, “Differential autonomic responses of autistic and
normal children,” Journal of Autism and Developmental Disorders, vol. 10, no. 3, pp.
347–360, 1980.
50. B. Tegler, M. Sharp, and M. A. Johnson, “Ecological monitoring and assessment net-
work’s proposed core monitoring variables: An early warning of environmental change,”
Environmental Monitoring and Assessment, vol. 67, no. 1-2, pp. 29–56, 2001.
51. M.-Z. Poh, N. C. Swenson, and R. W. Picard, “A Wearable Sensor for Unobtrusive,
Long-Term Assessment of Electrodermal Activity,” IEEE Transactions on Biomedical
Engineering, vol. 57, no. 5, pp. 1243–1252, 2010.
52. F. Albinali, M. S. Goodwin, and S. S. Intille, “Recognizing stereotypical motor move-
ments in the laboratory and classroom: a case study with children on the autism spec-
trum,” in Proceedings of the 11th international conference on Ubiquitous computing,
2009, pp. 71–80.
53. C.-Y. Pan, C.-L. Tsai, K.-W. Hsieh, C.-H. Chu, Y.-L. Li, and S.-T. Huang,
“Accelerometer-determined physical activity among elementary school-aged children
with autism spectrum disorders in taiwan,” Research in Autism Spectrum Disorders,
vol. 5, no. 3, pp. 1042–1052, 2011.
54. C. C. Haswell, J. Izawa, L. R. Dowell, S. H. Mostofsky, and R. Shadmehr, “Represen-
tation of internal models of action in the autistic brain,” Nature neuroscience, vol. 12,
no. 8, pp. 970–972, 2009.
55. P. Rani and N. Sarkar, “Emotion-sensitive robots - a new paradigm for human-robot
interaction,” in 4th IEEE/RAS International Conference on Humanoid Robots. Los
Angeles, USA: IEEE, 2004, pp. 149–167.
56. K. H. Hyun, E. H. Kim, and Y. K. Kwak, “Emotional feature extraction method
based on the concentration of phoneme influence for human-robot interaction,” Advanced
robotics, pp. 47–67, 2010.
57. J. M. Jani, M. Leary, A. Subic, and M. A. Gibson, “A review of shape memory alloy
research, applications and opportunities,” Materials & Design, vol. 56, pp. 1078–1113,
2014.
58. D. Ratna and J. Karger-Kocsis, “Recent advances in shape memory polymers and com-
posites: a review,” Journal of Materials Science, vol. 43, no. 1, pp. 254–269, 2008.
59. J. Hu, Shape memory polymers and textiles. Elsevier, 2007.
60. J. Hu and S. Chen, “A review of actively moving polymers in textile applications,”
Journal of Materials Chemistry, vol. 20, no. 17, pp. 3346–3355, 2010.
61. Y. Y. C. Vili, “Investigating smart textiles based on shape memory materials,” Textile
Research Journal, vol. 77, no. 5, pp. 290–300, 2007.
62. L. Van Langenhove, C. Hertleer, and A. Schwarz, “Smart textiles: An overview,” in
Intelligent Textiles and Clothing for Ballistic and NBC Protection. Springer, 2012,
pp. 119–136.
Authors’ biographies:
Beste Özcan: in 2009 she got an M.A. at the Department of Interior Architec-
ture and Environmental Design, Faculty of Fine Arts, Hacettepe Univer-
sity, Ankara (Turkey). In 2014 she got an international Ph.D. in Design and Innovation at the Second University of Naples (Italy). She is interested
in cognitive sciences, interactive and wearable computing, social innova-
tion, AI, robots, the future, sci-fi, art, and everything in between. She is
currently an intern at the Institute of Cognitive Sciences and Technologies,
National Research Council, ISTC-CNR, (Rome, Italy).
(www.beste-ozcan.com).
Daniele Caligiore: he received a Master Degree in Electronics Engineering
at the University of Catania (Italy) in 2003, and a PhD in Biomedical En-
gineering at the University Campus Bio-Medico di Roma (Italy) in 2011.
During his PhD he was visiting scholar at the Centre for Robotics and
Neural Systems and at the School of Psychology (University of Plymouth,
UK) and at the Embodied Cognition Lab (University of Bologna, Italy). Since 2004 he has worked as a researcher with the Institute of Cognitive Sciences and Technologies, National Research Council (Rome, Italy). He has participated in several European projects in the field of embodied cognition
and developmental robotics: MindRACES – from Reactive to Anticipatory
Cognitive Embodied Systems; ROSSI – Emergence of communication in
RObots through Sensorimotor and Social Interaction; IM-CLeVeR – In-
trinsically Motivated Cumulative Learning Versatile Robots. His research
interests include developmental robotics, embodied cognition, system-level
computational neuroscience, brain cortical and sub-cortical hierarchies; re-
inforcement learning. He has authored/co-authored about 70 peer-reviewed
publications appeared in international journals, books and conference pro-
ceedings. Recently, he was guest-editor for a consensus paper of the journal
‘Cerebellum’ titled ‘Towards a systems-level view of cerebellar function: the
interplay between cerebellum, basal ganglia and cortex’.
(www.istc.cnr.it/people/daniele-caligiore).
Valerio Sperati: in 2006 he got an M.A. in Psychology at the University of Rome ’La Sapienza’. Since 2007 he has been a Temporary Researcher at the
Institute of Cognitive Science and Technologies, National Research Coun-
cil, ISTC-CNR (Rome, Italy), participating in the following European
projects: “ECAgents: Embodied and Communicating Agents”; “Swarmanoid:
Towards Humanoid Robotic Swarms”; “IM-CLeVeR: Intrinsically Moti-
vated Cumulative Learning Versatile Robots”. He is currently a Ph.D. student in Computer Science at the University of Plymouth (UK).
(www.istc.cnr.it/people/valerio-sperati).
Tania Moretta: in 2011 she got a B.A. in ’Sciences and Psychological Tech-
niques for Clinical Analysis and Evaluation of Cognitive Processes’, and in 2014
a M.A. in ’Cognitive Neuroscience and Psychological Rehabilitation’, both
at the University of Rome ’La Sapienza’. She is currently an intern at the Institute
of Cognitive Science and Technologies, National Research Council, ISTC-
CNR (Rome, Italy).
Gianluca Baldassarre: in 1998 he got a BA and MA in Economics at the
University of Rome ’La Sapienza’. In 1999 he got an MSc in Cognitive Psy-
chology and Neural Networks at the same university. In 2003 he got a PhD
in Computer Science at the University of Essex, UK (research on ’Planning
with Neural Networks’). Then he did a postdoc at the Institute of Cog-
nitive Sciences and Technologies, National Research Council, ISTC-CNR
(Rome, Italy), working on swarm robotics. Since 2006 he has been a Researcher at
ISTC-CNR and coordinates the Research Group called ‘LOCEN – Labora-
tory of Computational Embodied Neuroscience’. In 2006-2009 he was Team
Leader of the EU project ’ICEA – Integrating Cognition Emotion and Au-
tonomy’; in 2009-2013 he was the Coordinator of the European Integrated
Project ‘IM-CLeVeR – Intrinsically-Motivated Cumulative-Learning Versa-
tile Robots’; in 2016-2020 he is the Coordinator of the EU FET project
’GOAL-Robots - Goal-based Open-ended Autonomous Learning Robots’.
His research interests are on the architecture and learning mechanisms sup-
porting the cumulative acquisition of multiple sensorimotor skills, driven
by extrinsic and intrinsic motivations, in animals, humans, and robots.
He is also interested in higher-level embodied cognition grounded on and
serving those processes. He studies these topics following two synergis-
tic approaches: (a) through biologically-constrained computational models,
with the aim of understanding brain and behaviour; (b) through machine-
learning and robotic approaches, with the aim of producing new useful
artificial intelligence algorithms and robots. He has over 100 international
peer-reviewed publications.
(www.istc.cnr.it/people/gianluca-baldassarre).