Archived project

Affective Body Language of Humanoid Robots

Goal: To enable robots to express affect during task execution, we integrate bodily expression of mood with functional behaviors. To this end, we propose a parameterized behavior model in which behavior parameters control the spatial and temporal extent of a behavior. Modulating these parameters provides affective behavioral cues within the behavior. Thus, moods can be expressed by executing the same behavior in different "styles", rather than by adding body actions to show emotions.

Date: 14 December 2015

Updates: 1
Recommendations: 0
Followers: 6
Reads: 47

Project log

Koen V. Hindriks
added an update
Junchao's thesis can be found here: .
 
Such an interesting topic to follow; it could be a useful citation for further work on my side.
 
Koen V. Hindriks
added 7 research items
Bodily expression of affect is crucial to human-robot interaction. We distinguish between emotion and mood expression, and focus on mood expression. Bodily expression of an emotion is explicit behavior that typically interrupts ongoing functional behavior. Bodily mood expression, in contrast, is integrated with functional behaviors without interrupting them. We propose a parameterized behavior model with specific behavior parameters for bodily mood expression. Robot mood controls pose and motion parameters, and these parameters modulate the appearance of the behavior. We applied the model to two concrete behaviors (waving and pointing) of the NAO robot and conducted a user study in which participants (N=24) were asked to design the expression of positive, neutral, and negative moods by modulating the parameters of the two behaviors. Results show that participants created distinct parameter settings corresponding to different moods, and that the settings were generally consistent across participants. Several parameter settings were also found to be behavior-invariant. These findings suggest that our model and parameter set are promising for expressing moods across a variety of behaviors.
Bodily expression of affect is crucial to human-robot interaction. Our work aims at designing bodily expression of mood that does not interrupt ongoing functional behaviors. We propose a behavior model containing specific (pose and motion) parameters that characterize a behavior. Parameter modulation provides behavior variations through which affective behavioral cues can be integrated into the behavior. To investigate our model and parameter set, we applied the model to two concrete behaviors (waving and pointing) on a NAO robot and conducted a user study in which participants (N=24) were asked to design such variations corresponding to positive, neutral, and negative moods. Preliminary results indicated that most parameters varied significantly with the mood variable. The results also suggested that the relative importance of the parameters may differ and that parameters are probably interrelated. This paper presents the analysis of these aspects. The results show that the spatial extent parameters (hand-height and amplitude), the head vertical position, and the temporal parameter (motion-speed) are the most important parameters. Moreover, multiple parameters were found to be interrelated; these parameters should be modulated in combination to provide particular affective cues. These results suggest that a designer should focus on the most important behavior parameters and make use of parameter combinations when designing mood expression.
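As an illustration of how a designer might use such findings, the sketch below drives only the important parameter group with a shared mood factor, so that interrelated cues shift together. The grouping of important parameters follows the abstract above; the remaining parameter names and the gain values are hypothetical.

```python
# Sketch of prioritizing parameters when designing a mood expression.
# The important group follows the abstract (spatial extent, head vertical
# position, motion speed); the unimportant names and all gains are hypothetical.

IMPORTANT = ["hand_height", "amplitude", "head_pitch", "motion_speed"]
UNIMPORTANT = ["palm_direction", "hold_time"]  # hypothetical, for illustration only

def apply_mood(settings: dict[str, float], mood: float,
               gains: dict[str, float]) -> dict[str, float]:
    """Shift interrelated parameters together by a shared mood factor, so the
    combined cues stay consistent (e.g., higher hand, larger amplitude, and
    faster motion for a positive mood)."""
    return {name: value + gains.get(name, 0.0) * mood
            for name, value in settings.items()}

neutral_wave = {"hand_height": 0.5, "amplitude": 0.6, "head_pitch": 0.0,
                "motion_speed": 0.5, "palm_direction": 0.0, "hold_time": 0.4}
gains = {name: 0.3 for name in IMPORTANT}  # only the important group is driven

print(apply_mood(neutral_wave, mood=+1.0, gains=gains))
print(apply_mood(neutral_wave, mood=-1.0, gains=gains))
```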
Our goal is to develop bodily mood expression that can be used during the execution of functional behaviors of humanoid social robots. Our model generates such expression by stylizing behaviors, modulating behavior parameters within functional bounds. We applied this approach to two behaviors, waving and pointing, and obtained parameter settings corresponding to different moods, as well as interrelations between parameters, from a design experiment. This paper reports an evaluation of these parameter settings in a recognition experiment under three conditions: modulating all parameters, only the important parameters, and only the unimportant parameters. The results show that valence and arousal are recognized well when the important parameters are modulated. Modulating only the unimportant parameters is promising for expressing weak moods. Speed parameters, repetition, and head-up-down were found to correlate with arousal, while speed parameters, when slow, may correlate more with valence than with arousal.
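The three recognition conditions could be realized as below, assuming the same kind of per-parameter mood gains as in the earlier sketches; the specific parameter names and gain values are hypothetical.

```python
# Sketch of the three evaluation conditions: modulate all parameters,
# only the important ones, or only the unimportant ones.
# Parameter names and gain values are illustrative assumptions.

ALL_GAINS = {"hand_height": 0.3, "amplitude": 0.3, "head_pitch": 0.2,
             "motion_speed": 0.3, "repetition": 0.5, "hold_time": -0.2}
IMPORTANT = {"hand_height", "amplitude", "head_pitch", "motion_speed"}

def gains_for(condition: str) -> dict[str, float]:
    """Return the mood gains to use in a given experimental condition."""
    if condition == "all":
        return dict(ALL_GAINS)
    if condition == "important":
        return {k: v for k, v in ALL_GAINS.items() if k in IMPORTANT}
    if condition == "unimportant":
        return {k: v for k, v in ALL_GAINS.items() if k not in IMPORTANT}
    raise ValueError(f"unknown condition: {condition}")

for condition in ("all", "important", "unimportant"):
    print(condition, sorted(gains_for(condition)))
```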
Koen V. Hindriks
added a project goal