
Emotion Detection from Gait and Thermal Face for Social Robot


Abstract

I received the "Best Poster Prize" at the "Journée de l'ED Interfaces", where all second-year Ph.D. students from 4 universities (École Polytechnique, ENSTA-ParisTech, CentraleSupélec, and UVSQ - Université de Versailles Saint-Quentin-en-Yvelines) and about 3 disciplines (Information and Computer Science, Biology, and Chemistry) took part in the poster session. Human emotion detection is vital for a social robot during human-robot interaction. In our research, we propose a hybrid machine learning model with multimodal data: the gait data from Kinect and the thermal facial images from the thermal camera. The model can help the social robot classify 4 emotions, namely neutral, happy, angry, and sad. Research goals: (1) How to detect emotion from the human face and gait for the social robot. (2) How to improve the accuracy of emotion classification for the social robot.
Emotion Detection from Gait and Thermal Face for Social Robot
Chuang YU
chuang.yu@ensta-paristech.fr
Adriana TAPUS
adriana.tapus@ensta-paristech.fr
Autonomous Systems and Robotics Laboratory, U2IS, ENSTA-ParisTech, Palaiseau, France
1. Introduction
- Human emotion detection is vital for a social robot during human-robot interaction.
- In our research, we propose a hybrid machine learning model with multimodal data: the gait data from Kinect and the thermal facial images from the thermal camera.
- The model can help the social robot classify 4 emotions, namely neutral, happy, angry, and sad.
- Research goals:
  (1) How to detect emotion from the human face and gait for the social robot.
  (2) How to improve the accuracy of emotion classification for the social robot.
2. Experiment scenario
- Sensors and robot:
  Pepper robot for interaction
  Thermal camera Optris PI640
  Microsoft Kinect V2
- Participants and questionnaire:
  8 women and 8 men
  Half extroverted, half introverted
  Eysenck Personality Questionnaire
- Setting:
  Walking area: 1-6 m
  Standing duration: 10 seconds
  Order of conditions: Balanced Latin Squares
  20 experiments per participant
3. Data Extraction and Analysis
For gait, the angles of the body joints are strongly related to human emotion.
The left/right hip/knee angles and their velocities are obtained through the Kinect SDK.
Filtering the data leads to a better accuracy of the emotion classifier.
For the face, the temperature of the nose and cheeks is sensitive to human emotions.
The face ROI (Region of Interest) is detected with the Dlib library.
The average temperatures of the left/right cheek and the nose are recorded.
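The joint angles above can be sketched as follows: a minimal Python example of computing an angle at a joint from three Kinect 3D joint positions, plus its angular velocity by finite differences. The function name, the example coordinates, and the 30 fps frame rate are illustrative assumptions, not the poster's exact pipeline.

```python
import numpy as np

def joint_angle(parent, joint, child):
    """Angle (degrees) at `joint` between the segments to `parent`
    and `child`, e.g. the knee angle from hip-knee-ankle positions.
    Each argument is a 3D joint position from the Kinect skeleton."""
    v1 = np.asarray(parent) - np.asarray(joint)
    v2 = np.asarray(child) - np.asarray(joint)
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Example (hypothetical coordinates): a nearly straight leg gives
# an angle close to 180 degrees.
hip, knee, ankle = [0.0, 1.0, 0.0], [0.0, 0.5, 0.05], [0.0, 0.0, 0.0]
angle = joint_angle(hip, knee, ankle)

# Angular velocity: finite difference of the angle series over frames.
angles = np.array([170.0, 168.5, 166.0, 165.2])   # degrees, one per frame
fps = 30.0                                         # Kinect V2 frame rate
velocity = np.diff(angles) * fps                   # degrees per second
```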
4. Emotion Features Extraction
Gait Features:
8 PSD (Power Spectral Density) features of the joint angles and their velocities are used as gait features.
The PSD describes the power present in a signal as a function of frequency.
Welch's method is employed to calculate the PSD.
Thermal Face Features:
The average temperatures of the nose and the left/right cheek are used as the thermal face features.
The feature value distributions of the 4 emotions show that the emotions can be classified.
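Welch's method, as used above for the gait PSD features, can be sketched with `scipy.signal.welch`. The synthetic knee-angle signal, the 0.5-3 Hz band, and the band-power summary are illustrative assumptions; the poster does not specify which PSD values form the 8 features.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
fs = 30.0                         # Kinect V2 frame rate (Hz)
t = np.arange(0, 10, 1 / fs)      # 10 s of walking
# Synthetic knee-angle signal: ~1 Hz gait cycle plus sensor noise.
knee_angle = 160 + 10 * np.sin(2 * np.pi * 1.0 * t) \
             + rng.normal(0, 1, t.size)

# Welch's method: average periodograms over overlapping segments,
# trading frequency resolution for a lower-variance PSD estimate.
freqs, psd = welch(knee_angle, fs=fs, nperseg=64)

# One simple feature per signal: total power in a plausible gait band.
gait_band = (freqs >= 0.5) & (freqs <= 3.0)
feature = psd[gait_band].sum() * (freqs[1] - freqs[0])
```

The PSD peak lands near the 1 Hz gait cycle, which is what makes such features discriminative across walking styles.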
5. Hybrid Emotion Classification Model and Accuracy
Hybrid model building
Firstly, an RF (Random Forest) model of the 4 emotions is trained with only the gait features, and another with only the thermal face features.
Secondly, 6 RF models, one per pair of emotions, are trained with all features.
Lastly, the test data is fed into the hybrid model to classify human emotions.
The hybrid model with all features has a higher accuracy than a model with only one feature modality.
Accuracy of hybrid model
In this paper, we also evaluate CNN, HMM, and SVM models with only the gait features.
The hybrid model reaches a higher accuracy (80%) than the CNN, HMM, and SVM.
The CNN would need a bigger database to obtain a better result.
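The two-stage hybrid model can be sketched with scikit-learn. This is a hedged reconstruction: the routing rule (take the two most likely emotions from the averaged stage-1 probabilities, then let the matching pairwise RF decide) and the synthetic feature dimensions (8 gait + 3 thermal) are assumptions, since the poster does not spell out how the stages are combined.

```python
import numpy as np
from itertools import combinations
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
EMOTIONS = ["neutral", "happy", "angry", "sad"]

# Synthetic stand-in data: 8 gait features + 3 thermal-face features.
n = 200
y = rng.integers(0, 4, n)
X_gait = rng.normal(0, 1, (n, 8)) + y[:, None] * 0.8
X_face = rng.normal(0, 1, (n, 3)) + y[:, None] * 0.8
X_all = np.hstack([X_gait, X_face])

# Stage 1: one 4-emotion RF per single modality.
rf_gait = RandomForestClassifier(random_state=0).fit(X_gait, y)
rf_face = RandomForestClassifier(random_state=0).fit(X_face, y)

# Stage 2: one pairwise RF per emotion pair (6 pairs), on all features.
pair_rf = {}
for a, b in combinations(range(4), 2):
    mask = np.isin(y, (a, b))
    pair_rf[(a, b)] = RandomForestClassifier(random_state=0).fit(
        X_all[mask], y[mask])

def predict(x_gait, x_face):
    """Average the two stage-1 probability vectors, keep the two most
    likely emotions, and let the matching pairwise RF decide (assumed
    routing rule)."""
    x_all = np.hstack([x_gait, x_face])
    proba = (rf_gait.predict_proba([x_gait])[0]
             + rf_face.predict_proba([x_face])[0]) / 2
    a, b = sorted(np.argsort(proba)[-2:])
    return EMOTIONS[pair_rf[(a, b)].predict([x_all])[0]]
```

The design intuition matches the poster's claim: the pairwise stage sees all features, so the final decision between the two closest candidate emotions uses more evidence than either single-modality model alone.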
6. Online Testing on Pepper Robot
During online testing, the human walks towards the Pepper robot and stands in front of it to interact. Pepper then behaves according to the emotion it detects.
Acknowledgments
The authors thank the Chinese Scholarship Council (CSC). This work is supported by the CSC scholarship.
Unité d'Informatique et d'Ingénierie des Systèmes - ENSTA ParisTech - http://u2is.ensta-paristech.fr
828, Boulevard des Maréchaux, 91762, Palaiseau Cedex
bruno.monsuez@ensta-paristech.fr - 0181872030
École Nationale Supérieure de Techniques Avancées