Detecting the learner’s emotional state is of major importance to
intelligent tutoring systems [3, 4].
Study Objective:
Predict facial expressions from physiological (EEG) signals only.
Gather data from:
Electroencephalography (EEG) headset (Emotiv EPOC)
Camera-based facial expression detection program (FACET)
Picture stimuli from the IAPS (International Affective Picture System) (5)
Train machine learning algorithms:
Using facial expression data as ground truth
Building EEG-based facial expression models
Using our approach, we have reached 92% accuracy.
Table 2: Performance of Machine Learning methods
In order to elicit the participants’ emotions:
Experiment with 20 participants (7 women, 13 men, aged 21-35)
Using 30 selected IAPS pictures with different affective ratings.
To build the EEG-based affective model:
Record EEG and facial micro-expressions (1) of users exposed to
IAPS pictures, using the Emotiv headset and FACET software.
For each FACET frame (every 1/6 sec), compute the frequency-
domain (2) and time-domain (3) features from the corresponding
1-sec sliding window of EEG data.
Train machine learning models (4) that correlate the micro-
expression probabilities with the EEG features.
Test the accuracy of the models in predicting the emotions from
the EEG alone.
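The alignment step above can be sketched as follows. This is a hedged illustration, not the authors' code: the 128 Hz sampling rate and all function names are assumptions; only the 1/6-sec FACET frame period and the 1-sec EEG window come from the poster.

```python
import numpy as np

FS = 128           # assumed Emotiv EPOC sampling rate (Hz)
FRAME_DT = 1 / 6   # FACET frame period (s), from the poster
WIN = FS           # 1-sec window -> FS samples

def windows_for_frames(eeg, duration_s):
    """Pair each FACET frame time with the preceding 1 sec of EEG samples."""
    pairs = []
    n_frames = int((duration_s - 1.0) / FRAME_DT) + 1
    for k in range(n_frames):
        t = 1.0 + k * FRAME_DT          # first full window exists at t = 1 s
        end = int(round(t * FS))
        pairs.append((t, eeg[end - WIN:end]))
    return pairs
```

Each returned pair (one per FACET frame) is then turned into one training row: EEG features computed from the window, micro-expression probabilities from the frame.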
1. Littlewort, G., et al. The Computer Expression Recognition Toolbox (CERT). In: IEEE International Conference on Automatic Face & Gesture Recognition and Workshops (FG 2011). IEEE, 2011.
2. Ekman, P. Emotions Revealed: Recognizing Faces and Feelings to Improve Communication and Emotional Life. Macmillan, 2007.
3. Chaouachi, M. and Frasson, C. Mental workload, engagement and emotions: an exploratory study for intelligent tutoring systems. In: Intelligent Tutoring Systems. Springer, 2012.
4. Liu, Y., Sourina, O. and Nguyen, M.K. Real-time EEG-based emotion recognition and its applications. In: Transactions on Computational Science. Springer, 2011, pp. 256-277.
5. Lang, P.J., Bradley, M.M. and Cuthbert, B.N. International Affective Picture System (IAPS): Affective ratings of pictures and instruction manual. Technical Report A-8, 2008.
Three machine learning algorithms were used to predict the numeric
values of each emotion category:
IBk (k-nearest neighbours classifier),
Random Forest (classifier constructing a forest of random decision trees),
RepTree (fast decision tree learner).
We used 10-fold cross-validation in the test phase.
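The comparison above can be sketched with scikit-learn stand-ins. The poster's IBk, Random Forest and RepTree are Weka implementations; here KNeighborsRegressor, RandomForestRegressor and DecisionTreeRegressor serve as rough analogues, and the EEG feature matrix is replaced by synthetic data with illustrative shapes.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor

# Synthetic stand-in data: 17 features (5 band powers + 12 statistics is an
# assumed layout) and one numeric emotion score per window.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 17))
y = 0.8 * X[:, 0] + rng.normal(scale=0.1, size=200)

models = {
    "IBk (k=1)": KNeighborsRegressor(n_neighbors=1),
    "Random Forest": RandomForestRegressor(n_estimators=100, random_state=0),
    "RepTree (approx.)": DecisionTreeRegressor(random_state=0),
}
cv = KFold(n_splits=10, shuffle=True, random_state=0)  # 10-fold CV, as in the poster
for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=cv, scoring="r2")
    print(f"{name}: mean R^2 = {r2.mean():.2f}")
```

One model is trained per emotion category, each predicting that category's numeric value from the EEG features.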
In this study, we investigated the predictability of facial micro-
expressions from EEG signals.
Our methodology and experimental design ensured a reliable dataset.
From this dataset, EEG-based facial expression models were extracted
with high accuracy.
In future work, we will integrate our affective models into a real-
time application to support adaptation in an intelligent tutoring system.
Predicting Spontaneous Facial Expressions from EEG
Mohamed S. Benlamine1, Maher Chaouachi1, Claude Frasson1 and Aude Dufresne2
1Computer Science and Operations Research Department, University of Montreal
2Communication Department, University of Montreal
Random Forest outperforms the IBk (k = 1 neighbour) and RepTree
methods, with a higher correlation coefficient and lower error rates
for all emotion categories.
Our approach provides a simple and reliable way to capture the
emotional reactions of the user, applicable to intelligent tutoring
systems, learning, games, neurofeedback, and VR environments.
Figure 1: Pipeline of dataset construction for EEG-based facial expressions recognition
Table 1. The computed features from EEG signals
Frequency-domain EEG features: band power in delta [–4 Hz[, theta [4–8 Hz[, alpha [8–12 Hz[, beta [12–25 Hz[, and gamma [25–40 Hz[ bands.
Time-domain EEG features (12 features): Mean, Standard Error, Median, Mode, Standard Deviation, Sample Variance, Kurtosis, Skewness, Range, Minimum, Maximum, and Sum.
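The Table 1 features can be sketched for a single 1-sec window as follows. This is an illustrative reconstruction, not the authors' code: the delta lower bound is unreadable in the source, so 1 Hz is an assumption, and the statistics garbled out of the 12-item list (Mean, Range, Sum) are inferred from the count and ordering.

```python
import numpy as np
from scipy import signal, stats

# Band edges from Table 1; the delta lower bound (1 Hz) is assumed.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 12),
         "beta": (12, 25), "gamma": (25, 40)}

def band_powers(window, fs=128):
    """Power per band from a Welch periodogram of one 1-sec window."""
    freqs, psd = signal.welch(window, fs=fs, nperseg=len(window))
    return {name: float(psd[(freqs >= lo) & (freqs < hi)].sum())
            for name, (lo, hi) in BANDS.items()}

def time_domain_features(window):
    """The 12 descriptive statistics of Table 1 (some names inferred)."""
    w = np.asarray(window, dtype=float)
    return {
        "mean": w.mean(),
        "standard_error": w.std(ddof=1) / np.sqrt(len(w)),
        "median": float(np.median(w)),
        "mode": float(stats.mode(w, keepdims=False).mode),
        "standard_deviation": w.std(ddof=1),
        "sample_variance": w.var(ddof=1),
        "kurtosis": float(stats.kurtosis(w)),
        "skewness": float(stats.skew(w)),
        "range": w.max() - w.min(),
        "minimum": w.min(),
        "maximum": w.max(),
        "sum": w.sum(),
    }
```

Concatenating the 5 band powers with these 12 statistics yields one feature vector per FACET frame.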