Emotion Recognition Through Accelerometer
and Gyroscope Sensors: A Pilot Study
Michael Dela Fuente1,2(B), Carlo Inovero1,2, and Larry Vea2
1Polytechnic University of the Philippines, Manila, Philippines
mbdelafuente@pup.edu.ph
2Technological Institute of the Philippines, Manila, Philippines
Abstract. Human social interaction and behavior influence the ability to perceive emotions from people. Capturing these emotions through non-intrusive means allows data collection to be more natural and free flowing. Recent trends have used mobile phone sensors to collect data through non-intrusive means. This study explored using accelerometer and gyroscope sensors to capture human emotion during gameplay. We tested our dataset with J48, Random Forest, Naïve Bayes, Neural Network, and Support Vector Machine (SVM) using 10-fold cross-validation. We found that Random Forest proved to be the better model for identifying emotions using these sensors, even with data of limited quality.
Keywords: Emotion Recognition · Mobile Phone Sensors · Accelerometer · Gyroscope
1 Introduction
Advancements in computing have made it possible for people to use smartphones beyond the communication functions for which they were initially designed. People utilize their devices for various tasks, including ordering food, shopping, and entertainment. Mobile phones have progressively become more complex over time to suit increasing demands and needs. With mobile phones housing a variety of sensors, the utilization of these integrated sensors enables the deployment of new applications that were previously impractical.
Human social interaction and behavior are influenced by the ability to perceive
emotions from people [8]. Different approaches have been introduced to recognize emotion and behavior using technology. A common issue is that the captured emotion is unnatural, especially if the subject is aware of the activity. Capturing these emotions through non-intrusive means allows data collection to be more natural and free flowing, removing uneasiness from participants.
Recent work has used mobile phone sensors to collect data through non-intrusive means [2]. In particular, data from accelerometers and gyroscopes have been studied to recognize human emotions [7, 9]. This study explored using accelerometer and gyroscope sensors to capture human emotion during gameplay.
2 Emotion Recognition
Emotion recognition is simply the process of identifying human emotion. In psychology, emotion recognition refers to the attribution of observable emotional states based on visual or auditory nonverbal cues [1]. On the visual side, Ekman introduced six basic emotions: fear, anger, joy, sadness, disgust, and surprise [4].
With the advancement of technology, several techniques have been developed to
recognize emotions [2]. The most familiar of these is the use of a camera to identify visual cues, mainly in the face. However, many studies now look beyond the face and recognize emotion through other physiological signals.
In [3], emotional states were recognized from body motion patterns in depth map video sequences. Another study predicted emotions while a user walked with a mobile phone in the pocket [7]. Piskioulis, Tzafilkou, and Economides investigated the recognition of enjoyment and frustration while users interacted with a prototype gaming app [9]. Rumiantcev, on the other hand, investigated the use of data from wearable device sensors to recognize emotion and human activity [10]. Sensor data captured while a user types on a mobile device's keyboard was used to infer the user's emotional state [11]. Lastly, [6] utilized sensors in smartphones to estimate affect based on the link between affect and movement.
With these recent studies, interest in multimodal and non-intrusive approaches to recognizing emotion is growing. Notably, mobile phones and other wearable devices with sensors have become of interest to researchers.
3 Methodology
This pilot study utilizes a quasi-experimental design. Participants were asked to play a mobile game, and during gameplay, their actions and movements were recorded using a third-party application. Their facial expressions were recorded using another device, preferably a computer with a camera. The entire activity lasted about 10–15 min.
The data source for this experiment comes from the gameplay recordings of participants from the SHS/College. The following materials/resources were used in the conduct of the investigation:
• Android phone capable of recognizing movements through accelerometer and gyroscope sensors.
• Mobile game: in this case, Mobile Legends, a popular MOBA mobile game played by almost all age groups and very familiar to the target participants.
• Sensor Record: a free Android app designed by Martin Golpashin to track smartphone sensor data at high frequency, with the capability to store the information in external storage as a .csv file.
• Computer with a camera: used to capture the participant's facial expressions during gameplay.
This work is guided by [5], which presents the general emotion recognition process
(Fig. 1).
Fig. 1. General emotion recognition process
3.1 Sensors
The identified sensors for this research are the accelerometer and the gyroscope. The accelerometer measures the force of acceleration caused by movement, vibration, or gravity, while the gyroscope measures orientation or angular velocity. For this experiment, two types of accelerometer information were recorded: one including gravity (accelerometer) and one excluding gravity (accelerometer linear), both with motion and position detection. Thus, three streams of sensor data were recorded: accelerometer (with gravity), accelerometer linear (without gravity), and gyroscope.
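Sensor Record exports the linear-acceleration stream directly; purely as an illustration, the following is a minimal numpy sketch of the common way linear acceleration is derived from raw readings, estimating gravity with a low-pass filter and subtracting it. The app's exact method is not specified; ALPHA and the helper name are illustrative.

```python
import numpy as np

ALPHA = 0.8  # low-pass smoothing factor; 0.8 is a commonly used default

def split_gravity(raw_xyz: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Estimate the slow-moving gravity component with a low-pass filter
    and subtract it to obtain linear acceleration (without gravity)."""
    raw = np.asarray(raw_xyz, dtype=float)    # shape (n_samples, 3)
    gravity = np.empty_like(raw)
    g = raw[0]
    for i, sample in enumerate(raw):
        g = ALPHA * g + (1 - ALPHA) * sample  # exponential moving average
        gravity[i] = g
    return gravity, raw - gravity
```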
3.2 Ethical Considerations
Participants were oriented about the study, the information required, and the expected
output of the research. A consent form was obtained since the experiment required recording the participants on camera. Participation in the study was voluntary, and only those who provided their consent were asked to perform the task. In total, eight participants voluntarily joined the activity.
3.3 Procedures
In the conduct of the activity, some of the participants were observed in the presence of the researchers, while others were asked to submit their recordings online via Google Drive, following the steps presented below:
Pre-experiment Stage
1. Install Sensors Toolbox, Sensor Record, and Mobile Legends
2. Verify that your phone can run the experiment using Sensors Toolbox. Check that the device has the following sensors: accelerometer and gyroscope.
3. You can also verify it through Sensor Record. There should be Accelerometer,
Accelerometer Linear, and Gyroscope enabled.
4. Set up your computer for video recording. Make sure that your face is visible.
Experiment Stage
1. Start the video recording. Your face should be visible during the recording, so do not block your face with the mobile phone while playing.
2. Open Sensor Record and choose Accelerometer, Accelerometer Linear, and Gyroscope with 100 ms intervals, then press record (the red button at the bottom of the screen).
3. Open Mobile Legends and begin playing. A 10–15-min gameplay session is enough for this experiment.
Post-experiment Stage
1. After gameplay, stop both the video and the sensor recordings.
2. Go to bit.ly/ER-Game and create a folder with your full name. Upload the video recording and the .csv files created by Sensor Record (there should be three .csv files).
3.4 Data Acquisition
The primary data considered for this study are the generated data from mobile phones’
accelerometers and gyroscope sensors. For this purpose, a third-party application, Sensor
Record, was used. The participants were asked to play a game while the app was recording
their actions and bodily movement. At the same time, facial expressions are recorded
using the computer and camera.
At the end of this activity, three.csv files and a video recording serve as data for the
analysis. The.csv file contains the time stamp, the time in milliseconds with a 100 ms
interval, and the values for the X, Y, and Z coordinates.
The video recordings for each participant were viewed and analyzed, particularly the facial expressions made during gameplay. In the analysis, we identified the time at which a participant displayed an emotion and how long it lasted. We then matched this against the time in the .csv files and added an annotation indicating the specific emotion observed. Portions where no emotion was observed were annotated as "neutral."
Only the data from six participants were processed. The data from the other two participants were faulty: no emotion or change in facial expression was observed in one, while the other had missing sensor data as recorded by the app.
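A minimal sketch of this annotation step, assuming each sensor row carries a time_ms column and the emotion spans observed in the video have been noted as (start_ms, end_ms, label) triples; the span values below are made up for illustration:

```python
import pandas as pd

# Hypothetical (start_ms, end_ms, label) spans noted from the video.
annotations = [(120_000, 134_000, "happy"), (410_000, 426_000, "frustrated")]

def annotate(df: pd.DataFrame) -> pd.DataFrame:
    """Label each sensor row with the emotion observed at that time;
    rows outside every annotated span stay 'neutral'."""
    df = df.copy()
    df["emotion"] = "neutral"
    for start, end, label in annotations:
        df.loc[df["time_ms"].between(start, end), "emotion"] = label
    return df
```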
3.5 Data Pre-Processing
Once the raw .csv data from the sensors were obtained, the data had to be aggregated over intervals of size i. This aggregation takes every i lines of raw data and reduces them to a single line containing their means and standard deviations.
Before this, we removed the first and last 10 s (10,000 ms) of each .csv file to account for the actions of the participants before and after gameplay (e.g., switching windows from Sensor Record to the mobile game).
For the emotion label after aggregation, we determined the majority emotion within every i lines, which serves as the dominant emotion. In this study, we considered three emotions based on what was observed, namely happy, frustrated, and neutral.
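A minimal pandas sketch of this aggregation, under the assumed column names time_ms, x, y, z, and emotion from the annotation step above, with i = 50 rows (the interval later used for feature engineering):

```python
import pandas as pd

WINDOW = 50        # interval size i; Sect. 3.6 uses fifty rows
TRIM_MS = 10_000   # first and last 10 s to discard

def summarize(csv_path: str) -> pd.DataFrame:
    """Trim the recording, then collapse every WINDOW rows into one row
    of per-axis means/standard deviations plus the majority emotion."""
    df = pd.read_csv(csv_path)  # assumed columns: time_ms, x, y, z, emotion
    lo, hi = df["time_ms"].min() + TRIM_MS, df["time_ms"].max() - TRIM_MS
    df = df[df["time_ms"].between(lo, hi)].reset_index(drop=True)
    groups = df.groupby(df.index // WINDOW)
    out = groups[["x", "y", "z"]].agg(["mean", "std"])
    out.columns = ["_".join(c) for c in out.columns]  # x_mean, x_std, ...
    # The majority vote within each window becomes the dominant emotion.
    out["emotion"] = groups["emotion"].agg(lambda s: s.mode().iloc[0])
    return out
```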
3.6 Feature Engineering
In this study, we collected data from three sensors and computed the variance and distance for every fifty rows. We prepared a separate file for each sensor, combining the information collected from the six participants; each file contains the emotion, the variance for each coordinate, and the total distance. This serves as the dataset for model building and testing.
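A sketch of this computation, continuing with the same assumed columns; the paper does not state how the total distance is defined, so the sum of Euclidean step lengths below is an assumption:

```python
import numpy as np
import pandas as pd

def window_features(df: pd.DataFrame, window: int = 50) -> pd.DataFrame:
    """Per-axis variance and total travelled distance for each block of rows."""
    rows = []
    for start in range(0, len(df) - window + 1, window):
        chunk = df.iloc[start:start + window]
        steps = chunk[["x", "y", "z"]].diff().dropna()
        rows.append({
            "var_x": chunk["x"].var(),
            "var_y": chunk["y"].var(),
            "var_z": chunk["z"].var(),
            # Total distance as the sum of Euclidean step lengths (an
            # assumption; the paper does not spell out its distance formula).
            "distance": float(np.sqrt((steps ** 2).sum(axis=1)).sum()),
            "emotion": chunk["emotion"].mode().iloc[0],
        })
    feats = pd.DataFrame(rows)
    # Zero-distance windows point at recording dropouts and are removed.
    return feats[feats["distance"] > 0]
```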
We eliminated instances with 0 distance from the dataset to account for errors in the recordings where the third-party app stopped recording the participant's actions. From these, we established the following number of instances per sensor: accelerometer (134), accelerometer linear (102), and gyroscope (102). Table 1 presents a summary of the number of instances per emotion for each sensor.
Using Weka 3.8.6, we balanced the instances in each sensor dataset so that each class has the same total weight, using the ClassBalancer filter with ten discretization intervals.
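Outside Weka, the same reweighting can be approximated in a few lines; a sketch (the helper is illustrative, not Weka's implementation):

```python
import numpy as np
import pandas as pd

def class_balance_weights(y: pd.Series) -> np.ndarray:
    """Weight instances so each class carries the same total weight,
    mirroring the effect of Weka's ClassBalancer filter."""
    counts = y.value_counts()
    target = len(y) / len(counts)      # equal total weight per class
    return y.map(target / counts).to_numpy()
```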
Table 1. Number of Instances per Emotion for Each Sensor

Emotion        Accelerometer   Accelerometer Linear   Gyroscope
Happy          45              31                     31
Frustration    39              36                     36
Neutral        50              35                     35
Total          134             102                    102
3.7 Model Development
In this experiment, we are interested in the accuracy of the classification as well as the Precision, Recall, F-Measure, and Kappa values.
We used Weka 3.8.6 to create the models for this experiment. In this study, we tested our dataset with J48, Random Forest, Naïve Bayes, Neural Network, and Support Vector Machine (SVM). For our test option, we used cross-validation with ten folds. We applied the models to the three datasets and investigated which provided more substantial results.
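The models were built in Weka; for readers who prefer code, a rough scikit-learn analogue of the same protocol follows. The feature-file name and column names are hypothetical, carried over from the earlier sketches, and the scikit-learn learners are close cousins rather than exact re-implementations of Weka's:

```python
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

df = pd.read_csv("accelerometer_features.csv")  # hypothetical feature file
X = df[["var_x", "var_y", "var_z", "distance"]]
y = df["emotion"]

models = {
    "J48 (decision tree)": DecisionTreeClassifier(),
    "Random Forest": RandomForestClassifier(n_estimators=100),
    "Naive Bayes": GaussianNB(),
    "Neural Network": MLPClassifier(max_iter=1000),
    "SVM": SVC(),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10)  # 10-fold cross-validation
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```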
4 Results and Discussion
This section presents the results of our model development and experimentation. The presentation is divided into subsections for the results per sensor. At the end, we compare the performance of the models selected for this study.
4.1 Accelerometer
The computed values for accuracy, precision, recall, f-measure, and kappa statistics for
the accelerometer data are presented in Table 2.
Here we can see that for the accelerometer data, Random Forest and J48 provide higher accuracy ratings than the other models. Unfortunately, the values computed for precision, recall, f-measure, and kappa statistics are low, indicating a high incidence of misclassified instances in the dataset.
Table 2. Results of Model Testing for Accelerometer Data

Model            Accuracy    Precision   Recall   F-Measure   Kappa
J48              60.9573%    0.589       0.610    0.595       0.4144
Naïve Bayes      35.5271%    0.367       0.355    0.303       0.0329
Neural Network   44.4217%    0.467       0.444    0.407       0.1663
Random Forest    64.8148%    0.630       0.648    0.637       0.4722
Support Vector   33.2821%    0.374       0.333    0.270       0.0008
4.2 Accelerometer Linear
In the accelerometer linear data, we can establish that Random Forest and J48, as with the accelerometer data, gave the highest accuracy ratings compared with the other models tested.
Table 3. Results of Model Testing for Accelerometer Linear Data

Model            Accuracy    Precision   Recall   F-Measure   Kappa
J48              49.2260%    0.466       0.492    0.460       0.2384
Naïve Bayes      40.6503%    0.460       0.407    0.355       0.1098
Neural Network   40.9754%    0.422       0.410    0.379       0.1146
Random Forest    59.1091%    0.555       0.591    0.556       0.3866
Support Vector   34.3088%    0.366       0.343    0.285       0.0146
As with the accelerometer data, a high percentage of instances were misclassified, resulting in low values for precision, recall, f-measure, and kappa statistics. Table 3 summarizes the data collected from the experiment using accelerometer linear sensor data.
4.3 Gyroscope
For the results generated from the gyroscope sensor data, Random Forest provided the highest accuracy rating at 67.6805%, followed by J48 at 42.0601%.
Table 4. Results of Model Testing for Gyroscope Data

Model            Accuracy    Precision   Recall   F-Measure   Kappa
J48              42.0601%    0.407       0.421    0.407       0.1309
Naïve Bayes      39.9505%    0.467       0.400    0.354       0.0993
Neural Network   39.6604%    0.463       0.397    0.401       0.0949
Random Forest    67.6805%    0.667       0.677    0.659       0.5152
Support Vector   32.9638%    0.402       0.330    0.324       0.0055
The values generated for precision, recall, f-measure, and kappa statistics are again low, indicating a high rate of misclassified instances. Table 4 summarizes the data generated from the experiment using gyroscope sensor data.
4.4 Model Comparison
In Table 5 we present a comparison of the models across the different sensors using the accuracy values derived from the experiments. It is evident that Random Forest and J48 performed well across the various sensors. Both models are based on decision trees.
Table 5. Model Comparison Across Sensors

Model            Accelerometer   Accelerometer Linear   Gyroscope
J48              60.9573%        49.2260%               42.0601%
Naïve Bayes      35.5271%        40.6503%               39.9505%
Neural Network   44.4217%        40.9754%               39.6604%
Random Forest    64.8148%        59.1091%               67.6805%
Support Vector   33.2821%        34.3088%               32.9638%
The performance of the other three models, namely Naïve Bayes, Neural Network, and Support Vector Machine, can be attributed to the quality and type of data used, as all three perform best with high-dimensional data. In this case, the dataset used for the study has very limited dimensionality, and we conclude that this significantly affected these models' performance. Neural networks are known to be prone to overfitting in such settings and may not fully utilize their representation-learning capabilities. Naïve Bayes, though it works well with low-dimensional data, may have trouble capturing complicated feature relationships. Similarly, SVMs can still function in low-dimensional data but may be less effective at finding optimal decision boundaries.
4.5 Emotion Recognition
After model development and testing, we observed that among the emotions under study, frustration is the one most recognized by the models across the different sensor data. This is evident for the accelerometer and accelerometer linear data, as presented in Fig. 2.
Fig. 2. Dominant Emotion using Accelerometer and Accelerometer Linear Data
Results using gyroscope data gave varying classifications (Fig. 3). For Naïve Bayes and Support Vector, the dominant emotion is happiness. J48 and Random Forest identified frustration, while Neural Network identified the neutral emotion using gyroscope data.
4.6 Feature Selection
For feature selection, we used the two best models based on the results of the experiment, namely Random Forest and J48. To identify the features that contribute to performance, we performed attribute evaluation using learner-based feature selection, specifically the WrapperSubsetEval technique with BestFirst as the search method. Evaluation was done using 10-fold cross-validation.
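For readers working outside Weka, greedy wrapper selection can be approximated with scikit-learn; a minimal sketch, assuming the X and y from the earlier model-testing sketch (an analogue of WrapperSubsetEval with BestFirst, not the identical algorithm):

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SequentialFeatureSelector

# Greedy forward selection scored by 10-fold CV, a rough stand-in for
# Weka's WrapperSubsetEval + BestFirst search.
selector = SequentialFeatureSelector(
    RandomForestClassifier(n_estimators=100),
    n_features_to_select=1,   # the paper reports a single dominant feature
    direction="forward",
    cv=10,
)
selector.fit(X, y)            # X, y as in the model-testing sketch above
print(X.columns[selector.get_support()])
```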
For the accelerometer data, the feature selection identified the variance of the Z axis for J48 and the variance of the Y axis for Random Forest as the features contributing to the performance of emotion recognition.
For the accelerometer linear data, the variance of the Z axis is the best feature for J48, appearing in 9 of the 10 cross-validation folds. Likewise, Random Forest identified the variance of the Z axis as the best attribute.
Fig. 3. Dominant Emotion using Gyroscope Data
Table 6. Features Contributing to Emotion Recognition

                            J48               Random Forest
Accelerometer Data          VarZ (10 folds)   VarY (10 folds)
Accelerometer Linear Data   VarZ (9 folds)    VarZ (10 folds)
Gyroscope Data              VarZ (10 folds)   VarX (8 folds)
Finally, for the gyroscope data, the variance of the Z axis was identified as a contributing attribute for J48, while the variance of the X axis, which appeared in 8 folds of the cross-validation, was the contributing attribute for Random Forest. Table 6 summarizes the features contributing to emotion recognition.
4.7 Overall
We compared the accuracy of each sensor across the five models (see Fig. 4). Across all sensors, Random Forest provides the best rating in terms of correctly classified instances, whereas Support Vector provides the worst.
Data from the accelerometer (both with and without gravity) provides a useful basis for recognizing emotion, although its ratings are not far from those of the gyroscope.
The results of the experiments are less promising. This can be attributed to the
following:
1) The amount of data used. We were able to gather only very little data for this experiment. This is also apparent in the video recordings, where the identified emotions were very few.
2) The quality of data used. We noticed that relying on the third-party application is not advisable, as the recordings stopped at times; this is evident from the data collected. Also, the posture of the player while playing the game affected the movements (e.g., playing the game with arms resting on a table).
3) The type of game played. We believe that the type of game used for the experiment affected the quality of the data collected. The sensors used in this study are better suited to games that require a lot of movement.
Fig. 4. Comparison of Accuracy for Each Sensor Using Different Models
Although the results are less promising, we found that Random Forest proved to be the better model for identifying emotions using the sensors, even with data of limited quality. J48 was identified as the second best.
5 Conclusion
In this study, we investigated using accelerometer and gyroscope sensors to recognize emotions, particularly happiness and frustration. We recorded the data from the sensors using a third-party application while participants played a mobile game. The facial expressions of each participant were also recorded during gameplay. Emotions were identified from the video recordings, and the sensor data were annotated with the identified emotions. The collected data were then pre-processed in preparation for model development. We used Weka 3.8.6 to create the models for this study and tested our dataset with J48, Random Forest, Naïve Bayes, Neural Network, and Support Vector Machine (SVM). We found that Random Forest proved to be the better model for identifying emotions using the sensors, even with data of limited quality.
Across the three datasets from the accelerometer, accelerometer linear, and gyroscope, J48 identified the variance of the Z axis as a contributing attribute in recognizing emotion. Random Forest, on the other hand, identified the variance of the Y axis for the accelerometer data, the variance of the Z axis for the accelerometer linear data, and the variance of the X axis for the gyroscope data.
The results of the experiments were less promising due to several factors. Thus, we recommend further improving the study by enlarging the dataset and identifying more suitable materials for the experiments.
References
1. Banziger, T.: Measuring emotion recognition ability. In: Encyclopedia of Quality of Life and Well-Being Research, pp. 3934–3941. Springer, Dordrecht (2014). https://doi.org/10.1007/978-94-007-0753-5_4188
2. Dzedzickis, A., Kaklauskas, A., Bucinskas, V.: Human emotion recognition: review of sensors and methods. Sensors (Switz.) 20, 592 (2020). https://doi.org/10.3390/s20030592
3. Elfaramawy, N., Barros, P., Parisi, G.I., Wermter, S.: Emotion recognition from body expressions with a neural network architecture. In: HAI 2017 Proceedings of the 5th International Conference on Human Agent Interaction, pp. 143–149. Association for Computing Machinery, Inc. (2017). https://doi.org/10.1145/3125739.3125772
4. Gu, S., Wang, F., Patel, N.P., Bourgeois, J.A., Huang, J.H.: A model for basic emotions using observations of behavior in Drosophila. Front. Psychol. 10, 781 (2019). https://doi.org/10.3389/fpsyg.2019.00781
5. Kołakowska, A., Szwoch, W., Szwoch, M.: A review of emotion recognition methods based on data acquired via smartphone sensors. Sensors (Switz.) 20(21), 1–43 (2020). https://doi.org/10.3390/s20216367
6. Mottelson, A., Hornbæk, K.: An affect detection technique using mobile commodity sensors in the wild. In: UbiComp 2016 Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, pp. 781–792. Association for Computing Machinery, Inc. (2016). https://doi.org/10.1145/2971648.2971654
7. Olsen, A.F., Torresen, J.: Smartphone accelerometer data used for detecting human emotions. In: Proceedings of the 2016 3rd International Conference on Systems and Informatics (ICSAI 2016), Shanghai, China, 19–21 November 2016, pp. 410–415 (2016)
8. Picard, R.W.: Affective Computing (1995). Retrieved from https://affect.media.mit.edu/pdfs/95.picard.pdf
9. Piskioulis, O., Tzafilkou, K., Economides, A.A.: Emotion detection through smartphone's accelerometer and gyroscope sensors. In: UMAP 2021 Proceedings of the 29th ACM Conference on User Modeling, Adaptation and Personalization, pp. 130–137. Association for Computing Machinery, Inc. (2021). https://doi.org/10.1145/3450613.3456822
10. Rumiantcev, M.: Emotions and activity recognition system using wearable device sensors. In: Conference of Open Innovation Association, FRUCT. IEEE Computer Society (2021). https://doi.org/10.23919/FRUCT50888.2021.9347652
11. Zualkernan, I., Aloul, F., Shapsough, S., Hesham, A., El-Khorzaty, Y.: Emotion recognition using mobile phones. Comput. Electr. Eng. 60, 1–13 (2017). https://doi.org/10.1016/j.compeleceng.2017.05.004