Differentiation in Personality-Emotion Mappings From Self-Reported Emotion and Automatically Classified Facial Expressions

Ryan Donovan1, Aoife Johnson1, and Ruairi O'Reilly1

Cork Institute of Technology, Ireland
Abstract. How does the relationship between personality traits and the basic emotions vary across the modalities of self-report and facial expression analysis? This article presents the results of an exploratory study that quantifies consistencies and differences in personality-emotion mappings across these two modalities. Twenty-four participants answered a personality questionnaire before watching twelve emotionally provocative videos. Participants self-reported their emotional reactions per video, while their facial expressions were recorded for automated emotional analysis. The results indicated that, overall, there was greater consistency than difference in personality-emotion mappings across the two modalities. The robustness of this relationship enables direct applications of emotional-state-to-personality-trait mappings in academic and industrial domains.

Keywords: Personality · Five-Factor Model · Emotion · Facial Expression Analysis · Multimodal.
1 Introduction

Cognitive science has shown that the functioning of both personality and emotion is necessary for positive well-being. Personality represents the idiosyncratic way we perceive, feel, and interact with the world. It is a psychological system that structures our desires and goals, and our methods for fulfilling those wants and attaining those goals in the medium to long term. Emotions have historically been considered an impediment to clear thinking and action. However, emotions guide our thoughts, behaviour, feelings, and motivation towards stimuli that can satisfy our needs and desires in the present; emotions are signposts towards our destination, not obstacles. People with a malfunctioning personality are aimless; people with malfunctioning emotions are chaotic.
There exists a wealth of research investigating the phenomena of personality and emotion. However, only a small proportion of such research has focused on how these two phenomena interact with one another. The results of such research show that there exists a quantifiable link between personality and emotion. The existence of this relationship enables potential applications in the domains of academia (e.g. understanding the affective nature of our personality), clinical care (e.g. personality-based screenings for the onset of affective disorders), and occupational and marketing settings (e.g. personalised content).
However, whilst the potential for applying personality-emotion mappings in practical domains is exciting, there is a need to assess the robustness and generalisability of this relationship. If emotions are a reliable indicator of personality (or vice versa), then it needs to be demonstrated that this relationship is robust across important factors. Otherwise, the effectiveness of such applications will be erratic and imprecise.
In terms of generalisability, an important factor is the modality of emotional expression, e.g. subjective self-report and facial expressions. Research methodology for assessing personality, emotion, and their relationship is largely reliant on self-report-based questionnaires. From a researcher's point of view, self-report-based questionnaires are cost-effective and quick to administer. However, this reliance on questionnaires can weaken the validity of results in cases where repeated self-report is required per participant (e.g. "retest artifact" effects) and can be an obstacle to recruiting participants due to the time taken to complete such questionnaires. If the results from self-report-based questionnaires can generalise across modalities, then this enables alternative, automatic methods of data capture that require minimal input from participants.
How consistently do emotional-state-to-personality-trait mappings converge across multiple modalities? This paper describes an exploratory research experiment that investigated this question. The experiment investigated the level of consistency of personality-to-emotion mappings across the modalities of self-report and facial expressions. If a large degree of consistency can be shown between self-reported and automatically extracted emotions, then this fortifies the concept of "state-to-trait" mappings as a usable tool.
Personality was conceptualised as personality traits, which are the typical expressions of cognition, behaviour, affect, and motivation across time. The bedrock model of personality traits is the Five-Factor Model (FFM), which categorises personality across five broad traits: Openness to Experience, Conscientiousness, Extraversion, Agreeableness, and Neuroticism. Emotion was conceptualised as the basic emotions, which are a group of distinct emotions that are reliably indicated by psychological, behavioural, and physiological signals. The basic emotions considered in this article are Anger, Disgust, Fear, Joy, Sadness, and Surprise.
The research study utilised a machine learning-based detection platform, Emotion Viewer, to automate the classification of facial expressions. Emotion Viewer analyses real-time video to automatically classify emotions via facial expressions from video recordings of participants taking part in the study. Facial landmarks are defined as the detection and localisation of certain key points on a human face. The Emotion Viewer was trained on two data sets (Cohn-Kanade and Multimedia Understanding Group) to detect the basic emotions. The Emotion Viewer was then tested with real-time video clips, resulting in an average accuracy of 88.76% per basic emotion. The Emotion Viewer provided a strong foundation for analysing differences between personality-to-emotion mappings across modalities.
The paper is structured in the following manner. Section 2 describes the experiment's methodology in terms of the design, participant pool, materials used, and procedure. Section 3 presents the results of descriptive and inferential statistical analyses. Section 4 discusses the results in the context of the research question. Section 5 concludes the paper and provides recommendations for future research in this area.
2 Methodology

The experimental method adopted generated a range of interesting results worthy of publication. Experimental results quantifying the links between personality traits and basic emotions via self-reported emotions were published previously. In this paper, that work is extended to include analysis of automated emotion categorisation. The work focuses on the differentiation in emotion-personality mappings between self-reported emotions and automated emotion detection from facial expressions.
2.1 Participants

The sample consisted of 24 participants (n = 24, females = 16, males = 8, mean age = 31.96, SD = 13.73) from a subset of 38 participants. The age range of the sample (range = 19-63) is larger than in most social science research, which tends to consist primarily of 18-23-year-old undergraduate students.
2.2 Materials

Emotions Scale - A Likert scale was created for the purposes of this study. Participants were asked to answer the question "While watching the previous video, to what extent did you experience the following emotions? Please use the following scale in your self-assessment: 1 = Not at all; 2 = A little bit; 3 = Moderately; 4 = A lot; 5 = A great deal/an extreme amount". This scale was designed to capture the experience of each emotion, but not the level of valence or arousal. Participants completed the scale 12 times each, consisting of 84 questions overall (with alpha = .95). Each individual emotion was assessed 12 times throughout the duration of the study: Anger (alpha = .80), Disgust (alpha = .83), Fear (alpha = .85), Sadness (alpha = .76), Joy (alpha = .66), Surprise (alpha = .88).
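The per-emotion alpha values above are internal-consistency (Cronbach's alpha) estimates, which can be reproduced from the raw Likert responses. The following is a minimal sketch, assuming responses are arranged one row per respondent and one column per item; the function name and data layout are illustrative, not taken from the study's analysis code.

```python
from statistics import variance

def cronbach_alpha(responses):
    """Cronbach's alpha for a scale.

    responses: list of rows, one per respondent, each a list of
    item scores (e.g. 1-5 Likert ratings for one emotion's items).
    """
    k = len(responses[0])                                # number of items
    item_vars = sum(variance(col) for col in zip(*responses))
    total_var = variance([sum(row) for row in responses])
    return k / (k - 1) * (1 - item_vars / total_var)

# Perfectly consistent items (every item repeats the respondent's score)
perfect = [[s] * 4 for s in (1, 2, 3, 4, 5)]
print(round(cronbach_alpha(perfect), 2))  # 1.0
```

Values near 1 indicate that the items for a given emotion move together across respondents, as with the alpha = .95 figure reported for the overall scale.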
Personality Questionnaire - The Big Five Aspects Scale is a reliable measure of the Big Five traits and their associated sub-traits. The scale is composed of 100 questions. The results showed that the scale had satisfactory test-retest reliability across each FFM trait: Openness to Experience (.81), Conscientiousness (.85), Extraversion (.85), Agreeableness (.74), and Neuroticism.
Technology - Participants' reactions were recorded with the camera of a 2015 MacBook Pro, which included a 1080p web-camera. The software Screen-Cast-O-Matic was used to record both the participant and the MacBook screen simultaneously. Participants were also given a pair of Bose QuietComfort noise-cancelling headphones to wear whilst watching the video clips.
Emotional Stimuli - Previous research by independent groups has demonstrated that video clips from movies and TV shows are a reliable method for evoking emotional reactions. Twelve video clips were used in this experimental design. Nine of those video clips were chosen based on prior research demonstrating their ability to evoke emotional reactions. Three new video clips were also selected, the rationale being that the nine tested video clips provided a solid foundation against which to empirically evaluate new stimuli. The overall list of videos is presented in Table 2, along with each clip's length and the emotional reaction it was expected to evoke.
Emotion Viewer - The Emotion Viewer analyses real-time video to automatically classify emotions using a support vector machine. Figure 1 depicts the tool in operation. The tool comes with three options: track face, track expressions, and set the voting count. The first two are required to enable facial expression analysis. The voting count refers to the number of consecutive classified emotions, on a frame-by-frame basis, required to register a particular emotion. The voting count for this research study was set to 10, the highest supported by the Emotion Viewer. This means that for the Emotion Viewer to output an emotion classification (for example, Anger) it would require 10 consecutive frames in which it detected the emotion Anger. This represents a conservative approach and was chosen to reduce the risk of Type I errors. More information on the design of the Emotion Viewer is available in the platform's original publication.
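As an illustration of how such a voting count filters frame-level predictions, the sketch below registers an emotion only after a run of identical consecutive classifications. Whether the real Emotion Viewer resets its counter after each registration is an assumption; the function and its names are illustrative only.

```python
def register_emotions(frame_labels, voting_count=10):
    """Register an emotion only after `voting_count` consecutive,
    identical frame-level classifications (a conservative filter
    against transient misclassifications)."""
    registered = []
    run_label, run_len = None, 0
    for label in frame_labels:
        run_len = run_len + 1 if label == run_label else 1
        run_label = label
        if run_len == voting_count:
            registered.append(label)
            run_label, run_len = None, 0  # assumed: counter resets

    return registered

# Nine frames of Anger followed by one of Joy never reach the threshold.
print(register_emotions(["Anger"] * 9 + ["Joy"]))   # []
print(register_emotions(["Anger"] * 10))            # ['Anger']
```

A higher voting count trades sensitivity for precision: brief, spurious detections are suppressed, at the cost of missing short-lived genuine expressions.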
2.3 Procedure for Participants
Participants initially filled out a demographic information form online, which included questions regarding their age, gender, nationality, and previous experience with psychometric tests. Participants were then invited to the laboratory stage of the experiment, provided they fit the inclusion criteria for the study (over 18 years of age and not diagnosed with an affective disorder).
In the first part of the laboratory stage, participants completed the Big Five Aspects Scale questionnaire, which on average took about 15 minutes to complete. In the second part of the laboratory stage, the researcher set up the video recording on the MacBook Pro.

Fig. 1. The Emotion Viewer platform. Examples shown are of automated emotional analysis of Joy (Happy), Anger, and Surprise.

Given that participants varied in height, a manual check was required to ensure that each participant's face occupied the camera frame. Once this was done, the researcher would leave the room, and the participants watched the 12 video clips, always in the same order (see Table 2).
Participants were alone when viewing the videos, to elicit a more natural reaction and to prevent them from feeling self-conscious about their response. Participants wore the noise-cancelling headphones whilst watching the videos, to help immerse themselves in each video. After each video, participants completed a short emotion questionnaire asking how they felt whilst watching the clip. Overall, the study took each participant approximately 1 hour to complete.
3 Results

This section presents the key descriptive and inferential statistics from the study in relation to the level of differentiation in mappings between personality and emotions across modalities.
3.1 Descriptive Statistics
Emotions - Participants did not seem to experience a dominant emotion throughout the study. The means and standard deviations for both self-reported and automatically classified emotions are presented in Table 1.
Self-Reported Emotions per Video Clip - Table 2 presents the mean and
standard deviation for self-reported emotional reactions per video clip.
Personality Self-Reported Scores - The descriptive statistics for self-reported
personality scores are presented in Table 3.
Emotions M SD Emotions M SD
A-Anger 1.96 1.15 SR-Anger 1.92 0.55
A-Fear 1.93 1.27 SR-Fear 2.00 0.70
A-Joy 1.56 1.07 SR-Joy 1.54 0.34
A-Sadness 2.06 1.24 SR-Sadness 2.22 0.41
A-Surprise 2.33 1.29 SR-Surprise 2.78 0.79
A-Disgust 1.42 0.87 SR-Disgust 2.50 0.51
Table 1. Descriptive Statistics for Automatically Classified (A) and Self-Reported (SR) Emotions.
View Sequence: Source Anger Disgust Fear Joy Sadness Surprise
1: When Harry Met Sally 1.30 1.53 1.03 3.23 1.00 2.50
2: Peep Show 1.87 1.80 1.37 2.10 2.10 2.87
3: The Champ 1.77 1.53 1.63 1.17 4.13 1.80
4: The Lion King 2.77 2.33 2.27 1.20 4.20 1.77
5: Schindler's List 3.93 4.10 1.93 1.03 3.30 2.40
6: American History X 3.67 4.33 3.00 1.03 3.57 3.13
7: Trainspotting 1.30 4.47 1.20 2.17 1.47 3.37
8: Pink Flamingo 1.67 4.67 1.50 1.17 1.33 3.87
9: Test Your Awareness 1.03 1.03 1.07 2.50 1.30 4.43
10: Sea of Love 1.07 1.00 2.30 1.00 1.00 2.43
11: Annabelle 1.37 1.63 4.10 1.10 1.57 3.20
12: The Blair Witch Project 1.33 1.40 3.53 1.07 1.67 2.23
Table 2. Mean Emotion per Video Clip Across Sample.
Personality Traits M SD Personality Traits M SD
Openness to Experience 3.53 0.42 Agreeableness 1.03 0.29
>Openness 1.87 0.51 >Compassion 1.37 0.40
>Intellect 1.77 0.51 >Politeness 1.63 0.36
Conscientiousness 2.77 0.53 Neuroticism 2.27 0.45
>Industriousness 3.93 0.56 >Withdrawal 1.93 0.45
>Orderliness 3.67 0.60 >Volatility 3.00 0.67
Extraversion 1.30 0.43
>Enthusiasm 1.67 0.47
>Assertiveness 1.03 0.60
Table 3. Descriptive Statistics for Personality Traits.
3.2 Inferential Statistics

Relationship Between Personality Traits and Self-Reported and Automatically Extracted Emotions - Table 4 presents the correlated mappings for the sub-sample (n = 24). These results are presented as a means of comparing the mappings between personality traits, self-reported emotions, and automatically classified emotions.
Traits Anger Fear Joy Sadness Surprise Disgust
SR—A SR—A SR—A SR—A SR—A SR—A
Openness to Experience 0.01 -0.21 -0.04 0.05 0.20 0.29 0.11 -0.18 0.09 0.22 0.10 0.19
Openness 0.05 -0.06 -0.06 -0.03 0.17 0.28 0.10 -0.14 0.26 0.18 0.18 0.07
Intellect -0.03 -0.28 0.00 0.11 0.16 0.20 0.08 -0.16 -0.12 0.18 -0.02 0.25
Conscientiousness 0.16 0.58 -0.06 0.01 0.01 0.09 0.30 -0.30 -0.11 -0.19 0.10 0.10
Industriousness 0.11 0.51 -0.15 0.04 -0.15 -0.16 0.19 -0.12 -0.19 -0.14 -0.01 -0.01
Orderliness 0.16 0.50 0.03 -0.01 0.14 0.28 0.33 -0.39 -0.02 -0.19 0.18 0.17
Extraversion -0.13 0.15 -0.39 0.07 0.12 0.16 -0.09 -0.14 0.03 -0.12 -0.08 -0.18
Enthusiasm -0.13 0.17 -0.35 0.08 0.12 0.18 -0.19 -0.18 0.04 -0.07 -0.04 -0.20
Assertiveness -0.08 0.09 -0.29 0.04 0.08 0.09 0.03 -0.06 0.00 -0.13 -0.09 -0.10
Agreeableness -0.13 0.20 -0.16 -0.15 -0.35 -0.08 0.07 -0.34 0.15 0.26 0.13 0.05
Compassion -0.15 0.13 -0.29 -0.18 -0.32 0.04 0.01 -0.32 0.23 0.18 0.06 0.06
Politeness -0.04 0.18 0.07 -0.04 -0.20 -0.16 0.10 -0.19 -0.02 0.21 0.13 0.01
Neuroticism 0.22 0.26 0.14 -0.42 0.40 0.14 0.41 0.08 0.21 0.05 0.25 0.28
Withdrawal -0.01 0.04 0.24 -0.21 0.50 0.02 0.14 0.10 0.32 0.21 0.15 -0.03
Volatility 0.29 0.32 0.03 -0.42 0.20 0.18 0.45 0.04 0.07 -0.07 0.24 0.39
Table 4. Bi-Directional Mappings of Personality Traits and the Basic Emotions for both Self-Reported (SR) and Automatically Classified (A) Emotions.
Relationship between Personality Traits and Automatically Classified Emotions - Conscientiousness positively correlated with recognition of Anger with a large effect size (df = 23, p ≤ 0.01, r = 0.58). The sub-traits of Conscientiousness, Industriousness and Orderliness, also positively correlated with Anger with large effect sizes (df = 23, p ≤ 0.05, r = 0.51; df = 23, p ≤ 0.05, r = 0.50). Orderliness negatively correlated with recognition of Sadness with a medium effect size (df = 23, p = .059, r = -0.39).

Neuroticism negatively correlated with recognition of Fear with a medium-to-large effect size (df = 23, p = .04, r = -0.41). The sub-trait Volatility negatively correlated with recognition of Fear with a medium-to-large effect size (df = 23, p = .04, r = -0.42). Volatility positively correlated with recognition of Disgust with a medium effect size (df = 23, p = .05, r = 0.39).
Relationship between Self-Reported and Automatically Extracted Emotions - A Pearson correlation was conducted on the matrix of self-reported and automatically classified emotions.

Fig. 2. Overlap between automatic emotions and self-reported emotions. Green indicates consistency, white no overlap, and red differences. The numbers within each cell indicate the Pearson r effect size.

A matrix depicting the level of consistency and difference between self-reported emotions and automatically classified emotional expressions is presented in Figure 2, along with the effect size for each comparison. In terms of consistency/difference for same emotions across modalities, 5 of the 7 emotions are positively correlated, including Fear (df = 23, p = 0.10), Sadness (df = 23, p = 0.65), and Surprise (df = 23, p = 0.08). The emotions Disgust (df = 23, p = 0.72) and Anger (df = 23, p = 0.88) diverge, correlating negatively with their modality counterpart.
4 Discussion

This study aimed to investigate whether the relationship between personality traits and emotions is robust enough to generalise across the modalities of self-report and facial expression analysis. The results showed that there existed more consistency than difference in personality-emotion mappings. This section discusses both (i) which personality-emotion mappings showed greater consistency than difference and (ii) which personality-emotion mappings showed greater difference than consistency. Potential reasons for (i) and (ii) are also discussed.
Additionally, an evaluation of the video clips used is conducted, and recommendations for future researchers interested in employing video clips in emotion-elicitation research are provided.
4.1 Consistency in Personality to Self-Reported and Automatically Extracted Emotion Mappings

A consistent mapping is defined here as similarity in the direction of correlation (e.g. both positive or both negative) and the existence of non-trivial effect sizes (both must be greater than r = ±.10) across both self-reported and automatically classified emotions. Overall, 11 of the 15 personality traits showed greater consistency than difference across both modalities.
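The definition above, together with its counterpart for differences in Section 4.2, can be expressed as a simple decision rule. The threshold of r = ±.10 is taken from the text; the function name and the "trivial" label for below-threshold pairs are illustrative assumptions.

```python
def classify_mapping(r_sr, r_auto, threshold=0.10):
    """Classify one trait-emotion mapping across the two modalities.

    Consistent: same direction and both effect sizes non-trivial.
    Different:  opposite directions and both non-trivial.
    Trivial:    at least one |r| does not exceed the threshold.
    """
    if abs(r_sr) <= threshold or abs(r_auto) <= threshold:
        return "trivial"
    return "consistent" if (r_sr > 0) == (r_auto > 0) else "different"

# Conscientiousness-Anger from Table 4: SR r = 0.16, A r = 0.58
print(classify_mapping(0.16, 0.58))  # consistent
```

Applying such a rule to every trait-emotion pair in Table 4 yields the per-trait tallies of consistency and difference discussed in this section.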
Openness to Experience mapped consistently for Joy and Disgust. Openness mapped consistently for Joy and Surprise; Intellect mapped consistently for Joy. The largest consistency found across both emotion modalities for these traits was for the emotion Joy, which positively correlated with small-to-medium effect sizes. This is consistent with prior research linking Openness to Experience with positive emotion systems in the brain.

Conscientiousness mapped consistently for the emotions Anger, Surprise, and Disgust. Industriousness mapped consistently for Anger and Surprise; Orderliness mapped consistently for Anger and Disgust. The positive relationship found between Conscientiousness and Anger was the strongest across the entire data set for both self-reported and automatically classified emotions.

Extraversion mapped consistently for Joy. Enthusiasm mapped consistently for Joy and Sadness; Assertiveness did not map consistently across any emotion. The consistent relationship found between Extraversion and Joy is in line with past research findings showing that people high in Extraversion experience more joy on a daily basis.

Agreeableness mapped consistently for Fear and Surprise. Compassion also mapped consistently for Fear and Surprise; Politeness mapped consistently for Joy. As detailed in the next sub-section, there was a considerable amount of difference between the modalities.

Neuroticism mapped consistently for Anger, Joy, and Disgust. Withdrawal mapped consistently for Sadness and Surprise; Volatility mapped consistently for Anger, Joy, and Disgust. The positive correlation found between Joy and the personality traits associated with Neuroticism is inconsistent with past research. Neuroticism has consistently been considered a negatively valenced personality trait and has been repeatedly linked to the experience of mental illness.
4.2 Differences in Personality to Self-Reported and Automatically Extracted Emotion Mappings

An inconsistent mapping (difference) is defined here as when the direction of correlation for the two modalities is opposite (e.g. one positive, one negative) and non-trivial effect sizes exist (both must be greater than r = ±.10). Overall, 2 of the 15 personality traits showed greater inconsistency (difference) than consistency across both modalities.
Openness to Experience mapped differently for Sadness. Openness also mapped differently for Sadness; Intellect mapped differently for Surprise and Disgust. For both Openness to Experience and Openness, there were more consistencies than differences across modalities. The opposite was true for the sub-trait Intellect.

Conscientiousness mapped differently for Sadness. Industriousness and Orderliness also mapped differently for Sadness. Overall, there were more consistencies than differences for Conscientiousness and its two sub-traits.

Extraversion mapped differently for Anger. Enthusiasm also mapped differently for Anger; Assertiveness did not map differently across the two modalities. It should be noted that although the differences between self-reported and automatically classified Fear were striking, they did not meet our threshold.
Agreeableness mapped differently for Anger. Compassion also mapped differently for Anger. Politeness mapped differently for Anger and Sadness. Anger was the major source of divergence for mappings related to Agreeableness. Agreeableness, and in particular Politeness, has been linked to a reduced experience of Anger. However, it is an open question whether people high in these traits experience less anger or are less likely to report experiencing anger. The self-reported emotion mappings support the first hypothesis, that people high in these traits are less likely to experience Anger. However, the automated emotional analysis suggests that these participants do experience Anger but are less likely to recognise or admit it. This latter finding is consistent with the description of people high in Politeness withholding their feelings to avoid conflict.
Neuroticism mapped differently for Fear. Withdrawal also mapped differently for Fear; Volatility did not map differently. There existed several mappings with Neuroticism that were stronger in a given modality (e.g. Sadness); however, these mappings did not have a large enough effect size in both modalities to meaningfully compare and contrast.
4.3 Recommendations for Emotional Stimuli

Twelve video clips were used as emotional stimuli in this research study. Three of those videos had not been previously used in prior experimental research. Two of those new videos, Annabelle and Who Dunnit? Test Your Awareness, successfully elicited high levels of self-reported emotion. Annabelle evoked the strongest mean experience of Fear across the participant group, and Who Dunnit? Test Your Awareness evoked the strongest mean experience of Surprise. Both videos are recommended as reliable emotional stimuli for future research. However, the third inclusion, Peep Show, is not recommended for future usage. The results of this study also cast suspicion on the reliability of the video clip Sea of Love, which has been used in prior research experiments but evoked only a minimal amount of its targeted emotion, Surprise. Researchers requiring a video to reliably produce Surprise are recommended to use the video clip Who Dunnit? Test Your Awareness instead.
5 Conclusion

Personality traits and basic emotions are significant predictors of human behaviour, and the functioning of both phenomena is necessary for positive well-being. Previous research has found quantifiable links between these two phenomena that can enable state-trait inferences, i.e. personality-emotion mappings. However, empirical observations of personality, emotion, and their relationship are largely reliant on self-report-based methodology (i.e. questionnaires). This reliance on self-report limits the validity of empirical research in direct (e.g. "retest artifact") and indirect ways (e.g. it makes participant recruitment more difficult).
This paper described an empirical research study that tested the generalisability of personality-emotion mappings across a self-report-based approach and an automatically classified, emotion-based approach via video. If personality-emotion mappings were robust across both modalities, then this would be an indicator that technology-based approaches can directly analyse personality, emotion, and their relationship. Technology-based approaches can enable intelligent and automatic observations of these phenomena with minimal input from participants.
The results showed greater consistency than difference in personality-emotion mappings across the two modalities. For the 15 personality traits captured in this study, the results showed that: (i) 11 personality traits showed greater consistency than difference in personality-emotion mappings across modalities; (ii) 2 traits showed an equal amount of consistency and difference in personality-emotion mappings across modalities; and (iii) 2 traits showed greater difference than consistency in personality-emotion mappings across modalities.
The criteria for assessing a personality-emotion mapping were two-fold: a comparison of the directions of the correlations, and the existence of non-trivial effect sizes. However, there is scope for future research to assess the degree of consistency/difference between the two modalities. Quantifying the degree of consistency/difference would enable assessments of the strengths and weaknesses of the self-report and automatically classified approaches.
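One simple way to quantify the degree of consistency/difference, rather than only its direction, is the absolute gap between the two modality correlations per mapping, averaged per trait. The sketch below is one possible such metric, offered as an assumption about how this future work might look, not a method used in the study.

```python
def rank_traits_by_divergence(mappings):
    """mappings: {trait: [(r_sr, r_auto), ...]}, one pair per emotion.
    Scores each trait by its mean absolute cross-modal gap in r
    (0 = identical mappings, 2 = maximally opposed), and returns
    traits ordered from most to least divergent."""
    score = {trait: sum(abs(s - a) for s, a in pairs) / len(pairs)
             for trait, pairs in mappings.items()}
    return sorted(score, key=score.get, reverse=True)

print(rank_traits_by_divergence({
    "Agreeableness": [(-0.13, 0.20)],   # Anger values from Table 4
    "Extraversion": [(-0.13, 0.15)],
}))  # ['Agreeableness', 'Extraversion']
```

A continuous score of this kind would allow traits to be compared by how much, not merely whether, their mappings diverge across modalities.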
Overall, the results are a promising indication that the relationship between personality traits and basic emotions is robust enough to enable research and commercial applications. However, given the small sample size of the research study, caution is required in generalising the results. Future research in this area that can (a) scale the number of participants, (b) incorporate other modalities of emotional expression (e.g. language, speech), and (c) quantify the degree of consistency/difference across modalities would represent a significant next step in this research area.
References

1. Antonio R Damasio. Descartes' error. Random House, 2006.
2. Colin G DeYoung. Cybernetic big five theory. Journal of Research in Personality.
3. Dan P McAdams. The art and science of personality development. Guilford Publications.
4. Lisa Feldman Barrett. How emotions are made: The secret life of the brain. Houghton Mifflin Harcourt, 2017.
5. Daniel K Mroczek and Todd D Little. Handbook of personality development. Psychology Press, 2014.
6. Lisa Feldman Barrett, Michael Lewis, and Jeannette M Haviland-Jones. Handbook of emotions. Guilford Publications, 2016.
7. Ryan Donovan, Aoife Johnson, Aine deRoiste, and Ruairi O'Reilly. Quantifying the links between personality sub-traits and the basic emotions. In Affective Computing and Emotion Recognition, Lecture Notes in Computer Science, Cagliari, July 2020. Springer.
8. Alessandro Vinciarelli and Gelareh Mohammadi. More personality in personality computing. IEEE Transactions on Affective Computing, 5(3):297–300, 2014.
9. Chris J Durham, Lisa Ducommun McGrath, Gary M Burlingame, G Bruce Schaalje, Michael J Lambert, and D Rob Davies. The effects of repeated administrations on self-report and parent-report scales. Journal of Psychoeducational Assessment, 20(3):240–257, 2002.
10. Rebecca L Shiner and Colin G DeYoung. The structure of temperament and personality traits: A developmental perspective. The Oxford handbook of developmental psychology, Vol. 2: Self and other, 2:113, 2013.
11. Oliver P John, Laura P Naumann, and Christopher J Soto. Paradigm shift to the integrative big five trait taxonomy. Handbook of personality: Theory and research.
12. Paul Ekman. Basic emotions. Handbook of cognition and emotion, pages 45–60.
13. Michael Healy, Ryan Donovan, Paul Walsh, and Huiru Zheng. A machine learning emotion detection platform to support affective well-being. Pages 2694–2700. IEEE.
14. Crystal A. Gabert-Quillen, Ellen E. Bartolini, Benjamin T. Abravanel, and Charles A. Sanislow. Ratings for emotion film clips. Behavior Research Methods, 47(3):773–787, September 2015.
15. T Lee Gilman, Razan Shaheen, K Maria Nylocks, Danielle Halachoff, Jessica Chapman, Jessica J Flynn, Lindsey M Matt, and Karin G Coifman. A film set for the elicitation of emotion in research: A comprehensive catalog derived from four decades of investigation. Behavior Research Methods, 49(6):2061–2082, 2017.
16. James J. Gross and Robert W. Levenson. Emotion elicitation using films. Cognition and Emotion, 9(1):87–108, 1995.
17. David Watson and Lee Anna Clark. Extraversion and its positive emotional core. In Handbook of personality psychology, pages 767–793. Elsevier, 1997.
18. DH Saklofske, IW Kelly, and BL Janzen. Neuroticism, depression, and depression proneness. Personality and Individual Differences, 18(1):27–31, 1995.
19. William G Graziano and Renée M Tobin. Agreeableness. 2009.
20. Colin G DeYoung and Timothy A Allen. Personality neuroscience. Handbook of Personality Development, page 79, 2018.