RESEARCH Open Access
Multilevel analysis of facial expressions of
emotion and script: self-report (arousal and
valence) and psychophysiological correlates
Michela Balconi, Maria Elide Vanutelli and Roberta Finocchiaro
Background: The paper explored emotion comprehension in children with regard to facial expression of emotion.
The effects of valence and arousal evaluation, of context, and of psychophysiological measures were monitored. Subjective evaluations of valence (positive vs. negative) and arousal (high vs. low), together with contextual (facial expression vs. facial expression and script) variables, were expected to modulate the psychophysiological responses.
Methods: Self-report measures (in terms of correct recognition, arousal and valence attribution) and
psychophysiological correlates (facial electromyography, EMG, skin conductance response, SCR, and heart rate, HR)
were observed when children (N = 26; mean age = 8.75 y; range 6-11 y) looked at six facial expressions of emotions
(happiness, anger, fear, sadness, surprise, and disgust) and six emotional scripts (contextualized facial expressions).
Competencies in recognition and in the evaluation of valence and arousal were tested in concomitance with psychophysiological variations. Specifically, we tested for the congruence of these multiple measures.
Results: Log-linear analysis and repeated-measures ANOVAs showed different representations across subjects as a function of emotion. Specifically, children's recognition and attribution were well developed for some emotions
(such as anger, fear, surprise and happiness), whereas some other emotions (mainly disgust and sadness) were less
clearly represented. SCR, HR and EMG measures were modulated by the evaluation based on valence and arousal,
with increased psychophysiological values mainly in response to anger, fear and happiness.
Conclusions: As shown by multiple regression analysis, a significant consonance was found between self-report
measures and psychophysiological behavior, mainly for emotions rated as more arousing and negative in valence.
The multilevel measures are discussed in light of the dimensional attribution model.
Keywords: Facial expression of emotion, EMG, Valence, Psychophysiology
Background
In the last two decades, developmental psychology has
seen an increasing interest in the study of emotion com-
prehension. Specifically, emotional face recognition and
understanding represent a primary social competence,
because they contribute to social interactions and social
management [1]. These competencies are related to gen-
eral cognitive functions and it was shown that language
was the most important predictor of nonverbal emotion
recognition ability [2]. Indeed both gender and verbal
skills are important predictors of children's emotional
awareness [3].
More specifically, Bullock and Russell [4] proposed a
model in which children acquire a system to represent
and classify emotions which is based on a limited num-
ber of wide categories. The most important of them are
the two dimensional axes of the hedonic value and the
arousal level. This model was tested by some empirical
studies, which found that children first interpret facial expressions in terms of pleasure-displeasure (bipolar hedonic value) and intensity (arousal level), and only later use wider, more articulated conceptual categories
[5,6]. To verify the type of categorization applied to the
emotional domain, affective responses organized around
(Research Unit in Affective and Social Neuroscience, Department of Psychology, Catholic University of the Sacred Heart, Largo Gemelli 1, 20123 Milan, Italy. Balconi et al., Behavioral and Brain Functions 2014, 10:32.)
the arousal and valence dimensions were examined; these include subjective experience, often measured using self-report responses to affective stimuli. In this regard, the Self-Assessment Manikin (SAM) was used to test these subjective emotional correlates
[7]. It was also demonstrated that age, facial expression in-
tensity and emotion category are important for predicting
accuracy on emotion-processing tasks [8].
Previous results demonstrate how task type and children's mood influence children's emotion processing [9].
Indeed, in order to explain this developmental process,
the type of emotions children have to recognize is a first
main factor related to decoding competencies [10]. More
generally, in line with Russell's model of emotional experience, emotion fundamentally reflects variations in the activation of centrally organized appetitive and aversive motivational
systems that have evolved to mediate the wide range of
adaptive behaviors necessary for an organism struggling
to survive in the physical world [11-13]. Most pleasant
affects are held to be associated with the appetitive mo-
tivation system; unpleasant affects with defensive motiv-
ation [14]. Thus, a primary distinction among emotional
events is whether they are appetitive or aversive, positive
or negative, pleasant or unpleasant, which clearly relates
to the motivational parameter of direction. Secondly, all
agree that hedonically valenced events differ in the degree to which they arouse or engage action, which is related to the intensity parameter. Emotional intensity prob-
ably reflects the strength of activation in motivational sys-
tems subserving appetitive and defensive behaviors and, as
such, has impact on the type of physiological response. In-
tensity was conceptualized as "predator imminence", or
the distance of the threatening pattern from the subject
[15] or in terms of distance from an aversive or appetitive
goal [16]. More generally, arousal in humans appears to
reflect the degree to which a stimulus elicits appropriate
appetitive or defensive behaviors. Thus, the two dimen-
sions of pleasure and arousal explain the majority of the
emotional experience and subjective judgment, and the in-
creased perception of emotional significance of a stimulus
in terms of valence may increase the perception of its
arousing power [17].
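The dimensional account above can be made concrete with a toy sketch: emotions represented as points in a valence × arousal plane, and a judgment classified by its nearest prototype. The coordinates below are invented for illustration on SAM-like 1-9 scales; they are not the study's data.

```python
import math

# Illustrative (valence, arousal) prototypes on 9-point SAM-like scales;
# the numbers are hypothetical, not taken from the study.
EMOTION_SPACE = {
    "happiness": (8.0, 6.0),
    "anger":     (2.0, 8.0),
    "fear":      (2.5, 8.5),
    "sadness":   (2.5, 3.0),
    "surprise":  (5.5, 7.5),
    "disgust":   (3.0, 5.0),
    "neutral":   (5.0, 2.0),
}

def nearest_emotion(valence, arousal):
    """Classify a (valence, arousal) judgment by its nearest prototype."""
    return min(
        EMOTION_SPACE,
        key=lambda e: math.dist((valence, arousal), EMOTION_SPACE[e]),
    )
```

In this scheme, a rating that is both negative and highly arousing falls closer to the anger/fear region than to sadness, mirroring the dimensional attribution the model describes.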
Secondly, it should be noted that young subjects have been shown to be competent in decoding the primary or simple
emotions (e.g. happiness and anger), but they have more
difficulties in processing secondary or complex emotions,
such as pride and embarrassment [18-20]. Identifying these emotions requires more time and the analysis of additional informational cues. Moreover, as regards the secondary
emotions or emotions developed only later during the de-
velopmental phases, a more accentuated difficulty in un-
derstanding causal antecedents (the events that caused the
emotion expressed by face) and contextual relations (the
social situations in which the emotion is produced)
emerges [21]. Bormann-Kischkel et al. [10] observed a
specific difficulty in understanding the emotions that
present a lack of correspondence between people's expectations and environmental events. These emotions have an
external and social origin, such as surprise, dismay, and
astonishment. In parallel, Capps, Yirmiya, and Sigman
[20] observed a greater impairment in recognizing and la-
belling the expression of those emotions that have an ex-
ternal locus of control and, simultaneously, that require a
wide knowledge of the social scripts and of their social
consequences. In line with this hypothesis, Baron-Cohen,
Spitz, and Cross [22] suggested that the comprehension is
more difficult for emotions that imply the activation of
some cognitive functions, such as mentalization and
metarepresentation. In general, previous results provide
evidence for late developmental changes in emotional ex-
pression recognition with some specificity in the time
course for distinct emotions [23]. Indeed, it was found that some emotions, like disgust and sadness, are more often confused with other primary, early developed emotions, such as anger or fear, and that they are less often spontaneously labelled than other emotions such as anger, fear or happiness.
Thus, it was also supposed that a "situated" comprehension of emotions arises through a progressive process of script generalization. The process of emotion categorization
is well illustrated by the use of adequate attribution. Indeed,
emotional labels constitute the final step of a developmental
process that goes from a "dimensional attribution" (characterized by the presence of the pleasure-displeasure correlate) to a "situational attribution" (the script representation). This should be taken into consideration in studying
the development of emotional decoding, because these
competencies seem to be bound not only to cognitive but,
above all, to social and communicative competencies,
which have an influence on emotion conceptualization.
Thus, another main concern is represented by contextual
and situational elements that cause emotion and that might or might not facilitate emotion processing and comprehension
[24]. As Russell and Widen [25] underlined, in everyday ex-
perience children use facial expressions in order to infer
emotions. On the other hand, the facial cues are always lo-
cated in an interactive context. In other words, it is neces-
sary to take into account the role of a wider socializing
context. Therefore, the concept of emotional context,
considered as a complex and multidimensional representa-
tion of situational events, is relevant to facial expression comprehension.
Thirdly, it was observed that behavioral and physio-
logical responses to emotional pictures co-vary with
valence evaluation (pleasure/displeasure) and motive in-
tensity (arousal) [26,27]. Psychophysiological responses
are not directly observable, and they include cardiovas-
cular, electrodermal, respiratory measures, etc. It was
underlined that emotion comprehension plays a critical
role in adaptive behavior, since it promotes survival and
guides human behavior by exerting a direct influence on
brain responsiveness and psychophysiological activities.
Among others, facial action revealed by the EMG measure (electromyogram), heart rate, and skin conductance were observed to vary in concomitance with pleasure and displeasure reports while viewing emotional patterns.
More specifically, regarding the psychophysiological measures, studies of facial behavior using electromyography (EMG) suggested that facial muscles are sensitive to the valence dimension, with increased corrugator activity in response to unpleasant patterns and increased zygomatic activity in response to pleasant patterns. Facial EMG activity accom-
panies changes in appetitive (positive) and defensive (nega-
tive) activation [28]. Specifically, the corrugator muscle
appears to be responsive to judgments of unpleasant events compared to neutral pictures [27]. Many studies
found a consistent and significant relationship between
corrugator and hedonic valence, with greater corrugator
activity elicited when viewing the most unpleasant stimuli
[29]. Moreover, Bradley, Codispoti, Sabatinelli, and Lang
[30] showed that pictures that produce disgust (for ex-
ample mutilation), that were higher in arousal, prompt lar-
ger changes than other unpleasant pictures.
Other physiological measures of emotional behavior in-
clude heart rate (HR), with observed increased HR acceler-
ation to pleasant patterns and increased HR deceleration to
unpleasant patterns [30]. Moreover, investigations exploring
cardiovascular activity in emotion perception assessed vari-
ations as a function of differences in stimulus intensity, as
this variable was revealed critical in eliciting orienting or
defense response [27,30-32]. Low-intensity stimuli were
found to relate with heart rate deceleration, whereas intense
stimuli were observed to activate defense responses associ-
ated with heart rate acceleration [33-36]. Nevertheless, contrasting results were also collected, since heart rate initially decelerated, rather than accelerated, when people viewed
pictures of unpleasant emotional events, contrary to the no-
tions that these aversive stimuli might prompt defensive
heart rate acceleration [27,30,37]. However, different experi-
mental paradigms were adopted in previous research and,
in some cases, no direct comparison can be conducted be-
tween them.
Moreover, it was found that electrodermal activity (skin conductance response, SCR) consistently varies with emotional intensity: larger responses are elicited in both unpleasant and pleasant contexts, and they are more pronounced for stimuli rated as highly arousing [27,38,39]. Thus, electrodermal reactions also increase
with increases in defensive or appetitive activation [30,37].
In general, increased skin conductance was found when people view pictures rated as emotional, compared to neutral ones, regardless of whether they are rated pleasant or unpleasant in hedonic valence [27]. However, when hedonic valence and
emotional arousal were co-varied, skin conductance re-
sponses were largest for highly arousing stimuli, irrespect-
ive of hedonic valence [40], consistent with the notion
that these reactions primarily reflect differences in emo-
tional arousal, rather than hedonic valence per se.
Regarding these psychophysiological variations in response to emotions and facial stimuli, an important debate concerns whether psychophysiological measures show a coherent response in childhood, as was observed in adults. Previous research found consistent patterns of psychophysiological activation in children in response to emotional stimuli [41,42]. Nevertheless,
to verify the coherence of these physiological measures in young people in response to facial emotional patterns, specific analyses should be conducted that include both arousal and valence parameters.
Therefore, emotional behavior manifests within multiple domains, comprising conceptual and self-report attribution, autonomic responses (physiological systems), and the comprehension of contextual components, which all
may have a significant role in this process. However, no
previous study has directly analyzed the relationship
between these multilevel measures, that is self-report
evaluation based on valence and arousal parameters, psy-
chophysiological behavior and contextual cue variability.
The present study was designed to explore the convergence of these different measures.
In the present research, the effects of two main factors were considered: valence modulation (emotion type) on the one hand, and contextual effect (face alone vs. facial display within a script) on the other. Specifically, we
explored their influence on physiological reactivity (auto-
nomic activity) and emotional attribution (self-report
attributional process), which are all relevant to the de-
scription of the emotional responses [26,43]. Thus, the
purpose of this study is to verify the expected psychophysiological and attributional responses to emotion vari-
ation, and, secondly, to show that the attributional process
was related to valence and to context modulation.
These assumptions lead to the following hypotheses:
1) Faces evaluated as more negative or positive in terms of valence and arousing power should elicit more intense responses, since subjects are more engaged with the stimulus, whereas neutral stimuli should be less involving and intense and, consequently, differ in affective rating from emotional stimuli. The
interaction effect of these two parameters (i.e.
valence and arousal) is also expected. This would
suggest that effects due to emotional arousal should
be greater for highly unpleasant and pleasant
stimuli, which were rated as slightly more arousing
than stimuli evaluated as less positive/negative [44].
2) Secondly, HR, EMG, and SCR should show a
modulation in correlation with emotionally relevant,
arousing and pleasant or unpleasant stimuli. We
expected that subjects might be more emotionally
involved by a highly negative or positive and more
arousing stimulus than neutral or low-arousing pic-
tures, and that they might have a more intense psy-
chophysiological activation while viewing a negative
or positive than a neutral pattern when they are also
perceived as more arousing [13]. This should pro-
duce an increased SCR and HR response, and the
modulation of facial EMG. Specifically, we expected
an increased SCR for more positive and negative
emotions, an increased corrugator activity in re-
sponse to negative emotions and an increased zygo-
matic activity in response to positive emotions.
Finally a general higher HR should be expected
mainly for more arousing emotions.
3) Furthermore, we expect that children may have
more difficulties to decode and understand emotions
generally considered more complex and acquired only later (such as disgust) rather than
primary basic emotions (such as happiness, anger,
and fear). In particular, we focus our attention on
the representation of the dimensional axes of
hedonic/arousal value, which engenders the acquisition of a more complex conceptual representation [5,45]. This acquisition drives the developmental process, which includes an initial competence in the discrimination of basic emotional categories and a subsequent comprehension of more complex emotional categories (such as disgust or sadness). Thus, we supposed
that for these emotions children could have more difficulty producing a spontaneous correct attribution (in terms of valence and arousal) to the facial patterns. Secondly, they should be less
physiologically responsive to these emotional cues,
based on the intrinsic relationship that we expected
to exist between attributional and
psychophysiological processes.
4) Fourthly, based on the "situational" perspective to
explain facial emotion comprehension, we may
suppose that emotion decoding is the result of the
elaboration of multiple emotional cues, among which
facial patterns (facial expressions), behavioral correlates
(the causal bonds between events), as well as specific
contextual factors (eliciting emotional context). The
comparison between two different types of condition
(only a facial expression of an emotion; a facial
expression within an emotional script) allows us to
explore in detail the role of the eliciting context in the
emotion. We suppose that the script facilitates subjects' recognition. According to our hypothesis, a script will
function as a facilitation cue to correctly interpret the
whole emotional event. This facilitation should be
most evident for the secondary emotions, such as disgust, because in order to comprehend this emotion, subjects have to understand contextual or "external" elements. Finally, this facilitation effect
should be supported by psychophysiological measures,
and in parallel situational cues should support increased SCR (for more positive and negative emotions), increased corrugator activity in response to negative emotions, and increased zygomatic activity in response to positive emotions.
Methods

Participants
The sample included 26 typically developing children (M = 8.75 years; SD = 0.78; range = 6-11.5; 15 females and 11 males). None of them presented cognitive
or linguistic deficits. With regard to cognitive compe-
tencies, children presented a middle-high or high func-
tioning cognitive profile (WISC-IV FSIQ: M = 87; range:
70-120). No history of psychiatric or neurological impair-
ments was observed for the participants. Two neuropsychologists administered a specific semi-structured interview before the experimental session to exclude clinical impairments. The presence of other deficits at the perceptual or
cognitive levels was excluded. Children's parents gave informed written consent for their children to participate in the study,
and the research was approved by the local ethics commit-
tee (Catholic University Ethics Committee, Department of Psychology).
Facial stimuli
The facial stimuli (black-and-white cardboards, 10 cm × 10 cm) consisted of the face of a young boy showing six emotions (happiness, sadness, anger, fear, disgust and surprise) and one neutral face. The stimulus material was selected from the Ekman and Friesen database [46].
We opted for a young actor of an age similar to that of the experimental subjects, in order to facilitate the identification process and thereby ease the recognition task (Figure 1a).
Emotional scripts
The material consisted of six pictures (coloured cardboards, 10 cm × 15 cm) with emotional content and one neutral picture (see Figure 1b). The pictures illustrate contextu-
alized situations eliciting the emotional correlates of
happiness, sadness, anger, fear, surprise and disgust
[5,45]. In particular each picture presents a character (a
girl or a boy) in an interactive context (with peers or
adults). In addition, the presence of a clear emotional fa-
cial expression was considered a discriminant stimulus for
the selection of the pictures. The pertinence of the emo-
tional content for each emotional script, the homogeneity
of the stimuli in terms of cognitive complexity and famil-
iarity were tested in a pre-experimental phase (12 males
and females; 6-11 years). Stimulus homogeneity, intended
as the degree of difficulty in comprehending the situation
represented in the script (clarity of the context and the
represented subjects), and the complexity (number of de-
tails represented) were tested with a 5-point Likert scale.
No significant differences were found between emotions for homogeneity, F(6,11) = 1.18, p = .40, or for complexity, F(6,11) = 1.64, p = .31.
Procedure
In each phase, the stimuli were first presented simultaneously, in order to allow familiarization with the material.
Figure 1 Examples of (a) facial stimuli and (b) emotional scripts.
In a second assessment, they were presented one at a time, in a random sequence, varying the order of the stimuli
across the participants. Furthermore, to avoid a possible
order effect between the experimental conditions, some
subjects first completed the face decoding condition and then the emotional script condition, whereas the other subjects decoded the stimulus materials in the opposite sequence (first the emotional scripts and then the facial expressions).
Subjects were told that they had to evaluate some pic-
tures (faces or scenes) based on some rating scales. The Self-Assessment Manikin (SAM) was used to collect self-report ratings of hedonic value (positive/negative) and of the arousal value of the emotional content (more/less arousing) on nine-point scales [7]. After each presentation of the
stimulus (stimulus presentation duration = 15 sec.) sub-
jects were invited to evaluate it, no longer viewing the
image. During stimulus presentation, subjects' psychophysiological responses were acquired. Furthermore,
through a half-structured interview grid [47], the experi-
menter invited the child to observe the stimulus set and
to describe the emotional feelings represented ("What is that facial expression?"). Another focal question concerned the situation illustrated by the pictures ("What happened?"). Interviews were audio- and video-
taped and scored verbatim. Three judges examined the
encoded verbal material, in order to analyze specific conceptual categories with respect to the correctness of the verbal labels (correct recognition). For the first level of analysis, a
correct answer included an explicit emotional label (such
as happiness) or synonymous terms (joy) [47].
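The rating and labeling scheme just described can be sketched in a few lines. This is an illustrative stand-in, not the authors' coding protocol: the synonym map is hypothetical (the paper only gives "joy" for "happiness"), and the validator simply enforces the 9-point SAM range.

```python
# Hypothetical synonym map standing in for the judges' coding scheme [47];
# only the happiness/joy pairing is attested in the text.
SYNONYMS = {
    "happiness": {"happiness", "joy", "gladness"},
    "sadness":   {"sadness", "sorrow", "unhappiness"},
}

def check_sam_rating(value):
    """Validate a rating on the 9-point SAM scale."""
    if not 1 <= value <= 9:
        raise ValueError("SAM ratings must lie on the 1-9 scale")
    return value

def label_is_correct(target_emotion, child_label):
    """First-level scoring criterion: a response counts as correct if it
    matches the target emotional label or an accepted synonym."""
    accepted = SYNONYMS.get(target_emotion, {target_emotion})
    return child_label.strip().lower() in accepted
```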
Psychophysiological recording procedure
SCR, HR and EMG data reduction
Skin conductance response was measured continuously
with a constant voltage by Biofeedback (Biofeedback
2000, version 7.01). Before attaching the electrodes, the
skin was cleaned with alcohol and slightly abraded. SCR
was recorded from two electrodes placed on the medial
phalanges of the second and third finger of the non-
dominant hand. The sample rate was 400 Hz. SCRs
elicited by each stimulus were scored manually and de-
fined as the largest increase in conductance in a time
window from 1,500 to 4,000 ms after stimulus presenta-
tion (for the procedure see Amrhein, Mühlberger, Pauli, & Wiedemann [48]). Trials with artifacts were excluded
from analysis, whereas trials with no detectable response
were scored as zero. The electrocardiogram was re-
corded using electrodes on the left and right forearms.
Inter-beat intervals of the HR were converted to heart
rate in beats per minute, to detect HR modulation dur-
ing viewing stimuli. Trials with artifacts were excluded
from analysis, whereas trials with no detectable response
were scored as zero. Facial electromyographic (EMG) activity in the zygomaticus major and corrugator supercilii muscle regions was considered. The electrodes (4 mm
diameter Ag/AgCl electrodes), filled with Surgicon elec-
trolyte paste, were positioned over the corrugator and
zygomatic muscles in accordance with guidelines for psy-
chophysiological recording [49,50]. Frequencies of interest
generally ranged from 20 to 400 Hz. Corrugator and zygo-
matic EMG responses were successively scored as the dif-
ference between the mean rectified corrugator/zygomatic
signals present during the presentation of the stimuli and
the mean rectified signals in the 1 s prior to stimulus pres-
entation (baseline measure). A positive value indicates that
the corrugator/zygomatic measures were greater during
the experimental phase than during the baseline phase.
All the data were acquired for the time interval of stimu-
lus presentation (15 sec.) and successively normalized.
The exact synchrony between the stimulus presentation
and the psychophysiological data acquisition was guaran-
teed by the introduction of a specific marker by a second experimenter, simultaneous with the onset of the stimulus
presentation. A subsequent analysis of the video-taped recording of the entire experimental session provided an additional check of this synchrony.
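The SCR and HR trial-scoring rules described in this section can be sketched as follows. This is a simplified illustration (the paper scored SCR peaks manually); the window-onset baseline and the 0.01 response threshold are assumptions, not values from the text.

```python
import numpy as np

def score_scr(signal, onset_idx, fs=400, lo=1.5, hi=4.0, min_resp=0.01):
    """Largest conductance increase inside the 1.5-4.0 s post-stimulus
    window, here measured relative to the window onset (an assumed
    baseline); trials with no detectable rise are scored as zero."""
    window = signal[onset_idx + int(lo * fs):onset_idx + int(hi * fs)]
    rise = window.max() - window[0]
    return float(rise) if rise >= min_resp else 0.0

def ibi_to_bpm(ibi_seconds):
    """Convert inter-beat intervals (in seconds) to heart rate in
    beats per minute, as done for the HR measure."""
    return 60.0 / np.asarray(ibi_seconds, dtype=float)
```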
Analysis and results
Self-report measures
The statistical analysis applied to the self-report measures included two steps: first, log-linear analysis was applied to the correctness of the emotional evaluation; second, repeated-measures ANOVAs were applied. Type I errors associated with inhomogeneity of variance were controlled by decreasing the degrees of freedom using the Greenhouse-Geisser epsilon.
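The Greenhouse-Geisser correction estimates a sphericity factor ε from the covariance of the repeated measures and multiplies the ANOVA degrees of freedom by it. The paper does not show this computation; the sketch below follows the standard Box eigenvalue formula.

```python
import numpy as np

def greenhouse_geisser_epsilon(data):
    """Estimate Greenhouse-Geisser epsilon for an (n_subjects x
    k_conditions) repeated-measures matrix: double-center the condition
    covariance matrix and apply Box's eigenvalue formula. Epsilon lies
    between 1/(k-1) (maximal violation) and 1 (perfect sphericity)."""
    k = data.shape[1]
    s = np.cov(data, rowvar=False)                     # k x k covariance
    centered = (s - s.mean(axis=0, keepdims=True)
                  - s.mean(axis=1, keepdims=True) + s.mean())
    eig = np.linalg.eigvalsh(centered)
    eig = eig[eig > 1e-12]                             # drop the null eigenvalue
    return float(eig.sum() ** 2 / ((k - 1) * (eig ** 2).sum()))
```

The corrected test then evaluates F against ε(k−1) and ε(k−1)(n−1) degrees of freedom, which is what "decreasing the degrees of freedom" refers to.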
A hierarchical log-linear analysis (saturated model) was applied to subjects' labeling (correct labelling of emotion) with correctness (correct/incorrect, 2) × condition (face/script, 2) × emotion (type, 7) as factors (see Areni, Ercolani, Scalisi [51]) (Table 1). In both conditions (emotional face and script), the emotions were largely recognized by the subjects. In fact, they correctly labeled each emotion (with more correct than incorrect recognitions, χ²(1, N = 26) = 11.38, p ≤ .01) independently of the type of task, χ²(1, N = 26) = 1.21, p = .30. However, emotion type showed a significant effect, χ²(1, N = 26) = 8.03, p ≤ .01. Post-hoc comparisons (standardized residuals) revealed that anger, fear, surprise and happiness were better recognized than disgust, sadness and neutral faces (all comparisons p ≤ .01).
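As a quick sanity check on chi-square statistics like those reported above (assuming df = 1, as reported), the upper-tail p-value for df = 1 reduces to erfc(√(x/2)) and needs only the standard library:

```python
import math

def chi2_sf_df1(x):
    """Upper-tail probability of a chi-square statistic with df = 1,
    using the identity P(X > x) = erfc(sqrt(x / 2))."""
    return math.erfc(math.sqrt(x / 2.0))
```

For example, a statistic of 11.38 is comfortably below the .01 threshold, while 1.21 is clearly nonsignificant.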
Regarding valence attribution, the ANOVA showed a significant effect of emotion (F(6, 25) = 10.30, p ≤ .01, η² = .38) and a significant emotion × condition effect (F(6, 25) = 9.14, p ≤ .01, η² = .38) (Table 1). Post-hoc comparisons (contrast analysis, with Bonferroni correction for multiple comparisons) showed increased negative valence attribution for anger, fear, surprise and sadness in comparison with happiness and the neutral face, and happiness was considered more positive than the other faces (all comparisons p ≤ .01). Moreover, a more negative attribution was found for disgust in the script condition than in the face condition (F(1, 25) = 10.79, p ≤ .01, η² = .40). No other comparison was statistically significant (all comparisons p > .01).
Regarding arousal attribution, a significant effect of emotion (F(6, 25) = 10.15, p ≤ .01, η² = .39) and a significant emotion × condition effect (F(6, 25) = 9.56, p ≤ .01, η² = .37) were found. Post-hoc comparisons showed increased arousal attribution for anger, fear, and surprise in comparison with happiness and sadness (all paired comparisons p ≤ .01). Moreover, all the emotional faces were considered more arousing than the neutral face (all paired comparisons p ≤ .01). In addition, regarding the interaction effect, disgust was found to be more arousing in the script condition than in the face condition (F(1, 25) = 8.09, p ≤ .01, η² = .33). No other comparison was statistically significant (all paired comparisons p > .01).
Psychophysiological measures
Subsequently, repeated-measures ANOVAs with two within-subjects factors (condition × emotion) were applied to each dependent measure.

SCR
The ANOVA showed a significant main effect of emotion (F(6, 25) = 9.56, p ≤ .01, η² = .37). As shown by contrast effects, anger, fear and surprise revealed increased SCR in comparison with happiness, sadness, disgust and neutral stimuli. Moreover, disgust and happiness showed higher SCR than neutral faces (all comparisons p ≤ .01) (Figure 2).
HR
The ANOVA showed a significant main effect of emotion (F(6, 25) = 10.98, p ≤ .01, η² = .40). As shown by contrast analyses, anger, fear, surprise and happiness revealed increased HR in comparison with sadness, disgust and neutral stimuli. Moreover, disgust and sadness showed increased HR compared to neutral faces (all comparisons p ≤ .01) (Figure 3).
Zygomatic EMG activity revealed significant differences as a function of emotion (F(6, 25) = 10.76, p ≤ .01). As shown by contrast effects, EMG activity was enhanced in response to positive stimuli in comparison with negative and neutral faces (all comparisons p ≤ .01). Conversely, corrugator EMG activity was increased for the negative emotions, respectively anger, fear, and surprise, in comparison with happiness, disgust, sadness and neutral stimuli (all comparisons p ≤ .01) (Figure 4a and 4b).
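The excerpt does not detail how the EMG amplitudes were quantified. A common approach for facial EMG is to rectify the signal and subtract the mean of a pre-stimulus baseline window from the mean of the post-stimulus window; the sketch below illustrates that idea on a synthetic trace (the function name, window lengths and simulated burst are illustrative assumptions, not the authors' pipeline, and a real pipeline would also band-pass filter and smooth):

```python
import numpy as np

def emg_response(signal, fs, stim_onset_s, base_s=1.0, resp_s=3.0):
    """Baseline-corrected mean rectified EMG amplitude.
    signal: 1-D EMG trace (arbitrary units); fs: sampling rate in Hz.
    Rectifies the trace, then subtracts the mean of the pre-stimulus
    baseline window from the mean of the post-stimulus response window."""
    onset = int(stim_onset_s * fs)
    rect = np.abs(signal)
    baseline = rect[onset - int(base_s * fs):onset].mean()
    response = rect[onset:onset + int(resp_s * fs)].mean()
    return response - baseline

# Synthetic demo: noise trace with an amplitude burst after "stimulus onset"
rng = np.random.default_rng(1)
fs = 1000                                  # 1 kHz sampling
trace = rng.normal(0.0, 1.0, 6 * fs)       # 6 s of baseline-level activity
trace[2 * fs:5 * fs] *= 3.0                # simulated zygomatic activation
delta = emg_response(trace, fs, stim_onset_s=2.0)
print(f"baseline-corrected amplitude: {delta:.2f}")
```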
Regression analysis between valence and arousal ratings
and psychophysiological measures
Distinct stepwise multiple regression analyses were performed for each psychophysiological measure and emotion, considering the mean values for the face and script conditions. Predictor variables were arousal and valence ratings, and predicted variables were EMG, SCR, and HR amplitude for each emotion. We report in Table 2 the cumulative multiple correlations between predictor and predicted variables (R), the cumulative proportion of explained variance (R²), and the regression weights (β) for the regression equation at each step of the multivariate analysis.
As shown in Table 2, arousal and valence accounted for the amplitude of the zygomatic muscle for happiness, whereas mainly the arousal rating accounted for the corrugator muscle for anger, fear and surprise. In addition, valence and arousal explained the HR increase mostly for anger, fear, surprise, and happiness. Finally, the increased SCR response was mainly explained by the two predictors for anger, fear, and surprise, and secondly for disgust and happiness.
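The stepwise procedure behind Table 2 (predictors entered one step at a time, with cumulative R and R² reported at each step) can be illustrated with a small forward-selection sketch. The data here are simulated and the helper names are our own, not the authors' code:

```python
import numpy as np

def r_squared(cols, y):
    """R^2 of an OLS fit (with intercept); cols is a list of 1-D predictors."""
    A = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

def forward_stepwise(preds, y):
    """Enter predictors one per step, choosing the largest R^2 gain.
    Returns one (entered_predictor, cumulative_R, cumulative_R2) per step."""
    chosen, steps = [], []
    remaining = set(preds)
    while remaining:
        best = max(remaining, key=lambda name: r_squared(
            [preds[c] for c in chosen] + [preds[name]], y))
        chosen.append(best)
        remaining.remove(best)
        r2 = r_squared([preds[c] for c in chosen], y)
        steps.append((best, float(np.sqrt(r2)), float(r2)))
    return steps

rng = np.random.default_rng(2)
n = 26
arousal = rng.normal(8.4, 0.6, n)   # hypothetical ratings for one emotion
valence = rng.normal(2.3, 0.9, n)
# Simulated SCR amplitude, driven mainly by arousal and weakly by valence
scr = 0.8 * arousal - 0.1 * valence + rng.normal(0, 0.3, n)

steps = forward_stepwise({"arousal": arousal, "valence": valence}, scr)
for name, R, R2 in steps:
    print(f"+ {name}: cumulative R = {R:.2f}, R^2 = {R2:.2f}")
```

Because the simulated SCR depends mostly on arousal, arousal enters at step 1, mirroring the pattern the authors report for the negative high-arousal emotions.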
Table 1 Self-report measures of correctness (percentage), arousal and valence for each emotion and condition (face and script). Values are M (SD).

Face condition
              Anger        Fear         Surprise     Happiness    Disgust      Sadness      Neutral
Correctness   89 (1.34)    91 (2.87)    84 (1.89)    80 (2.09)    69 (1.98)    65 (1.56)    64 (1.22)
Arousal       8.46 (0.56)  8.52 (1.09)  8.11 (0.77)  7.09 (0.65)  6.32 (0.65)  4.55 (0.54)  3.91 (0.65)
Valence       2.33 (0.98)  2.11 (0.78)  2.87 (0.22)  8.04 (3.50)  3.50 (0.76)  2.33 (0.64)  4.98 (0.68)

Script condition
Correctness   86 (2.33)    88 (3.98)    85 (2.87)    77 (2.09)    74 (1.98)    67 (2.09)    60 (2.09)
Arousal       8.40 (0.78)  8.16 (0.86)  8.09 (0.49)  7.32 (0.36)  6.98 (0.65)  4.13 (0.53)  3.08 (0.54)
Valence       2.39 (0.65)  2.66 (0.71)  2.43 (0.76)  8.76 (0.65)  2.98 (0.84)  2.38 (0.67)  4.55 (0.39)

SAM nine-point rating (valence: 1 = highly negative, 9 = highly positive; arousal: 1 = low, 9 = high).
Discussion
The present study produced three major results, which we summarize in the following points. First, there was a clear differentiation in children's conceptualization (in terms of arousal and valence) as a function of different emotions. Second, the psychophysiological measures were highly modulated by emotional type, and the arousal and valence parameters accounted for the psychophysiological variations associated with the different emotional patterns. Finally, the presence of two different types of task (facial expression decoding and script comprehension) induced significant differences in the subjective representations only for a limited number of emotions (mainly disgust).
For the first time, we used multimodal measures to explore the evaluation effect (based on valence and arousal) on psychophysiological behavior, taking into account an ample range of emotions. Second, we applied this multimodal approach to the specific domain of facial expressions of emotion, which previous research did not specifically consider. Third, we considered the facial expression of emotion with and without an emotional script context, in order to study the contextual impact on face decoding. Therefore, a situated perspective was adopted in the present research.
As hypothesized by the dimensional approach to emotion [52,53], the representation of the emotional domain was based on a conceptual space defined by two axes, arousal and hedonic value. In particular, the emotions with a high arousal level and a negative value were better understood than the other emotions. Specifically, fear, anger and surprise were well recognized and well labeled. A significantly higher arousing power was attributed to them, these emotions were also considered more negative, and they were better recognized than the other emotions, specifically in comparison with sadness and disgust. The positive emotion of happiness was considered less arousing and more positively valenced, and it was well represented and recognized. On the contrary, disgust appeared more difficult to identify, as did sadness, and both were considered less arousing and less negative. It should be considered that in the present research we opted to evaluate the subjects' ability to spontaneously label the face/script they saw. As revealed, disgust and sadness were not immediately labelled, but in many cases they were correctly described (using a semi-structured interview) only after a subsequent inquiry. Therefore, the subjects showed a general ability to recognize these two emotions, although this recognition was less immediate. This may be due to the increased complexity of decoding these emotions, since they are learned later than other primary emotions (such as anger and fear).

Figure 2 Mean (and SE) SCR modulations in response to different emotions.

Figure 3 Mean (and SE) HR modulations in response to different emotions.
Therefore, a first main result of the present study was that the dichotomies pleasure/displeasure and high/low arousal were considered relevant by the subjects, confirming their significant role in emotion representation, as indicated by previous research [19,53,54]. In fact, not only was the hedonic category systematically well represented, but it was correctly identified in terms of negativity or positivity. Moreover, arousal rating can be considered a predictive cue of the ability to classify and differentiate emotional correlates. Indeed, it was correctly used when the child was able to attribute an adequate label to the emotion, whereas when the child could not conceptualize the emotion, the arousal value seemed more ambiguous (for example, for disgust) or less relevant (sadness).
With regard to the more negative and arousing emotions (fear, anger and surprise), recent studies [55,56] revealed high recognition rates, which the researchers attribute to the central adaptive function of these negative, highly arousing emotions. Indeed, they play a main role in individual safeguard, on both an ontogenetic and a phylogenetic level. They may be represented as a cue for detecting unfavorable environmental conditions [19,54]. According to the functional model [57,58], emotional expressions represent a response to a particular event that is significant in terms of costs and benefits for people. Specifically, the expression of anger and fear represents the perception of a threat to personal safeguard and therefore requires a greater investment of attentional resources. The prominence of specific categories of emotions (more negative and arousing) may suggest their central role in emotion acquisition, in comparison with other less relevant (and less arousing) emotions, in childhood.

Figure 4 Mean (and SE) (a) zygomatic and (b) corrugator modulations in response to different emotions.

Table 2 Stepwise multiple regressions. Arousal and valence as predictor variables, psychophysiological measures as predicted variables; entries are cumulative values for arousal (Model 1) / valence (Model 2).

            Anger        Fear         Surprise     Happiness    Sadness      Disgust      Neutral
R           0.13 / 0.28  0.20 / 0.34  0.22 / 0.33  0.44 / 0.76  0.18 / 0.30  0.26 / 0.37  0.14 / 0.26
R²          0.01 / 0.07  0.04 / 0.09  0.04 / 0.11  0.19 / 0.57  0.03 / 0.09  0.06 / 0.03  0.01 / 0.06
β           0.20 / 0.21  0.24 / 0.23  0.15 / 0.11  0.34 / 0.28  0.15 / 0.18  0.23 / 0.26  0.23 / 0.20
SE          0.21 / 0.22  0.15 / 0.17  0.21 / 0.28  0.18 / 0.27  0.23 / 0.20  0.17 / 0.19  0.18 / 0.26

R           0.49 / 0.64  0.51 / 0.69  0.35 / 0.52  0.24 / 0.41  0.18 / 0.29  0.22 / 0.40  0.18 / 0.29
R²          0.24 / 0.40  0.26 / 0.47  0.12 / 0.27  0.05 / 0.18  0.03 / 0.07  0.04 / 0.18  0.03 / 0.07
β           0.31 / 0.32  0.23 / 0.21  0.27 / 0.20  0.23 / 0.28  0.32 / 0.38  0.36 / 0.29  0.20 / 0.23
SE          0.25 / 0.20  0.20 / 0.18  0.18 / 0.19  0.33 / 0.34  0.30 / 0.28  0.22 / 0.27  0.17 / 0.20

R           0.43 / 0.69  0.54 / 0.72  0.38 / 0.62  0.32 / 0.59  0.21 / 0.38  0.29 / 0.51  0.17 / 0.29
R²          0.18 / 0.47  0.27 / 0.51  0.13 / 0.41  0.38 / 0.34  0.04 / 0.14  0.07 / 0.25  0.03 / 0.08
β           0.18 / 0.28  0.22 / 0.20  0.18 / 0.23  0.29 / 0.20  0.34 / 0.35  0.35 / 0.57  0.27 / 0.33
SE          0.11 / 0.17  0.28 / 0.26  0.20 / 0.21  0.21 / 0.30  0.30 / 0.30  0.26 / 0.23  0.22 / 0.29

R           0.42 / 0.70  0.50 / 0.80  0.44 / 0.71  0.36 / 0.65  0.22 / 0.38  0.35 / 0.67  0.18 / 0.29
R²          0.17 / 0.49  0.25 / 0.64  0.19 / 0.50  0.14 / 0.42  0.04 / 0.14  0.12 / 0.36  0.03 / 0.08
β           0.17 / 0.20  0.20 / 0.22  0.28 / 0.27  0.29 / 0.27  0.19 / 0.34  0.25 / 0.22  0.39 / 0.28
SE          0.22 / 0.26  0.18 / 0.15  0.20 / 0.21  0.32 / 0.30  0.15 / 0.18  0.18 / 0.29  0.27 / 0.22

* = p ≤ .05.
The script condition introduces another main explicative factor regarding emotional representation. Indeed, the presence of a specific context generally does not affect the correctness of the emotional label attribution, but it produces a discriminant effect for exactly one emotion, disgust. In the presence of a specific situational context, disgust was better characterized in terms of arousal (more clearly arousing) and valence (more negatively valenced). The presence of the interactional features that characterize the emotional experience seems to introduce a facilitating element for emotion comprehension, also producing a better description in emotion labeling (more correct recognition). It is possible to state that the situational component constitutes a facilitation cue, because it allowed the subjects to activate a more complex conceptual representation, one that takes into account the context in which the emotional event happens, the emotional causes, the logical order of actions and their consequences [4].
It was noticeable, however, that the script enables a wider and more complete representation only in the case of this "secondary" emotion, which benefits most from the situated condition. It was observed that emotion recognition is enabled by the development and generalization of an emotional script; that is, a child can recognize a specific emotion by verifying the presence of several prototypical elements arranged in precise causal and temporal sequences. These scripts include not only facial expressions, but also the representation of causal factors, physical and social context, several actions and their consequences, as well as the cognitive appraisal of the situation and the subjective experience [4]. Among these cues, the representation of causal bonds, that is, a set of causal events and their behavioral consequences, has remarkable significance, because they constitute the most explicative elements of the emotional experience [5,45,59].
To conclude, even if our study does not allow us to state which of the two representational modalities (facial pattern comprehension or script decoding) precedes the other, it was possible to observe that the situational correlates provide a facilitation cue for the representation of the emotional correlates when a secondary emotion is represented. However, no specific facilitation effect was observable in the case of "primary" emotions, which were well recognized and described even in the absence of contextual cues.
A relevant main result of the present research was related to the psychophysiological measures, which were shown to vary in concomitance with the type of stimuli (different emotions) and with the categorization process (the subjective ratings). In fact, subjects revealed a coherent psychophysiological behavior in response to the emotions, independently of the condition (script or face). Moreover, it was shown that SCR, HR and EMG were modulated as a function of the two main axes of valence and arousal, as rated by the subjects.
Firstly, SCR was shown to increase when children processed emotional faces and scripts rated as highly arousing and negative (anger, fear and surprise), whereas it decreased in concomitance with stimuli rated as low arousing (mainly sadness, disgust, and neutral patterns). A similar profile was observed for HR, which showed higher values for more positive, more negative and more arousing stimuli. These results are in line with many other studies on adults, which postulated a significant HR effect for more arousing and relevant stimuli [33-36]. Moreover, the variation in arousing power (high vs. low) may determine the different impact of the emotional cues, since the perception of high arousal generally induces a consistent HR increase independently of the stimulus valence. These multiple parameters and their combination were relevant to comprehend the effect of emotions on psychophysiological data.
An important result was also observed for the facial EMG values. Indeed, we found that children were highly responsive to facial stimuli and scripts, adopting a sort of "facial feedback" modality, since they used facial configurations similar to those displayed by the pictures (consonant behavior) [60]. An increase in mimic activity was observed in some conditions: the different emotions evoked distinct facial EMG response patterns, with increased zygomatic muscle activity to positive patterns and increased corrugator muscle activity to negative patterns, whereas both the corrugator and the zygomatic response patterns were less pronounced for sadness, disgust and the neutral condition. More generally, corrugator muscle activity was increased in response to more negative and arousing stimuli, mainly fear, anger, and surprise. In addition, as revealed by the regression analysis, the arousal parameter explained the corrugator modulation to a greater extent, whereas valence was less relevant in describing the psychophysiological activity in response to negative, highly arousing patterns. Conversely, the zygomatic muscle was modulated by both arousal and valence, with significantly increased responsiveness related to happiness.
These variations may mark a psychophysiological response to highly arousing situations, since relevant (arousing) stimuli seem to produce and reinforce a coherent psychophysiological behavior. Conversely, subjects reported a reduced arousing power for sadness and partially for disgust, a fact that may explain the concomitantly reduced EMG, SCR and HR values.
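The claimed link between rated arousal and autonomic output can be checked on trial-level data with a simple correlation and a median split. The sketch below runs on simulated ratings and SCR amplitudes (all numbers are hypothetical, chosen only so that SCR rises with rated arousal, as the self-report pattern suggests):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_trials = 120
# Hypothetical per-trial SAM arousal ratings (1-9) and SCR amplitudes
# (microsiemens); SCR is simulated to rise with rated arousal.
arousal = rng.integers(1, 10, n_trials).astype(float)
scr = 0.05 * arousal + rng.normal(0, 0.08, n_trials)

r, p = stats.pearsonr(arousal, scr)
high = scr[arousal >= np.median(arousal)].mean()   # high-arousal trials
low = scr[arousal < np.median(arousal)].mean()     # low-arousal trials
print(f"r = {r:.2f} (p = {p:.2g}); "
      f"mean SCR high-arousal {high:.3f} vs low-arousal {low:.3f}")
```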
Thus, the more arousing conditions showed a perfect consonance between subjective evaluation and psychophysiological (both facial and autonomic) measures. Specifically, anger, fear, surprise and happiness were rated as more emotionally involving. In parallel, the psychophysiological behavior was responsive to this subjective self-evaluation, with an increased "positive" (zygomatic) facial expression and higher autonomic activity (increased HR) for happiness, on the one hand, and an increased "negative" (corrugator) facial expression and a higher arousal response (greater SCR and HR) for anger, fear and surprise, on the other.
However, more generally, the modulation of the psychophysiological measures was related to arousing power more than to valence, since, independently of valence, the stimuli rated as highly arousing (anger, fear, surprise and happiness) were able to induce a more significant and coherent emotional response. The regression analysis confirmed these results: mainly the arousal attribution was significant in determining the psychophysiological variations, explaining the SCR, HR and facial response modulations, since subjects "shared" the facial behavior and autonomic activity observed in both positive and negative conditions.
Thus, in general, the psychophysiological measures may be interpreted as a functional mechanism of "mirroring" the emotional condition displayed by the facial stimuli, where "sharing" similar emotional responses allows a direct form of understanding and recognizing emotion through a sort of simulation process. More specifically, contexts evaluated as emotionally involving and significant may engender a consonant shared response in the observer, who first recognizes and then "mimics" (via facial and autonomic behavior) the somatic markers related to the experienced emotions [61]. Moreover, based on these results, we may suggest that the gradual development of emotional competencies proceeds from more basic and simple emotions, which are acquired first by children, to more complex and less prominent emotions, which might be less relevant in terms of salience. Brain correlates may support this differential learning process, related to a maturation effect that might explain more deeply the early acquisition of recognition abilities in response to the more salient and relevant emotions in terms of human safeguard, and the later acquisition of the less relevant (less threatening and less primary for safeguard) emotions.
To summarize, the self-report measures were mirrored by the psychophysiological behavior, which varied coherently across the different emotions. Children revealed a consonant and adequate behavior in terms of labeling (correct recognition), evaluation (valence and arousal attribution) and psychophysiological responsiveness. However, a clear advantage was observed for some specific emotions, those rated as more arousing and negative (fear, anger and surprise). We suggested that these emotions may be central to personal safeguard and may be developed with priority by children. Arousal attribution was considered the most critical parameter in explaining the emotion recognition process and the psychophysiological behavior. Conversely, sadness and disgust were less prominent in terms of both arousal and valence, and in some cases they were also less correctly recognized. The contextual cues (script condition) may allow a better attribution, mainly for the emotion of disgust. In the case of more complex emotional cues, the context (script) contribution was relevant to complete correct recognition.
Regarding the main limitations of the present study, future research may explore more directly the intrinsic effect induced by facial expressions of emotion, also taking gender effects into account, since previous research found significant differences between male and female children in response to emotional type. Secondly, the arousal effect found in the present study should be better considered in relation to different emotional valences, taking into account a wider range of facial expressions covering the full orthogonal axes of low/high arousal and positive/negative valence. Thirdly, due to the limited sample used in the present research, it is crucial to integrate the present data with a larger sample, in order to extend the present results to the general population.
Competing interests
The authors declare that they have no competing interests.
Authors' contributions
MB planned and supervised the experiment, designed the statistical analysis, and wrote the paper. RF realized the experiment, applied the analysis, and provided editorial and reference assistance. MEV realized the experiment, applied the analysis, and provided editorial and reference assistance. All authors read and approved the final manuscript.
Received: 19 May 2014 Accepted: 15 September 2014
Published: 26 September 2014
References
1. Balconi M, Lucchiari C: EEG correlates (event-related desynchronization) of emotional face elaboration: A temporal analysis. Neurosci Lett 2006.
2. Rosenqvist J, Lahti-Nuuttila P, Laasonen M, Korman M: Preschoolers' recognition of emotional expressions: relationships with other neurocognitive capacities. Child Neuropsychol 2014, 20:281–302.
3. Mancini G, Agnoli S, Trombini E, Baldaro B, Surcinelli P: Predictors of emotional awareness during childhood. Health 2013, 5:375–380.
4. Bullock M, Russell JA: Concepts of emotion in developmental psychology. In Measuring emotions in infants and children. Edited by Izard CE, Read P. Cambridge: Cambridge University Press; 1986:203–237.
5. Balconi M, Carrera A: Emotional representation in facial expression and script: A comparison between normal and autistic children. Res Dev Disabil 2007, 28:409–422.
6. Widen SC, Russell JA: A closer look at preschoolers' freely produced labels for facial expressions. Dev Psychol 2003, 39:114–128.
7. Lang PJ: Behavioral treatment and biobehavioral assessment: computer applications. In Technology in mental health care delivery systems. Edited by Sidowski JB, Johnson JH, Williams TA. Norwood NJ: Ablex; 1980:119–137.
8. Herba CM, Landau S, Russell T, Ecker C, Phillips ML: The development of emotion-processing in children: effects of age, emotion, and intensity. J Child Psychol Psychiatry 2006, 47:1098–1106.
9. Cummings AJ, Rennels JL: How mood and task complexity affect children's recognition of others' emotions. Soc Dev 2014, 23:80–99.
10. Bormann-Kischkel C, Vilsmeier M, Baude B: The development of emotional concepts in autism. J Child Psychol Psychiatry 1995, 36:1243–1259.
11. Bradley MM, Lang PJ: Motivation and emotion. In Handbook of psychophysiology. Edited by Cacioppo JT, Tassinary LG, Berntson GG. New York: Cambridge University Press; 2007:581–607.
12. Davidson R, Ekman P, Saron CD, Senulis JA, Friesen WV: Approach/withdrawal and cerebral asymmetry: Emotional expression and brain physiology. J Pers Soc Psychol 1990, 58:330–341.
13. Lang PJ, Bradley MM, Cuthbert BN: Emotion, attention, and the startle reflex. Psychol Rev 1990, 97:377–398.
14. Cacioppo JT, Berntson GG: Relationships between attitudes and evaluative space: a critical review with emphasis on the separability of positive and negative substrates. Psychol Bull 1994.
15. Fanselow MS: Neural organization of the defensive behavior system responsible for fear. Psychon Bull Rev 1994, 1:429–438.
16. Miller NE: Liberalization of basic S-R concepts: Extensions to conflict behaviour, motivation and social learning. In Psychology: A study of a science. Volume 2. Edited by Koch S. New York: McGraw Hill; 1959.
17. Russell JA: A circumplex model of affect. J Pers Soc Psychol 1980.
18. Balconi M, Lucchiari C: Consciousness, emotion and face: An event-related potentials (ERP) study. In Consciousness & emotion: Agency, conscious choice, and selective perception. Edited by Ellis RD, Newton N. Amsterdam/Philadelphia: John Benjamins Publishing Company; 2005:121–135.
19. Balconi M, Pozzoli U: Face-selective processing and the effect of pleasant and unpleasant emotional expressions on ERP correlates. Int J Psychophysiol 2003, 49:67–74.
20. Capps L, Yirmiya N, Sigman M: Understanding of simple and complex emotions in non-retarded children with autism. J Child Psychol Psychiatry 1992, 33:1169–1182.
21. Hillier A, Allinson L: Beyond expectations: Autism, understanding embarrassment, and the relationship with theory of mind. Autism 2002.
22. Baron-Cohen S, Spitz A, Cross P: Do children with autism recognize surprise? A research note. Cogn Emotion 1993, 7:507–516.
23. Thomas LA, De Bellis MD, Graham R, LaBar KS: Development of emotional facial recognition in late childhood and adolescence. Dev Sci 2007.
24. Fein D, Lucci D, Braverman M, Waterhouse L: Comprehension of affect in context in children with Pervasive Developmental Disorders. J Child Psychol Psychiatry 1992, 33:1157–1167.
25. Russell JA, Widen SC: Words versus faces in evoking preschool children's knowledge of the causes of emotions. Int J Behav Dev 2002, 26:97–103.
26. Cuthbert BN, Schupp HT, Bradley MM, Birbaumer N, Lang PJ: Brain potentials in affective picture processing: Covariation with autonomic arousal and affective report. Biol Psychol 2000, 62:95–111.
27. Lang PJ, Greenwald MK, Bradley MM, Hamm AO: Looking at pictures: affective, facial, visceral, and behavioral reactions. Psychophysiology 1993.
28. Tassinary LG, Cacioppo JT, Geen TR: A psychometric study of surface electrode placements for facial electromyographic recording: I. The brow and cheek muscle regions. Psychophysiology 1989, 26:1–16.
29. Balconi M, Brambilla E, Falbo L: Appetitive vs. defensive responses to emotional cues: Autonomic measures and brain oscillation modulation. Brain Res 2009, 1296:72–84.
30. Bradley MM, Codispoti M, Sabatinelli D, Lang PJ: Emotion and motivation I: defensive and appetitive reactions in picture processing. Emotion 2001.
31. Christie IC, Friedman BH: Autonomic specificity of discrete emotion and dimensions of affective space: a multivariate approach. Int J Psychophysiol 2004, 51:143–153.
32. Lacey JI: Somatic response patterning and stress: Some revisions of activation theory. In Psychological stress: Issues in research. Edited by Appley MH, Trumbull R. New York: Appleton-Century-Crofts; 1967:14–38.
33. Balconi M, Brambilla E, Falbo L: BIS/BAS, cortical oscillations and coherence in response to emotional cues. Brain Res Bull 2009.
34. Graham FK: Distinguishing among orienting, defense, and startle reflexes. In The orienting reflex in humans. Edited by Kimmel HD, Van Olst EH, Orlebeke JF. Hillsdale NJ: Lawrence Erlbaum Associates.
35. Quigley KS, Berntson GG: Autonomic origins of cardiac responses to nonsignal stimuli in the rat. Behav Neurosci 1990.
36. Turpin G: Ambulatory clinical psychophysiology: an introduction to techniques and methodological issues. J Psychophysiol 1986.
37. Palomba D, Sarlo M, Angrilli A, Mini A, Stegagno L: Cardiac responses associated with affective processing of unpleasant film stimuli. Int J Psychophysiol 2000, 36:45–57.
38. Bradley MM, Lang PJ: Measuring emotion: Behavior, feeling and physiology. In Cognitive neuroscience of emotion. Edited by Lane R, Nadel L. New York: Oxford University Press; 2000:242–276.
39. Miller NE, Patrick CJ, Levenston GK: Affective imagery and the startle response: probing mechanism of modulation during pleasant scenes, personal experiences, and discrete negative emotions. Psychophysiology 2002, 39:519–529.
40. Gomez P, Zimmermann P, Guttormsen-Schär S, Danuser B: Respiratory responses associated with affective processing of film stimuli. Biol Psychol 2004, 68:223–235.
41. Bernat EM, Cadwallader M, Seo D, Vizueta N, Patrick CJ: Effects of instructed emotion regulation on valence, arousal and attentional measures of affective processing. Dev Neuropsychol 2011.
42. McManis MH, Bradley MM, Berg WK, Cuthbert BN, Lang PJ: Emotional reactions in children: verbal, physiological, and behavioural responses to affective pictures. Psychophysiology 2001, 38:222–231.
43. Lang PJ: What are the data of emotion? In Cognitive perspectives on emotion and motivation. Edited by Hamilton V, Bower GH, Frijda N. Boston MA: Martinus Nijhoff; 1989:173–191.
44. Polich J, Kok A: Cognitive and biological determinants of P300: an integrative review. Biol Psychol 1995, 41:103–146.
45. Balconi M, Amenta S, Ferrari C: Emotional decoding in facial expression, scripts and video: A comparison between normal, autistic and Asperger children. Res Autism Spectr Disord 2012, 6:193–203.
46. Ekman P, Friesen WV: Pictures of Facial Affect. Palo Alto: Consulting Psychologists Press; 1976.
47. Widen SC, Russell JA: The relative power of an emotion's facial expression, label and behavioral consequence to evoke preschoolers' knowledge of its causes. Cogn Dev 2004, 19:111–125.
48. Amrhein C, Muhlberger A, Pauli P, Wiedemann G: Modulation of event-related brain potentials during affective picture processing: a complement to startle reflex and skin conductance response? Int J Psychophysiol 2004.
49. Blumenthal TD, Cuthbert BN, Filion DL, Hackley S, Lipp OV, van Boxtel A: Committee report: guidelines for human startle eyeblink electromyographic studies. Psychophysiology 2005, 42:1–15.
50. Fridlund AJ, Cacioppo JT: Guidelines for human electromyographic research. Psychophysiology 1986, 23:567–589.
51. Areni A, Ercolani AP, Scalisi TG: Introduzione all'uso della statistica in psicologia. Milano: LED; 1994.
52. Balconi M, Pozzoli U: Semantic attribution and facial expression of emotion. Conscious Emotion 2003, 2:66–81.
53. Ellsworth PC, Scherer KR: Appraisal processes in emotion. In Handbook of affective sciences (Series in affective science). Edited by Davidson RJ, Scherer KR, Goldsmith HH. London: Oxford University Press; 2003:572–595.
54. Balconi M: Neuropsicologia delle emozioni. Roma: Carocci; 2004.
55. Bernat E, Bunce S, Shevrin H: Event-related brain potentials differentiate positive and negative mood adjectives during both supraliminal and subliminal visual processing. Int J Psychophysiol 2001, 42:11–34.
56. Castelli F: Understanding emotions from standardized facial expressions in autism and normal development. Autism 2005, 9:428–449.
57. Frijda NH: Varieties of affect: Emotions and episodes, moods and sentiments. In The nature of emotion: Fundamental questions. Edited by Ekman P, Davidson R. Oxford: Oxford University Press; 1994:59–67.
58. Lazarus RS: The cognition-emotion debate: A bit of history. In Handbook of cognition and emotion. Edited by Dalgleish T, Power MJ. Chichester: John Wiley; 1999:3–19.
59. Want SC, Harris PL: Learning from other people's mistakes: causal understanding in learning to use a tool. Child Dev 2001, 72:431–443.
60. Balconi M, Bortolotti A: Empathy in cooperative versus non-cooperative situations: the contribution of self-report measures and autonomic responses. Appl Psychophysiol Biofeedback 2012, 37:161–169.
61. Balconi M, Bortolotti A: Resonance mechanism in empathic behavior: BEES, BIS/BAS and psychophysiological contribution. Physiol Behav 2012, 105:298–304.
Cite this article as: Balconi et al.: Multilevel analysis of facial expressions of emotion and script: self-report (arousal and valence) and psychophysiological correlates. Behavioral and Brain Functions 2014, 10:32.
... For example, briefly viewing an image of a face with a fearful expression, rather than a neutral expression, can increase a person's ability to identify features of line gratings shown next (Phelps et al. 2006). Balconi et al. (2014) showed schoolaged children a series of emotional face stimuli and explored their comprehension of the emotions (Balconi et al. 2014). They found that children reported anger, fear and surprise as more arousing in comparison to happiness and sadness. ...
... For example, briefly viewing an image of a face with a fearful expression, rather than a neutral expression, can increase a person's ability to identify features of line gratings shown next (Phelps et al. 2006). Balconi et al. (2014) showed schoolaged children a series of emotional face stimuli and explored their comprehension of the emotions (Balconi et al. 2014). They found that children reported anger, fear and surprise as more arousing in comparison to happiness and sadness. ...
Reciprocal interactions require memories of social exchanges; however, little is known about how we remember social partners' actions, especially during childhood, when we start forming peer-to-peer relationships. This study examined whether the expectation-violation effect, which has been observed in adults' source memory, exists among 5–6-year-old children. Forty participants played a coin collection game in which they either received or lost coins after being shown an individual with a smiling or angry expression. This set-up generated congruent (smiling-giver and angry-taker) versus incongruent (smiling-taker and angry-giver) conditions. In the subsequent tasks, the children were asked to recall which actions accompanied each individual. The children judged individuals in the incongruent conditions to be stranger than those in the congruent conditions, suggesting that the former violated the children's emotion-based expectations. However, no heightened source memory was found for the incongruent condition. Instead, children seemed to recognise the actions of angry individuals better than those of smiling individuals, suggesting that angry facial expressions are more salient for children's source memory in a social exchange.
... In the latter case, biometric tools attempt to classify an individual's visual attentive process and emotional expression in terms of emotional arousal (intensity) as well as emotional valence (positive versus negative hedonic affect) (Barrett, 2006; Kron et al., 2015; Russell, 1980). Often, a biometric tool (see Figure 3) will provide information on one dimension more than another; galvanic skin responses (GSR) and electrocardiography (ECG), for example, are often used as measures of arousal but not valence (Berntson et al., 1997; Boucsein, 1999; De Santos et al., 2011; Torres et al., 2013), while facial electromyography (EMG) and Facial Emotion Recognition (FER) can provide valenced information but are less likely to provide an accurate measure of intensity, although intensity can contribute to the correct identification of an expression (Balconi et al., 2014). Thus, the use of multiple sensors in a multimodal approach is necessary to provide the most comprehensive profile of an individual's psychophysiological reaction to stimuli (Balconi et al., 2014; De Santos et al., 2011). Within the context of social interaction, a limitation of conventional self-report methods is that one's feelings, attitudes and reactions may not be articulated to their fullest extent. ...
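The division of labour described in this excerpt (GSR/ECG informing arousal, facial EMG informing valence) can be sketched as a toy multimodal fusion step. Everything below is an illustrative assumption, not a method from the cited studies: the feature names, the 0–1 normalization, and the weights are all hypothetical.

```python
# Toy multimodal fusion: map hypothetical normalized biometric features onto
# the two affect dimensions (valence, arousal). Channel-to-dimension mapping
# follows the excerpt above; weights and ranges are invented for illustration.
from dataclasses import dataclass


@dataclass
class BiometricSample:
    gsr: float          # skin conductance, normalized 0..1 (arousal channel)
    hr_delta: float     # heart-rate change from baseline, normalized -1..1
    zygomaticus: float  # smile-muscle EMG, normalized 0..1 (positive valence)
    corrugator: float   # frown-muscle EMG, normalized 0..1 (negative valence)


def affect_estimate(s: BiometricSample) -> tuple[float, float]:
    """Return (valence, arousal), each clamped to [-1, 1]."""
    # GSR and HR magnitude feed arousal; neither carries a sign for valence.
    arousal = max(-1.0, min(1.0, 0.6 * s.gsr + 0.4 * abs(s.hr_delta)))
    # EMG channels carry valence: smiling pulls positive, frowning negative.
    valence = max(-1.0, min(1.0, s.zygomaticus - s.corrugator))
    return valence, arousal


sample = BiometricSample(gsr=0.8, hr_delta=0.5, zygomaticus=0.1, corrugator=0.7)
print(affect_estimate(sample))  # negative valence, high arousal
```

The point of the sketch is only that no single channel recovers both dimensions, which is why the excerpt argues for a multimodal approach.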
... Within the study of emotions, an increase in HR acceleration has been observed in response to pleasant patterns, and an increase in HR deceleration in response to unpleasant patterns (Balconi, Vanutelli, and Finocchiaro 2014). Variations according to the intensity of the stimulus have also been observed: low-intensity stimuli are related to a deceleration of the HR, while intense stimuli have been seen to activate defence responses associated with an acceleration of the HR (Balconi, Vanutelli, and Finocchiaro 2014). ...
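The acceleration/deceleration pattern described in this excerpt can be sketched as a minimal labelling rule over heart-rate change. The threshold, window, and example inter-beat intervals below are hypothetical choices for illustration, not values from the cited work.

```python
# Minimal sketch of the cardiac response pattern described above:
# orienting deceleration vs. defensive acceleration, labelled from the
# change in mean heart rate after stimulus onset. Threshold is assumed.

def mean_hr(ibis):
    """Mean heart rate in beats per minute from inter-beat intervals (s)."""
    return 60.0 / (sum(ibis) / len(ibis))


def hr_response(baseline_ibis, post_ibis, threshold_bpm=2.0):
    """Label the cardiac response as acceleration, deceleration, or no change."""
    delta = mean_hr(post_ibis) - mean_hr(baseline_ibis)
    if delta > threshold_bpm:
        return "acceleration"   # pattern linked to intense, defensive responding
    if delta < -threshold_bpm:
        return "deceleration"   # pattern linked to orienting to mild stimuli
    return "no change"


baseline = [0.85, 0.84, 0.86, 0.85]   # ~70 bpm resting
post = [0.78, 0.77, 0.79, 0.78]       # shorter intervals -> faster heart rate
print(hr_response(baseline, post))    # prints "acceleration"
```

In practice such a rule would operate on artifact-corrected interval series and subject-specific baselines; the sketch only fixes the direction-of-change logic.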
This paper is based on the second stage of a design-based research (DBR) project encompassing the initial prototyping of virtual reality (VR) simulation in paramedicine education using self-reported and biometric feedback data. In this discussion paper we present the range of reflections and theoretical possibilities that arose from the piloting experience and their implications for redesigning practice in paramedicine education. We focus on the foundational literature and epistemological understandings coming from neurophenomenological cognitive science applied in technology-enhanced learning, using mixed reality (MR) in paramedicine simulation learning as a case. We do so following the logic of a DBR methodological framework, in part demonstrating the usefulness of DBR when reflecting on applied practice to inform newer theoretical developments, leading to further integrated solutions in future practice. In addition, we draw attention to a conceptual shift from a focus on VR to a focus on MR, with emphasis on the associated benefits offered by MR learning situations within paramedicine education. Finally, we discuss the benefits of incorporating self-reported and biometric feedback data in paramedicine education in particular, and in technology-enhanced learning in general, for the design of meaningful learning experiences informed by learners' emotional and physiological responses.
... Pictures of smiling people (Balconi et al., 2014), facial expression manipulation towards more smiling (Boiten, 1996), and recall of a joyful moment (Kop et al., 2011). ...
Autonomic nervous system (ANS) activity is a fundamental component of emotional responding. It is not clear, however, whether positive emotional states are associated with differential ANS reactivity. To address this issue, we conducted a meta-analytic review of 120 articles (686 effect sizes, total N = 6,546), measuring ANS activity during 11 elicited positive emotions, namely amusement, attachment love, awe, contentment, craving, excitement, gratitude, joy, nurturant love, pride, and sexual desire. We identified a widely dispersed collection of studies. Univariate results indicated that positive emotions produce no or weak and highly variable increases in ANS reactivity. However, the limitations of work to date – which we discuss – mean that our conclusions should be treated as empirically grounded hypotheses that future research should validate.
... As both systems show a close functional relationship (Phillips et al., 2003), it can be argued that adaptive processing (see, e.g., emotion-generative and -regulative cycles, the "process model" by Gross and Thompson, 2007) theoretically enables both alignment procedures. In this context, strong associations between subjective and physiological arousal were shown (Alpers, Adolph, & Pauli, 2011; Balconi, Vanutelli, & Finocchiaro, 2014; Tan et al., 2016). Furthermore, judgements of arousal were previously shown to covary systematically with biological reflexes that are associated with the human defensive motive system (for an overview see Bradley et al., 2001a, b). ...
Subjective emotional arousal in typically developing adults was investigated in an explorative study. 177 participants (20–70 years) rated facial expressions and words for self-experienced arousal and perceived intensity, and completed the Difficulties in Emotion Regulation scale and the Hospital Anxiety and Depression scale (HADS-D). Exclusion criteria were psychiatric or neurological diseases, or clinically relevant scores in the HADS-D. Arousal regarding faces and words was significantly predicted by emotional clarity. Separate analyses showed following significant results: arousal regarding faces and arousal regarding words constantly predicted each other; negative faces were predicted by age and intensity; neutral faces by gender and impulse control; positive faces by gender and intensity; negative words by emotional clarity; and neutral words by gender. Males showed higher arousal scores than females regarding neutral faces and neutral words; for the other arousal scores, no explicit group differences were shown. Cluster analysis yielded three distinguished emotional characteristics groups: “emotional difficulties disposition group” (mainly females; highest emotion regulation difficulties, depression and anxiety scores; by trend highest arousal), “low emotional awareness group” (exclusively males; lowest awareness regarding currently experienced emotions; by trend intermediate arousal), and a “low emotional difficulties group” (exclusively females; lowest values throughout). No age effect was shown. Results suggest that arousal elicited by facial expressions and words are specialized parts of a greater emotional processing system and that typically developing adults show some kind of stable, modality-unspecific dispositional baseline of emotional arousal. 
Emotional awareness, emotional clarity, and impulse control are probably trait aspects of emotion regulation that influence emotional arousal in typically developing adults and can be regarded as aspects of meta-emotion. Different emotional personality styles were shown between as well as within gender groups.
... Moreover, the strength of the corrugator supercilii activations strongly correlated with the empathy profile of individuals. This research failed to find a link between alexithymia and embodiment of sad expressions, a result that is not surprising, since it is well known that the expression of sadness produces less arousal compared with the other negative emotions (for example, see Balconi et al. 2014), probably reducing the embodiment effect. The results on the emotion of anger are apparently in contrast with the results of another study, in which rapid facial reactions (RFRs) were measured in alexithymia (Peasley-Miklus et al. 2016). ...
... This so-called orienting response is followed by increased sympathetic control indexed by cardiac acceleration (Bradley, Codispoti, Cuthbert, & Lang, 2001). Contrary to the simple fight-or-flight type of response, cardiac acceleration results from both stimulating negative and positive events and is thus related to stimulus arousal rather than valence (Balconi, Vanutelli, & Finocchiaro, 2014; for a review see Kreibig, 2010). ...
... Conversely, it has been suggested that the emotional state of the observer may influence the degree of mimicry, such that observed expressions congruent with the perceiver's emotional state are more quickly and easily mimicked (e.g., Niedenthal et al., 2001). Furthermore, it has been shown that emotional empathy, i.e., the process whereby perception of others' emotions generates the same emotional state in the perceiver (e.g., de Waal, 2008; Jankowiak-Siuda et al., 2015), is related to the magnitude of facial muscle activity (e.g., Sonnby-Borgström, 2002; Sonnby-Borgström et al., 2003; Dimberg et al., 2011; Balconi and Canavesio, 2013; Balconi et al., 2014). For example, using static prototypical facial expressions of happiness and anger, Dimberg et al. (2011) reported that high-empathic subjects responded with greater CS activity to angry compared to happy faces and with larger ZM activity to happy faces compared to angry faces. ...
Facial mimicry is the tendency to imitate the emotional facial expressions of others. Increasing evidence suggests that the perception of dynamic displays leads to enhanced facial mimicry, especially for happiness and anger. However, little is known about the impact of dynamic stimuli on facial mimicry for fear and disgust. To investigate this issue, facial EMG responses were recorded in the corrugator supercilii, levator labii, and lateral frontalis muscles while participants viewed static (photos) and dynamic (videos) facial emotional expressions. Moreover, we tested whether emotional empathy modulated facial mimicry for emotional facial expressions. In accordance with our predictions, the highly empathic group responded with larger activity in the corrugator supercilii and levator labii muscles. Moreover, dynamic compared to static facial expressions of fear revealed enhanced mimicry in the high-empathic group in the frontalis and corrugator supercilii muscles. In the low-empathic group, the facial reactions were not differentiated between fear and disgust for either dynamic or static facial expressions. We conclude that highly empathic subjects are more sensitive in their facial reactions to facial expressions of fear and disgust than their low-empathic counterparts. Our data confirm that personal characteristics (i.e., empathy traits), as well as the modality of the presented stimuli, modulate the strength of facial mimicry. In addition, measures of EMG activity of the levator labii and frontalis muscles may be a useful index of empathic responses of fear and disgust.
... These two clips, along with one other, primarily evoked the emotion of fear, whereas the remaining two clips primarily evoked sadness. As anxiety produces a greater physiological response than sadness, our results may reflect these differences in physiological responding to differing emotions (Balconi et al., 2014). Future research may want to focus on one specific emotion or use a greater number of trials in order to determine differences between emotions. ...
While there is a general consensus in the literature that individuals with autism spectrum disorder have difficulty with cognitive empathy, much less is known about emotional empathy processing in these individuals. Most research has employed subjective self-report measures, which can often be misinterpreted or under-reported/over-reported. More objective measures such as psychophysiological recordings of arousal offer a more objective response. Furthermore, combining physiological responses with self-report ratings allows us to explore the relationship between these two responses to emotionally charged stimuli. A total of 25 individuals with autism spectrum disorder were compared with 25 matched controls on their physiological (arousal) and psychological (self-report) responses to emotionally distressing video scenes. These responses were also then compared with self-report cognitive and emotional trait empathy. Results indicate that while individuals with autism spectrum disorder appear to respond similarly to controls physiologically, their interpretation of this response is dampened emotionally. Furthermore, this dampening of self-report emotional response is associated with a general reduction in trait empathy.
Real-life faces are dynamic by nature, particularly when expressing emotion. Increasing evidence suggests that the perception of dynamic displays enhances facial mimicry and induces activation in widespread brain structures considered to be part of the mirror neuron system, a neuronal network linked to empathy. The present study is the first to investigate the relations among facial muscle responses, brain activity, and empathy traits while participants observed static and dynamic (video) facial expressions of fear and disgust. During display presentation, the blood-oxygen-level-dependent (BOLD) signal as well as muscle reactions of the corrugator supercilii and levator labii were recorded simultaneously from 46 healthy individuals (21 females). It was shown that both fear and disgust faces caused activity in the corrugator supercilii muscle, while perception of disgust additionally produced facial activity in the levator labii muscle, supporting a specific pattern of facial mimicry for these emotions. Moreover, individuals with higher, compared to individuals with lower, empathy traits showed greater activity in the corrugator supercilii and levator labii muscles; however, these responses were not differentiable between static and dynamic modes. Conversely, neuroimaging data revealed motion- and emotion-related brain structures in response to dynamic rather than static stimuli among high-empathy individuals. In line with this, there was a correlation between electromyography (EMG) responses and brain activity, suggesting that the mirror neuron system, the anterior insula and the amygdala might constitute the neural correlates of automatic facial mimicry for fear and disgust. These results revealed that the dynamic property of (emotional) stimuli facilitates the emotion-related processing of facial expressions, especially among those with high trait empathy.
Evaluative processes refer to the operations by which organisms discriminate threatening from nurturant environments. Low activation of positive and negative evaluative processes by a stimulus reflects neutrality, whereas high activation of such processes reflects maximal conflict. Attitudes, an important class of manifestations of evaluative processes, have traditionally been conceptualized as falling along a bipolar dimension, and the positive and negative evaluative processes underlying attitudes have been conceptualized as being reciprocally activated, making the bipolar rating scale the measure of choice. Research is reviewed suggesting that this bipolar dimension is insufficient to portray comprehensively positive and negative evaluative processes and that the question is not whether such processes are reciprocally activated but under what conditions they are reciprocally, nonreciprocally, or independently activated. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
Previous studies examined how mood affects children's accuracy in matching emotional expressions and labels (label-based tasks). This study was the first to assess how induced mood (positive, neutral, or negative) influenced 5- to 8-year-olds' accuracy and reaction time using both context-based tasks, which required inferring a character's emotion from a vignette, and label-based tasks. Both tasks required choosing one of four facial expressions to respond. Children responded more accurately to label-based questions relative to context-based questions at 5 to 7 years of age, but showed no differences at 8 years of age, and when the emotional expression being identified was happiness, sadness, or surprise, but not disgust. For the context-based questions, children were more accurate at inferring sad and disgusted emotions compared to happy and surprised emotions. Induced positive mood facilitated 5-year-olds' processing (decreased reaction time) in both tasks compared to induced negative and neutral moods. Results demonstrate how task type and children's mood influence children's emotion processing at different ages.
Children's performance on free labeling of prototypical facial expressions of basic emotions is modest and improves only gradually. In 3 data sets (N = 80, ages 4 or 5 years; N = 160, ages 2 to 5 years; N = 80, ages 3 to 4 years), errors remained even when method factors (poor stimuli, unavailability of an appropriate label, or the difficulty of a production task) were controlled. Children's use of emotion labels increased with age in a systematic order: Happy, angry, and sad emerged early and in that order, were more accessible, and were applied broadly (overgeneralized) but systematically. Scared, surprised, and disgusted emerged later and often in that order, were less accessible, and were applied narrowly.
Despite the burgeoning literature using facial electromyography (EMG) to study cognitive and emotional processes, the psychometric properties of facial EMG measurement have received little attention. Two experiments were conducted to assess the reliability and validity of facial EMG as a measure of specific facial actions. In Experiment 1, two recording sites in the brow region were compared for their ability to differentiate facial actions hypothesized to be due to the activation of the corrugator supercilii from facial actions presumed to be due to the activation of proximate muscles (e.g. depressor supercilii, procerus, frontalis, levator labii superioris alaeque nasi, orbicularis oculi), and four sites in the infraorbital triangle were compared for their ability to differentiate facial actions hypothesized to be due to the activation of the zygomaticus major from facial actions presumed to be due to the activation of proximate muscles (e.g. zygomaticus minor, risorius, buccinator, orbicularis oculi, orbicularis oris). Fifteen subjects were instructed to pose a series of facial actions while EMG activity was sampled simultaneously at all sites. In Experiment 2, 5 subjects returned to the laboratory for a more extensive investigation of surface EMG activity over the zygomaticus major muscle region. The results of this experiment confirmed the findings of Experiment 1. Overall, the results demonstrate that certain recording sites located over specific facial muscle regions are more sensitive and valid indices of particular facial actions than other nearby sites.
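The site-comparison logic in the abstract above can be illustrated with a short numerical sketch: quantify surface EMG amplitude at a site via root-mean-square (RMS), then compare activity during the target action against activity during a proximate-muscle action. The signal values, the selectivity ratio, and the segment lengths are hypothetical illustrations, not the study's data or procedure.

```python
# Illustrative sketch (assumed values): quantifying surface EMG amplitude with
# root-mean-square (RMS) and comparing a site's response to a target facial
# action vs. an action driven by a proximate muscle. A site whose RMS is much
# larger during the target action is a more selective index of that action.
import math


def rms(samples):
    """Root-mean-square amplitude of an EMG segment (e.g., in microvolts)."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))


def site_selectivity(target_action, other_action):
    """Ratio of RMS during the target action to RMS during a proximate-muscle
    action; higher values suggest the site indexes the target more selectively."""
    return rms(target_action) / rms(other_action)


# Hypothetical raw segments recorded at one brow-region site:
during_frown = [4.0, -5.0, 6.0, -4.5]   # corrugator-driven action (target)
during_blink = [1.0, -1.2, 0.9, -1.1]   # proximate-muscle action
print(round(site_selectivity(during_frown, during_blink), 1))
```

Real facial-EMG pipelines band-pass filter and rectify the raw signal before amplitude estimation; the sketch keeps only the amplitude-comparison step that motivates choosing one recording site over a nearby one.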