Neuroscience Letters 445 (2008) 135–139
Music-induced mood modulates the strength of emotional negativity bias:
An ERP study
Jie Chen a,b, Jiajin Yuan a,b, He Huang a,b, Changming Chen a,b, Hong Li a,b,∗
a Key Laboratory of Cognition and Personality (SWU), Ministry of Education, Southwest University, Chongqing 400715, China
b School of Psychology, Southwest University, Chongqing 400715, China
Article info
Received 21 May 2008; received in revised form 1 August 2008; accepted 25 August 2008.
Keywords: emotional negativity bias

Abstract
The present study investigated the effect of music-elicited moods on subsequent affective processing through a music-primed valence categorization task. Event-related potentials (ERPs) were recorded for positive and negative emotional pictures that were primed by happy or sad music excerpts. The reaction time data revealed longer reaction times (RTs) for pictures following negative versus positive music pieces, irrespective of the valence of the picture. Additionally, positive pictures elicited faster response latencies than negative pictures, irrespective of the valence of the musical prime. Moreover, the music by picture valence interaction was significant for the P2 amplitudes and for the averaged amplitudes in the 500–700 ms interval. Negative pictures elicited smaller P2 amplitudes than positive pictures, and the amplitude differences between negative and positive pictures were larger with negative musical primes than with positive musical primes. Similarly, compared to positive pictures, negative pictures elicited more negative deflections during the 500–700 ms interval across prime types, and these deflections were again larger with negative than with positive music primes at this interval. Therefore, the present study observed a clear emotional negativity bias under either prime condition, and extended previous findings by showing an increased strength of the negative bias under negative mood primes. This suggests that the neural sensitivity of the brain to negative stimuli varies with individuals' mood states, and that this bias is particularly intensified by negative moods.
© 2008 Elsevier Ireland Ltd. All rights reserved.
Considerable research has revealed the existence of a negativity bias in emotional processing: people are particularly sensitive to emotionally negative events, such that negative events are often preferentially processed compared to neutral and positive events [9,16,17,20,27]. Behavioral studies have suggested that negative traits are given greater weight in overall evaluations than are positive traits, and a greater weight for negative than for positive information has been manifested in risk-taking research. Moreover, a growing body of event-related brain potential studies documents a bias towards negative over positive information [16,20,23,27]. For instance, it was reported that larger P1 amplitudes were elicited by rare negative stimuli than by rare positive stimuli as early as 110 ms after stimulus onset. In addition, at approximately 170 ms after stimulus onset, smaller P2 amplitudes were observed for highly negative stimuli than for moderately negative and neutral stimuli [20,27]. More recently, it was further demonstrated that the emotional negativity bias also occurs in reaction readiness.

∗ Corresponding author at: School of Psychology, Southwest University, Beibei, Chongqing 400715, China. Tel.: +86 23 6825 4337; fax: +86 23 6825 2309. E-mail address: email@example.com (H. Li).
As described above, it is clear that the emotional negativity bias occurs at each step of the information-processing stream. Nevertheless, little is known about whether mood states influence the processing bias for negative stimuli [7,8]. According to Bower's network model of emotion and cognition [7,8], negatively distorted thinking influences the judgment of events and situations, and the perception of emotional information is colored by the underlying emotional state. In this view, it is predictable that variations in mood will modulate the occurrence or features of the negativity bias during human emotional processing. It has been shown that subjects perceived more sadness in ambiguous faces and less happiness in clear faces during a depressive mood induction than during an elation mood induction; this result suggested a depression-related negative bias in the perception of facial affect. More recently, behavioral as well as neuroimaging studies have consistently demonstrated that depressive and anxious patients, as compared to healthy individuals, display an increased strength of the emotional negativity bias
during the perception of emotional words, images as well as faces.

Nevertheless, whether the processing bias of the brain for negative stimuli is modulated by mood changes in normal individuals, and the electrophysiological mechanism that accounts for this phenomenon, remain to be determined. With its high temporal resolution, the ERP technique is well suited to investigating the time course of the potential modulation of the negative bias by mood primes. Based on these considerations, the present study examined the effects of music-induced moods (sadness or happiness) on the strength of the negativity bias in emotional processing via ERP measures. Specifically, as predicted by the established emotional negativity bias, we hypothesized that the neural responses to negative images would be more intense than those to positive images, irrespective of the music primes. Moreover, based on prior studies [6,10,14,21], it was predicted that the strength of the emotional negativity bias, as indexed by the difference ERPs between negative and positive images, would be larger with negative music primes than with positive music primes.

Fig. 1. Grand average ERPs at Fz, FCz, Cz and CPz for NN (thick solid lines), NP (thin solid lines), PN (thick dotted lines), and PP (thin dotted lines).
The present study used an affective priming paradigm. Negative (sad) and positive (happy) music pieces, which are believed to be effective mood-inducing stimuli, were used to induce negative (sad) and positive (happy) moods [5,18,19]. Emotional pictures were used as the task-relevant targets. Because a cultural bias of the International Affective Picture System (IAPS) has been reported in Chinese subjects, the pictures selected to elicit emotional responses in the present study were taken from the native Chinese Affective Picture System (CAPS). Similarly, the mood-inducing musical pieces were selected from a pool of Chinese classical music that is more familiar to Chinese participants, so as to guarantee the effectiveness of the mood induction.
Twelve native Chinese students (7 women and 5 men) aged 20–25 years (mean age, 22.8 years) participated in the study as paid volunteers. All subjects were healthy, right-handed, with normal or corrected-to-normal vision and audition, and reported no history of affective disorder. Each subject signed an informed consent form in accordance with the ethical principles of the 1964 Declaration of Helsinki.
The stimulus material consisted of 232 prime–target pairs distributed over 7 blocks, in which music excerpts served as primes and pictures as targets. The pairs were divided into four experimental conditions: 58 NN trials (negative music as prime and negative picture as target), 58 NP trials (negative music as prime and positive picture as target), 58 PP trials (positive music as prime and positive picture as target), and 58 PN trials (positive music as prime and negative picture as target).

Table 1. Summary of the results of the 2 (music: negative and positive) × 2 (picture: negative and positive) × 3 (laterality: left, midline and right sites) × 4 (frontality: frontal, fronto-central, central and centro-parietal sites) ANOVAs for the 500–550, 550–600, 600–650 and 650–700 ms time windows.
Picture targets consisted of 116 emotionally negative and 116 emotionally positive pictures selected from the CAPS. The negative pictures were divided into two groups of 58 pictures each; there was no significant difference between the two groups in valence [t(39) = −0.37, p = 0.71] or arousal [t(39) = −0.7, p = 0.94]. In the same way, the positive pictures were divided into two groups of 58 pictures each, with no significant difference between the groups in valence [t(39) = −0.17, p = 0.87] or arousal [t(39) = −0.11, p = 0.92]. In addition, the luminance level of the pictures was matched across the four conditions, and the contrast of the monitor was set to a constant value across the experiment.
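The matching procedure above, splitting a stimulus set into two halves and verifying with t-tests that neither valence nor arousal differs, can be sketched as follows. This is a minimal illustration, not the authors' actual selection script: the ratings, the random seed, and the function name are invented stand-ins for the CAPS norms.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical nine-point-scale ratings for 116 negative pictures;
# illustrative stand-ins for the CAPS normative values.
valence = rng.normal(2.5, 0.6, 116)
arousal = rng.normal(5.8, 0.7, 116)

def split_matched(valence, arousal, n_iter=2000, alpha=0.05):
    """Randomly split the stimulus set into two halves and keep the first
    split in which neither valence nor arousal differs significantly
    between halves (independent-samples t-tests)."""
    n = len(valence)
    for _ in range(n_iter):
        idx = rng.permutation(n)
        a, b = idx[: n // 2], idx[n // 2:]
        if (stats.ttest_ind(valence[a], valence[b]).pvalue > alpha
                and stats.ttest_ind(arousal[a], arousal[b]).pvalue > alpha):
            return a, b
    raise RuntimeError("no matched split found")

group1, group2 = split_matched(valence, arousal)
```

With 116 items this yields two groups of 58, mirroring the picture split described above; a random split of homogeneous ratings usually passes on the first few iterations.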
The prime stimuli consisted of 75 negative (sad) and 75 positive (happy) excerpts of Chinese classical music. All were instrumental and were rated by 40 non-musician college students (20 women and 20 men). The excerpts were rated for valence on a nine-point scale (1 = 'extremely sad' to 9 = 'extremely happy') and for arousal on a nine-point scale (1 = 'not arousing at all' to 9 = 'extremely arousing'). Finally, a total of 116 excerpts were selected for the EEG experiment: 58 negative and 58 positive pieces, with valence differing significantly (t = −36.34, p < 0.001) and arousal being similar (t = −0.687, p = 0.5) between the two music groups (mean valence was 3.01 for sad music and 6.99 for happy music, whereas mean arousal for sad and happy excerpts was 5.72 and 5.8, respectively). Musical excerpts were matched for length: happy excerpts lasted 11.72 s on average and sad excerpts 11.93 s on average.
Subjects were seated in a quiet room at approximately 150 cm from a computer screen, with horizontal and vertical visual angles below 5°. To familiarize participants with the task, the experiment started with 12 practice trials, 3 under each condition. Each trial was initiated by a 300-ms presentation of a small white cross on the black computer screen; then the auditory presentation of a musical excerpt via an earphone was followed by the visual presentation of the target emotional image (presentation time for targets was 1000 ms). The interstimulus interval (ISI) between prime and target was 200 ms. Half of the subjects were instructed to press the "F" key on the keyboard (as accurately and quickly as possible) if a pleasant image appeared and the "J" key if an unpleasant image appeared; for the remaining subjects the response pattern was reversed. The stimulus picture was terminated by a key press, or after 1000 ms had elapsed; accordingly, each subject was informed that responses had to be made within 1000 ms. Each response was followed by a 2000-ms blank screen. Trials were presented randomly and distributed over seven blocks.
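The trial structure above (300-ms fixation, musical prime, 200-ms ISI, up to 1000 ms of target, 2000-ms blank; 58 trials per condition in random order) can be sketched as a small configuration. The `Trial` class and builder function are hypothetical names for illustration, not part of the original experiment code.

```python
from dataclasses import dataclass
import random

# Trial timings from the procedure above (ms); the musical prime itself
# lasts ~12 s and is not included in this fixed overhead.
FIXATION_MS, ISI_MS, TARGET_MAX_MS, BLANK_MS = 300, 200, 1000, 2000
FIXED_OVERHEAD_MS = FIXATION_MS + ISI_MS + TARGET_MAX_MS + BLANK_MS

@dataclass
class Trial:
    music_valence: str    # prime: "negative" or "positive"
    picture_valence: str  # target: "negative" or "positive"

def build_trial_list(n_per_condition=58, seed=0):
    """Build the 4 x 58 = 232 trials (NN, NP, PN, PP) in random order."""
    trials = [
        Trial(m, p)
        for m in ("negative", "positive")
        for p in ("negative", "positive")
        for _ in range(n_per_condition)
    ]
    random.Random(seed).shuffle(trials)
    return trials

trials = build_trial_list()
```

In a real presentation script the shuffled list would additionally be cut into the seven blocks mentioned above.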
Electroencephalography (EEG) was recorded from 64 scalp sites, with the ground electrode placed on the medial frontal aspect. Eye movements were monitored with electrooculogram (EOG) electrodes placed at the canthi. Electrode impedance was maintained below 5 kΩ. EEG and EOG activity was amplified with a DC to 100 Hz bandpass and continuously sampled at 500 Hz/channel. EEG data were corrected to a 200-ms baseline prior to the onset of the target. Artifact-free EEG segments from trials with correct responses were averaged separately for each experimental condition. ERP averages were computed off-line; trials with EOG artifacts (mean EOG voltage exceeding ±80 μV), amplifier clipping artifacts, or peak-to-peak deflections exceeding ±80 μV were excluded from averaging.
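The preprocessing steps above (baseline correction against the 200-ms pre-target window, rejection of trials exceeding the ±80 μV peak-to-peak criterion, per-condition averaging) can be sketched in NumPy on simulated single-trial epochs. The array shapes, noise model, and exact thresholding of "exceeding ±80 μV" are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
FS = 500                 # sampling rate (Hz), as in the recording above
N_BASE = int(0.2 * FS)   # 200-ms pre-target baseline = 100 samples
N_SAMP = N_BASE + FS     # baseline + 1000 ms post-onset

# Hypothetical single-trial epochs for one condition:
# trials x channels x samples, in microvolts.
epochs = rng.normal(0.0, 5.0, (58, 64, N_SAMP))

# Baseline correction: subtract each trial's mean over the 200-ms
# pre-stimulus window, per channel.
epochs -= epochs[:, :, :N_BASE].mean(axis=2, keepdims=True)

# Reject any trial whose peak-to-peak deflection exceeds 80 microvolts
# on any channel (one reading of the +/-80 uV criterion).
ptp = epochs.max(axis=2) - epochs.min(axis=2)
keep = (ptp <= 80.0).all(axis=1)

# Condition average over the surviving trials: channels x samples.
erp = epochs[keep].mean(axis=0)
```

In practice this per-condition average would be computed only over trials with correct responses, as stated above.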
The EEG activity for correct responses in each valence condition was averaged separately. ERP waveforms were time-locked to the onset of the pictures, and the averaging epoch included the 200-ms pre-stimulus baseline. In the grand-averaged waveforms and topographical maps, negative pictures elicited more negative deflections than positive pictures, irrespective of the prime type, and these differences were largest at left frontal sites. In order to test this lateralization, the following 12 electrode sites were selected for statistical analysis: F3, FC3, C3, CP3 (four left sites); Fz, FCz, Cz, CPz (four midline sites); and F4, FC4, C4, CP4 (four right sites). The latencies and peak amplitudes of the P2 component, as well as the average amplitudes during 500–700 ms, were measured. A four-way repeated-measures analysis of variance (ANOVA) was conducted on the amplitude and latency of each component. The ANOVA factors were music (negative and positive), picture (negative and positive), laterality (left, midline and right sites) and frontality (frontal, fronto-central, central and centro-parietal sites). The degrees of freedom of the F-ratio were corrected according to the Greenhouse–Geisser method.
A two-way repeated-measures ANOVA, with music and picture valence as factors, was performed on the mean reaction time (RT) and accuracy rate (ACC). The analysis of RT revealed significant main effects of music [F(1, 11) = 48.63, p < 0.001] and picture [F(1, 11) = 29.98, p < 0.001], but no significant interaction between music and picture valence [F(1, 11) = 0.9, p = 0.363]. Longer RTs were recorded for images following negative (M = 640.78 ms, S.E. = 23.26 ms) versus positive music pieces (M = 614.16 ms, S.E. = 24.17 ms), irrespective of the valence of the picture. Moreover, the behavioral responses for negative pictures (M = 668.61 ms, S.E. = 26.45 ms) were delayed compared to positive pictures (M = 586.34 ms, S.E. = 23.04 ms), irrespective of the valence of the musical prime. In addition, false responses were rare; mean accuracy rates for the NN, NP, PP and PN conditions were 94% (S.E. = 0.012), 95% (S.E. = 0.016), 96.4% (S.E. = 0.014), and there were no significant main or interaction effects for mean ACC.
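For a 2 × 2 within-subject design such as the RT analysis above, each F(1, 11) can be obtained as the squared paired t statistic of a within-subject contrast applied to the per-subject condition means. The sketch below uses simulated per-subject RTs (the individual values are invented; only the condition-mean pattern echoes those reported), so the resulting F values are illustrative, not the published ones.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical per-subject mean RTs (ms) for 12 subjects; columns are
# the four conditions in the order NN, NP, PN, PP.
rt = np.column_stack([
    rng.normal(695, 40, 12),   # NN: negative music, negative picture
    rng.normal(610, 40, 12),   # NP: negative music, positive picture
    rng.normal(645, 40, 12),   # PN: positive music, negative picture
    rng.normal(565, 40, 12),   # PP: positive music, positive picture
])

def contrast_f(data, contrast):
    """F(1, n-1) for a within-subject contrast: apply the contrast to each
    subject's condition means, then square the one-sample t statistic."""
    d = data @ np.asarray(contrast)
    n = len(d)
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))
    return t ** 2

f_music = contrast_f(rt, [0.5, 0.5, -0.5, -0.5])        # negative vs positive prime
f_picture = contrast_f(rt, [0.5, -0.5, 0.5, -0.5])      # negative vs positive picture
f_interaction = contrast_f(rt, [1.0, -1.0, -1.0, 1.0])  # music x picture
```

With one numerator degree of freedom per effect, no sphericity correction is needed for this two-way design; the four-way ERP ANOVAs above additionally require the Greenhouse-Geisser correction.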
As shown in Fig. 1, a P2 component was elicited in all four conditions. A four-way repeated-measures ANOVA on the P2 amplitudes demonstrated a significant music by picture valence interaction [F(1, 11) = 6.58, p = 0.026]. To break down this interaction, we performed a simple-effects analysis in which the P2 amplitudes elicited by positive and negative pictures were compared within each priming condition. Negative images elicited smaller P2 amplitudes than positive images in both the positive music priming condition [M(negative pictures) = 3.33, S.E. = 0.88; M(positive pictures) = 4.04, S.E. = 1.03; F(1, 11) = 18.51, p = 0.001] and the negative music priming condition [M(negative pictures) = 3.1, S.E. = 0.99; M(positive pictures) = 4.33, S.E. = 1; F(1, 11) = 48.14, p < 0.001]. More importantly, the P2 difference between negative and positive pictures was larger with negative musical primes than with positive musical primes [F(1, 11) = 6.58, p = 0.026]. In addition, the interactions of laterality by picture valence and of frontality by picture valence [F(3, 11) = 4.27, p = 0.038] were both significant, and the amplitude differences between the negative and positive conditions were largest at left frontal sites. There were no significant main or interaction effects for the P2 latencies.
A four-way repeated-measures ANOVA on the average amplitudes showed significant music by picture valence interaction effects in each 50-ms bin of the 500–700 ms interval (Table 1). To break down this interaction, we compared the average amplitudes elicited by positive and negative pictures within the negative and the positive priming conditions, respectively. Negative images elicited more negative deflections than positive images with positive musical primes during the intervals of 500–550 ms [F(1, 11) = 12.19, p = 0.005], 550–600 ms [F(1, 11) = 15.44, p = 0.002], 600–650 ms [F(1, 11) = 29.48, p < 0.001] and 650–700 ms [F(1, 11) = 22.87, p = 0.001]. Furthermore, negative images elicited more negative deflections than positive images with negative musical primes during the intervals of 500–550 ms [F(1, 11) = 50.08, p < 0.001], 600–650 ms [F(1, 11) = 55.94, p < 0.001] and 650–700 ms [F(1, 11) = 38.52, p < 0.001]. More importantly, the ERP difference between negative and positive images was larger with negative musical primes than with positive musical primes during the intervals of 500–550 ms [F(1, 11) = 8.77, p = 0.013], 550–600 ms [F(1, 11) = 24.2, p < 0.001], 600–650 ms [F(1, 11) = 23.04, p = 0.001] and 650–700 ms [F(1, 11) = 9.41, p = 0.011] (see Fig. 2). In addition, there were significant laterality by picture valence and frontality by picture valence interaction effects (see Table 1), and the amplitude differences between negative and positive pictures were largest at left frontal sites (see Fig. 2).

Fig. 2. Top: the negative-picture-minus-positive-picture difference wave in the negative musical condition (solid line) and in the positive musical condition (dotted line) at FCz and Cz. Bottom left: topographical maps of voltage amplitudes for the difference wave in the positive musical condition in each 50-ms interval from 500 to 700 ms. Bottom right: the corresponding maps in the negative musical condition.
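The difference-wave analysis behind Fig. 2, negative-minus-positive waveforms within each prime type, summarized in 50-ms bins from 500 to 700 ms, can be sketched as follows. The simulated per-subject averages, the electrode choice, and the variable names are assumptions for illustration.

```python
import numpy as np

FS = 500                 # sampling rate (Hz)
N_BASE = int(0.2 * FS)   # samples in the 200-ms baseline

rng = np.random.default_rng(4)
n_sub, n_samp = 12, N_BASE + FS

# Hypothetical per-subject condition averages at one electrode (say FCz):
# subjects x samples for each prime x picture condition.
erp = {c: rng.normal(0.0, 1.0, (n_sub, n_samp)) for c in ("NN", "NP", "PN", "PP")}

# Negative-minus-positive picture difference waves within each prime type,
# as plotted in Fig. 2.
diff_neg_prime = erp["NN"] - erp["NP"]
diff_pos_prime = erp["PN"] - erp["PP"]

def window_mean(wave, start_ms, end_ms):
    """Per-subject mean amplitude in a post-onset window given in ms."""
    i0 = N_BASE + start_ms * FS // 1000
    i1 = N_BASE + end_ms * FS // 1000
    return wave[:, i0:i1].mean(axis=1)

# The four 50-ms bins analyzed within the 500-700 ms interval; each entry
# is the per-subject mood-modulation effect (difference of differences).
bins = [(500, 550), (550, 600), (600, 650), (650, 700)]
bin_means = {w: window_mean(diff_neg_prime, *w) - window_mean(diff_pos_prime, *w)
             for w in bins}
```

Each per-subject bin value could then be submitted to the repeated-measures ANOVAs described above.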
As for the behavioral data, longer RTs were observed for negative than for positive pictures. Negative stimuli tend to trigger extensive and time-consuming cognitive analysis, and this might be manifested in longer categorization times for negative than for positive stimuli in the valence categorization task. Thus, these behavioral data were consistent with previous studies and with the predictions based on the emotional negativity bias, indexing enhanced processing of negative emotional information, especially following negative primes.
At approximately 170 ms after stimulus onset, clear P2 activity was elicited in all four conditions, and smaller P2 amplitudes were detected for negative images than for positive images, irrespective of the musical prime. Frontal P2 activation within 200 ms is indicative of rapid detection of typical stimulus features. Thus, the smaller P2 for negative than for positive images is likely indicative of a rapid feature-detection process. This is consistent with previous studies, which observed smaller P2 amplitudes for highly negative stimuli than for moderately negative and neutral stimuli [20,27]. Because information processing within 200 ms probably occurs subconsciously, and because the frontal P2 activity within 200 ms was modulated by the valence of the emotional images across musical primes, the present study hypothesized a subconscious process that rapidly attends to the emotional salience of picture stimuli. Interestingly, the amplitude differences between negative and positive images were larger with negative musical primes than with positive musical primes, which indicates that the negativity bias was modulated by the music-primed mood even at the early processing stages in the present study, and that the bias of the brain for negative images is intensified by the negative mood induced by the sad music.
A prior ERP study of anxious subjects suggested that the amplitudes of the early contingent negative variation (CNV) were larger when anxious subjects were vigilant towards negative affective stimuli than towards positive affective stimuli; in contrast, this bias was absent in non-anxious subjects. It has been indicated that the amplitude of the early CNV is directly related to vigilance-related attention. Consistent with this view, it is most likely that, in the present study, the sad music provided a negative emotional context in which the neural sensitivity and vigilance of the brain to negative stimuli were enhanced compared to the positive music context. This probably contributed to the increased intensity of the early negative bias with sadness versus happiness primes during the processing of emotional pictures.

During the 500–700 ms interval, negative images elicited increased negativity as compared to positive images, independent of the type of music prime (Fig. 1). These differences were largest across the left frontal scalp regions and appeared as a slow negative wave (SNW) activity in the negative-minus-positive difference waveforms (Fig. 2). There
was some evidence suggesting that the SNW is an index of memory-related processes. In particular, it has been indicated that the emotional processing reflected by SNW activity involves a reference to emotional experiences stored in long-term memory, and that stimuli of greater saliency evoke greater SNW activity than stimuli of lesser saliency. Therefore, the SNW activity here could be interpreted as reflecting richer associations of emotional memories evoked by negative stimuli than by positive stimuli.
Negative stimuli are of greater adaptive importance for individuals, as they typically represent a salient threat to survival that is absent in positive stimuli. Therefore, the negative images in the present study were allocated more resources for later cognition-related processing than were the positive stimuli. More importantly, the negative bias during the memory-related stages, as indexed by the SNW, was increased with negative relative to positive music primes, which suggests that the neural sensitivity of the human brain to negative stimuli is enhanced under negative mood compared with positive mood. This is similar to prior evidence indicating a mood-congruent bias in memory tasks [4,12]. Therefore, it seems that the later memory-related negative bias, as indexed by the larger SNW activity for negative pictures, is intensified within a negative mood state.
Therefore, the processing bias of the brain for negative stimuli, termed the emotional negativity bias, occurred in both the negative and the positive mood conditions in the present study. This suggests that the existence of the emotional negativity bias is, to some extent, stable across mood states, despite the variability of its strength in different emotional contexts. On the other hand, in addition to repeated reports of the existence of the negativity bias during the processing of emotional stimuli [9,16,17,20,27], a body of studies has reported a conspicuous modulation of this bias by mood and affective state [10,14,21]. This suggests that individuals who differ in mental health may differ in emotional processing, and that this may influence the strength of the processing bias for emotionally negative stimuli. In fact, there are noticeable individual differences in the processing of emotional stimuli: differences in sex, as well as in personality traits closely related to emotional processing such as neuroticism and extroversion [11,20,22], contribute substantially to individual differences in emotional processing and, in particular, in the negativity bias. The present results indicate that mood variations within the same individual also influence the strength of the processing bias for negative stimuli, in addition to the established influences of such subject variables on emotional activity. Thus, the present study concludes that the processing bias for negative over positive stimuli, termed the emotional negativity bias, exists stably under different emotional contexts, while its strength may be influenced by individuals' mood states.
In conclusion, the current findings provide evidence that the emotional negativity bias occurs not only at the early stage of feature detection but also at the later cognitive and memory-related stages, that the strength of this bias is modulated by individuals' mood states at each step of the information-processing stream, and that negative mood intensifies the neural sensitivity of the brain to emotionally negative stimuli.
The authors thank Yue-Jia Luo and colleagues for developing and providing the native Chinese Affective Picture System, and Jun Zhong for assistance with the EEG recording and analysis.
References

[1] N.H. Anderson, Averaging versus adding as a stimulus-combination rule in impression formation, Journal of Personality and Social Psychology 2 (1965).
[2] L. Bai, H. Ma, Y.X. Huang, Y.J. Luo, The development of the native Chinese affective picture system, 19 (2005) 11.
[3] R.F. Baumeister, E. Bratslavsky, C. Finkenauer, K.D. Vohs, Bad is stronger than good, Review of General Psychology 5 (2001) 323–370.
[4] P.H. Blaney, Affect and memory: a review, Psychological Bulletin 99 (2) (1986).
[5] A.J. Blood, R.J. Zatorre, Intensely pleasurable responses to music correlate with activity in brain regions implicated in reward and emotion, Proceedings of the National Academy of Sciences 98 (2001) 11818–11823.
[6] A.L. Bouhuys, G.M. Bloem, T.G.G. Groothuis, Induction of depressed and elated mood by music influences the perception of facial emotional expressions in healthy subjects, Journal of Affective Disorders 33 (1995) 215–226.
[7] G.H. Bower, Mood and memory, American Psychologist 36 (1981) 129–148.
[8] G.H. Bower, Commentary on mood and memory, Behaviour Research and Therapy 25 (1987) 443–456.
[9] J.T. Cacioppo, W.L. Gardner, Emotion, Annual Review of Psychology 50 (1999).
[10] L. Carretié, F. Mercado, J.A. Hinojosa, M. Martín-Loeches, M. Sotillo, Valence-related vigilance biases in anxiety studied through event-related potentials, Journal of Affective Disorders 78 (2004) 119–130.
[11] D. Derryberry, M.A. Reed, Temperament and attention: orienting toward and away from positive and negative signals, Journal of Personality and Social Psychology 66 (1994) 1128–1139.
[12] D.E. Dietrich, H.E. Emrich, C. Waller, B.M. Wiering, S. Johannes, T.F. Munte, Emotion/cognition-coupling in word recognition memory of depressive patients: an event-related potential study, Psychiatry Research 96 (2000).
[13] P.E. Goode, P.H. Goddard, J. Pascual-Leone, Event-related potentials index cognitive style differences during a serial-order recall task, International Journal of Psychophysiology 43 (2002) 123–140.
[14] I.H. Gotlib, E. Krasnoperova, Biased information processing as a vulnerability factor for depression, Behavior Therapy 29 (1998) 603–617.
[15] Chinese Mental Health Journal 9 (2004) 631–634.
[16] Neuroscience Letters 398 (2006) 91–96.
[17] D. Kahneman, A. Tversky, Choices, values, and frames, American Psychologist 39 (1984) 341–350.
[18] S. Koelsch, T. Fritz, T.F. von Cramon, K. Muller, A.D. Friederici, Human Brain Mapping 27 (2006) 239–250.
[19] C.L. Krumhansl, An exploratory study of musical emotions and psychophysiology, Canadian Journal of Experimental Psychology 51 (4) (1997) 336–352.
[20] 40 (2008) 1921–1929.
[21] J.L. Nandrino, V. Dodin, P. Martin, M. Henniaux, Emotional information processing in first and recurrent major depressive episodes, Journal of Psychiatric Research 38 (2004) 475–484.
[22] V.D. Pascalis, O. Speranza, Personality effects on attentional shifts to emotionally charged cues: ERP, behavioural and HR data, Personality and Individual Differences 29 (2000) 217–238.
[23] N.K. Smith, J.T. Cacioppo, J.T. Larsen, T.L. Chartrand, May I have your attention, please: electrocortical responses to positive and negative stimuli, Neuropsychologia 41 (2003) 171–183.
[24] G. Stenberg, S. Wiking, M. Dahl, Judging words at face value: interference in word processing reveals automatic processing of affective facial expressions, Cognition and Emotion 12 (1998) 755–782.
[25] S. Thorpe, D. Fize, C. Marlot, Speed of processing in the human visual system, Nature 381 (1996) 520–522.
[26] W.G. Walter, R. Cooper, V.J. Aldridge, W.C. McCallum, A.L. Winter, Contingent negative variation: an electric sign of sensorimotor association and expectancy in the human brain, Nature 203 (1964) 380–384.
[27] J.J. Yuan, Q.L. Zhang, A.T. Chen, H. Li, Q.H. Wang, Z.Z.C. Zhuang, S.W. Jia, Are we sensitive to valence differences in emotionally negative stimuli? Electrophysiological evidence from an ERP study, Neuropsychologia 45 (2007) 2764–2771.
[28] World Medical Organization, Declaration of Helsinki (1964), British Medical Journal 313 (7070) (1996) 1448–1449.