Emotion in a 360-Degree vs. Traditional Format Through
EDA, EEG and Facial Expressions
Mª Concepción Castellanos, José Manuel Ausin, Jaime Guixeres, and Enrique
Bigné
1 Introduction
Digital video advertising is growing exponentially. Digital video ad spending in the US is expected to see double-digit growth annually through 2020 (eMarketer, 2016). Moreover, advertisers spend on average more than $10 million annually on digital video, an 85% increase over the last two years (iab, 2016). This huge increase is driven by advances in technology and by consumers' massive adoption of that technology (Krawford, 2011). Among the most prominent technological tools are new forms of virtual reality, specifically 360-degree video (Argyriou et al., 2016), one of the newest trends in online marketing in recent years (Gudacker, 2016).
Compared to traditional videos, in which the point of view is a focal one determined by the director, in a 360-degree format the viewer has a free and omnidirectional viewpoint. The viewer can decide at every moment from which point of view to see the video scenes, moving their viewpoint arbitrarily to any angle within a 360-degree radius. This change in the point of view means a new interactive experience with advertisements that was not achievable before. Consumers have the freedom to explore the content based on their interests, without being restricted by the creator's or director's choices, deciding "where and what" to look at (Su and Grauman, 2017).
The aim of this exploratory study is to evaluate, through a quantitative methodology, the effect of interactivity on emotion during the viewing of a 360-degree video ad compared to a traditional one. Surprisingly, advertising research on 360-degree ads is scarce. As far as we know, there have been some case studies regarding the engagement generated by such ads, but not the emotion. For example, Google ran an experiment to find out whether spherical video advertising drives more viewer engagement than standard video advertising (Habig, 2016). Hence, the question of interest here is whether a 360-degree video advertisement elicits greater arousal and more positive emotions than the same video ad presented in a traditional way.
2 Theoretical Background
2.1 Emotion and Advertising
Emotion refers to the coordination of cerebral, physiological, and behavioral changes that facilitate an external or internal response of significant relevance (Davidson, 2004). In the last 30 years, emotions have been shown to play an important role in consumers' responses: as markers, mediators and/or moderators (Bagozzi et al., 1999), as enhancers of brand attitude (Russell, 2002), as persuaders of consumption (Johar et al., 2006), or as predictors of brand purchase intent (Morris et al., 2002). Advertisers' creation of surprising, engaging and entertaining ads, in which emotional content is emphasized, leads consumers to remember the product and build positive associations with the brand (McDuff, 2017). Regarding engagement, previous research has found that emotions such as joy and surprise can be leveraged to engage consumers in watching Internet video advertisements (Teixeira et al., 2012).
2.2 Emotion and 360-Degree Video
The user’s experience with a 360-degree video has two important features.
First, it resembles navigation in both 3D virtual and real worlds (Smolic et al.,
2006). Second, it leads to a more immersive experience (Ramalho and Chambel,
2013). Both features seem to intensify the emotional response. For example, Visch et al. (2010) found that the stronger the immersion, the more intense the emotions participants felt while viewing a film.
2.3 Measuring Emotional Responses to Video Ads
Poels and Dewitte (2006) pointed out two major types of methods to measure emotions. The first, self-report methods, measure subjectively and consciously experienced feelings and can be biased by the desire to please (or not), comfort with the context, and other factors (Aaker et al., 1986). The second, autonomic psychophysiology methods, focus on continuous emotional reactions and changes in the central, autonomic and somatic nervous systems, and are affected by neither higher cognitive processes nor post-hoc reflection (McDuff, 2017).
Among the various psychophysiological techniques (Wang and Minor, 2008), we focus on electroencephalography (EEG) and facial coding (FC) as measures of valence (stimulus or situation pleasantness), and on electrodermal activity (EDA) as a measure of arousal (stimulus or situation intensity or activation level). Valence and arousal have been used as axes to frame emotion in past and current research (Lang, 1995; Kuppens et al., 2013).
EEG enables measurement of asymmetries in the electrical activity of the brain hemispheres in the frontal part of the brain (Pentus et al., 2014). It has been proposed that greater activity of the left hemisphere is associated with approach-related action planning (and therefore with positive emotions caused by stimuli), whereas greater activity of the right hemisphere is associated with withdrawal-related emotion (Davidson, 2004). Ohme et al. (2010) and Vecchiato et al. (2011) found a dominance of the left hemisphere related to the pleasantness of TV commercial advertisements. Frontal asymmetry is calculated as an index or ratio between right and left EEG activity; higher values mean a relative dominance of the left hemisphere.
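As a brief formal statement (the notation here is ours, but consistent with the computation described later in Section 3.4, where α denotes alpha-band power at the frontal sites F4 and F3):

\[ \mathrm{FAI} = \ln(\alpha_{F4}) - \ln(\alpha_{F3}) \]

Because alpha power is inversely related to cortical activity, higher values of this index reflect relatively greater left-hemisphere activity.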
FC is an observational method of capturing behavior on the face (McDuff, 2017), using an objective coding scheme of muscle movements or facial actions. Most objective coding uses the Facial Action Coding System (FACS) (Cohn et al., 2007), which describes the appearance of the face when muscle movements are present and the emotions related to these movements. Teixeira et al. (2012, 2014) found links between facial expressions and "zapping" behavior, and between facial responses and purchase intent, respectively.
EDA has been used in advertising research as a means to analyze the emotional state produced by advertising stimuli (Ohme et al., 2009; Peacock et al., 2011). It measures the electrical conductance of the skin according to the amount of moisture (sweat), which correlates positively with the intensity of an emotional activation caused by stimuli. It is a reliable, valid means to measure the level of excitement or arousal (Bolls, Lang and Potter, 2001). An increase in conductance can be interpreted as a physiological activation. Regarding interactivity, EDA was higher when participants had control over the onset of the stimuli (pictures) they had to evaluate (Wise and Reeves, 2007).
The goal of this research was twofold: 1) to compare the emotions elicited by two ad formats (360-degree vs. traditional video) through psychophysiological measurements; and 2) to contrast self-reported measurements with physiological ones for the 360-degree vs. traditional formats.
The research questions related to those goals were:
RQ1: Do self-reported measures differ between 360-degree and traditional
video ads?
RQ2: Will emotional valence (measured by means of FC, and frontal asym-
metry) be greater for a 360-degree video ad than for a traditional video ad?
RQ3: Will emotional arousal (measured by means of EDA) be greater for a
360-degree video ad than for a traditional video ad?
3 Method
3.1 Sample
The sample consisted of 79 participants (38 females and 41 males, 19-37 years old, M = 26.08 ± 4.15), recruited from the database of the i3B Institute of the Polytechnic University of Valencia (UPV). This database is composed of people who registered by e-mail or telephone and are willing to join studies as participants. They were compensated for their participation with a 15€ gift card. As explained in the Results section, some participants were excluded from the analyses.
3.2 Design, Stimuli and Apparatus
This experiment was part of a larger study on driving and advertising in which EEG, EDA, FC, eye tracking (ET) and electrocardiogram (ECG) were recorded. First, participants watched a TV program in which a road safety campaign (or a control ad) was included in the commercial breaks. Second, participants watched the BMW M2 advertisement and answered some related questions. Third, participants drove a selected route in a driving simulator. This research focuses on the second part, and only the EEG, EDA and FC measures are described. The experiment was planned as a one-way independent-samples design, with video format as the between-groups factor. Participants were randomly assigned to two groups, a 360-degree video ad group (360-degree Group) and a non 360-degree video ad group (non 360-degree Group). Both groups were shown a similar video ad for the BMW M2, presented in different formats. Participants in the 360-degree Group were exposed to a 360-degree video ad (https://www.youtube.com/watch?v=q87oVPusWT0), the most popular 360-degree ad on the YouTube Ads Leaderboard (YouTube, 2016). Participants in the non 360-degree Group were exposed to the video ad in a traditional format (https://www.youtube.com/watch?v=DVOfGi1gScE). In both videos, viewers were invited to keep their eyes on a famous model, who climbed into the passenger side of one of three blue cars. Then the three cars started moving and two other cars appeared. The five cars weaved in and out of each other's lanes while tearing down a runway. At the end, all five cars came to a stop in a neat row and the ad invited viewers to guess which car the model was in.
A marked difference between the ads was their duration: the traditional format lasted about 45 seconds and the 360-degree format about 80 seconds. Although the durations differed, the analyzed data were averages of the continuous data (see Section 3.4). As Luck (2010) noted, the mean is a measure that is not biased by the number of points included in its calculation.
The ad videos in both formats and the questionnaires were displayed on a 23" TFT screen (1920x1080) connected to a PC to which all the sensors were connected. This PC, running iMotions software (iMotions, 2016), presented the videos, collected the questionnaire responses, recorded all data and performed some on-line processing of the data (noted below).
For recording EEG and ECG, the wireless B-Alert X-10 system (www.advancedbrainmonitoring.com/xseries/; Johnson et al., 2011) was used. This system consisted of nine Ag/AgCl EEG channels (scalp positions F3, Fz, F4, C3, Cz, C4, P3, POz, and P4 according to the International 10-20 system; Klem et al., 1999) and two ECG lead sites, located under the right clavicle and on the lower left abdomen within the rib cage frame of the participant, respectively. Another two channels were placed on the left and right mastoids and used as EEG references. EEG and ECG data were acquired at 256 Hz. A wireless Affectiva Q sensor (www.affectiva.com), a bracelet with attached AgCl dry electrodes placed on the ventral side of the wrist of the non-dominant hand, recorded EDA at a sampling rate of 32 Hz. ET was recorded at a sampling rate of 300 Hz by a Tobii TX300 Eye Tracker (Tobii Technology AB, Danderyd, Sweden) attached to the bottom of the screen. For recording FC, a Logitech QuickCam Pro 9000 webcam (1600 x 1200 @ 30 fps) was placed on top of the screen.
3.3 Procedure
Experimental sessions were conducted individually in a dim lab room at the i3B Institute of the Polytechnic University of Valencia (UPV). Each participant was received, informed about the study, and asked to sign an informed consent form approved by the UPV's Ethics Committee. Participants then sat in front of the experimental screen, were fitted with the EEG, ECG and EDA recording devices, and the quality of the recordings was checked. While the sensors were being placed, participants filled in a pre-questionnaire about demographic characteristics. A nine-minute EEG baseline recording then began, followed by a nine-point ET calibration. When this calibration was excellent or good, the study and data acquisition began. The first and third experimental phases, watching a TV program with commercial breaks and driving a selected route in a driving simulator, are not analyzed in this work. After participants finished the first phase, ET was calibrated again. Then a baseline stimulus for FC (a grey screen lasting 6 seconds) was presented, and the BMW ad started. Participants' task was a kind of shell game: they had to keep their eyes on the car in which the model was while the cars moved in a fast-paced vehicle choreography, making the task virtually impossible. When the cars stopped, participants had to identify the model's car, submitting their guesses at a dedicated BMW microsite (http://www.eyesongigi.com, now discontinued).
The procedure for both groups was the same, except that participants in the 360-degree Group had to use the mouse to change the point of view of the scene. To this end, they had a short practice before watching the video.
After watching the video ad, both groups completed a survey regarding feelings towards the brand (one item, free answer), likeability of the brand (one item, five-point scale, "None" to "A lot"), attitude towards the brand (Marks and Olson, 1981; five bipolar items labeled "Attractive/Unattractive", "Good/Bad", "Likeable/Unlikeable", "Favorable/Unfavorable", "Friendly/Unfriendly", 2-point scale), willingness to buy the product if money were available (one item, four-point scale, "Definitely not" to "Definitely yes"), attitude toward the ad technology (one item: "I believe this advertisement makes use of the latest available technology", five-point scale, "Totally disagree" to "Totally agree") and attitude towards advertising (eight items: "Ads give me new ideas", "I know the newest and most competitive products through advertising", "I like advertising because it does not offend society", "Advertising appreciates creativity", "I do not approve of advertising because it does not provide a real view of the advertised product" (reverse coded), "Advertising influences my buying decision process", "I do not like advertising because it tends to be deceptive" (reverse coded), "My general opinion about advertising is positive"; five-point scale, "Totally disagree" to "Totally agree"), as well as questions about the most likeable element and the meaning of the ad (both free answers). Participants also completed a memory test including a brand recall question and questions about the number, color, and model of the cars as well as the name of the girl who appears in the ad (free answers). Participants' responses to these recall questions were labeled as correct or incorrect for later analyses. The memory test results are not included in this work.
3.4 Data Processing
Analyses of ET and ECG are not included in this work. The B-Alert system (incorporated in the iMotions software) processed the EEG on-line (Berka et al., 2007). The EEG signal was filtered with a band-pass filter (0.5–65 Hz) before the analog-to-digital conversion. To remove power network artifacts, notch filters at 50, 60, 100, and 120 Hz were applied. The B-Alert system features automatic signal decontamination of EEG, including measures for electromyography (EMG), electrooculography (EOG), spikes, saturations, and excursions. After that, the power spectral densities (PSD) of all frequency bands were computed for both videos on a second-by-second basis at each electrode site using a Fast Fourier transform with a 50% overlapping window. The frontal asymmetry index was calculated off-line using an in-house MATLAB script. First, alpha power (frequency band of 8-13 Hz) at the F3 and F4 sites was natural-log transformed for each 1-second epoch. Then a difference score, ln[F4] - ln[F3], was calculated to summarize the relative activity at homologous right and left sites (Allen et al., 2004; Smith et al., 2017). Finally, those difference scores were averaged together.
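A minimal sketch of that last step, assuming the per-second alpha power at F3 and F4 has already been extracted into NumPy arrays (the function and variable names are illustrative, not taken from the original MATLAB script, and the sketch is in Python rather than MATLAB):

```python
import numpy as np

def frontal_asymmetry_index(alpha_f3, alpha_f4):
    """Mean of ln(F4 alpha) - ln(F3 alpha) over 1-second epochs.

    Positive values indicate relatively more alpha power over the right
    frontal site, i.e. relatively greater left-hemisphere activity."""
    alpha_f3 = np.asarray(alpha_f3, dtype=float)
    alpha_f4 = np.asarray(alpha_f4, dtype=float)
    diff = np.log(alpha_f4) - np.log(alpha_f3)  # natural log, per epoch
    return float(diff.mean())                   # average across epochs
```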
EDA data were analyzed with LEDALAB (www.ledalab.de). Data were down-sampled offline to 10 Hz (Lajante et al., 2012) and then analyzed by the continuous decomposition analysis (CDA) method, which yields the skin conductance level (SCL) as a continuous measure of tonic EDA and the skin conductance response (SCR) as a continuous measure of phasic EDA (Lajante et al., 2012). To detect SCRs, a threshold criterion of 0.05 μS was used. Following Lajante et al.'s work, the SCR index was quantified as the integrated SCR (ISCR, units of μS x second), which represents the area under the curve of the phasic activity and integrates both the spatial and temporal dimensions of SCRs. Finally, data were normalized through the formula SC* = log(1 + |SC|), in order to reduce between-participant differences in the magnitude of the response (Lajante et al., 2012).
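As an illustration, the final normalization step could look like the following sketch (the ISCR values themselves are assumed to come from LEDALAB's CDA output, and the base-10 logarithm is assumed for the formula above):

```python
import numpy as np

def normalize_sc(sc_values):
    """Apply SC* = log(1 + |SC|) to ISCR (or other EDA) values.

    This reduces between-participant differences in response magnitude
    before the group comparison."""
    sc = np.asarray(sc_values, dtype=float)
    return np.log10(1.0 + np.abs(sc))  # base-10 log assumed
```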
FC data were automatically processed on-line by the automatic facial expression recognition software incorporated in iMotions. This software tracks, frame by frame, 20 action units, each corresponding to an individual facial muscle or muscle group, based on the Facial Action Coding System. Different combinations of action units constitute different target expressions, and each target expression is classified as a different emotional state. The software provides an Evidence measure, the logarithm (base 10) of the odds of a target expression being present. The software classifies seven basic emotions (joy, anger, surprise, fear, contempt, disgust, and sadness), two complex emotions (frustration, confusion) and valence (positive, negative and neutral). Only joy and surprise were analyzed in this work. An off-line analysis using an in-house MATLAB script calculated the percentage of time in which a joyful or surprised expression was present, defined as the evidence score being above an amplitude-based threshold of 0.5, corresponding to a 75% chance of being in that state. Before this calculation, a baseline correction was carried out by subtracting the median score during the baseline from the median of each target expression in both video ads.
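A rough sketch of this summary measure is given below, assuming frame-wise Evidence traces stored as NumPy arrays; the names and the exact placement of the baseline correction (applied frame-wise here) are illustrative rather than taken from the original MATLAB script:

```python
import numpy as np

def percent_time_expressed(evidence_ad, evidence_baseline, threshold=0.5):
    """Percentage of frames in which an expression is considered present.

    The Evidence trace is a base-10 log-odds score; a threshold of 0.5
    corresponds to odds of 10**0.5, i.e. roughly a 75% probability of the
    expression being present. The baseline median is subtracted first."""
    corrected = np.asarray(evidence_ad, dtype=float) - np.median(evidence_baseline)
    return 100.0 * float(np.mean(corrected > threshold))
```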
Some participants were excluded from the following analyses due to a high level of artifacts and noise in the recordings: 6 participants from the EEG data, 13 from the FC data and 43 from the EDA data.
4 Results
Statistical analyses were carried out with SPSS software
(IBM SPSS v22.0. Chicago, IL). Results are reported as the mean ± SD or as the
median when distributions were not normal.
All the questionnaire measures were analyzed except feelings towards the brand, the most likeable element and the meaning of the ad. Items regarding likeability of the brand, willingness to buy the product if money were available, and attitude toward the ad technology were analyzed with Mann-Whitney U tests for independent samples. Scores for the attitude towards the brand scale (Cronbach's alpha = 0.79 for the non 360-degree Group and 0.77 for the 360-degree Group) and the attitude towards advertising scale (Cronbach's alpha = 0.73 for the non 360-degree Group and 0.61 for the 360-degree Group) were computed as the mean of the responses to the items composing each scale. Their distributions were not normal according to the Kolmogorov-Smirnov test, so the Mann-Whitney U test for independent samples was used to compare the groups.
Regarding the continuous measures, both the EDA data and the frontal asymmetry index were normalized during data processing, as noted above. Hence, those measures were compared between groups with t-tests for independent samples. An exploratory analysis showed that the distribution of the FC data was not normal according to the Kolmogorov-Smirnov test, so it was analyzed with the Mann-Whitney non-parametric test for independent samples.
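For readers who want to reproduce this pipeline outside SPSS, the comparisons reduce to standard independent-samples tests; a minimal sketch with placeholder data (the original analyses were run in SPSS v22, so the variable names and data below are purely illustrative) could look like this:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Placeholder group data; in the actual study these would be the per-participant
# frontal asymmetry indices, ISCR values, or FC time percentages.
group_360 = rng.normal(loc=0.0, scale=1.0, size=38)
group_non360 = rng.normal(loc=0.0, scale=1.0, size=36)

# Independent-samples t-test for the normalized continuous measures (FAI, ISCR)
t_stat, p_t = stats.ttest_ind(group_360, group_non360)

# Mann-Whitney U test for the non-normal measures (questionnaire scores, FC data)
u_stat, p_u = stats.mannwhitneyu(group_360, group_non360, alternative="two-sided")

print(f"t = {t_stat:.3f} (p = {p_t:.3f}); U = {u_stat:.1f} (p = {p_u:.3f})")
```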
We found (see Figure 1) a significantly larger frontal asymmetry index for the 360-degree Group (0.13 ± 0.07) than for the non 360-degree Group (0.09 ± 0.07) (t(72) = -2.476, p = .016).
Figure 1. Mean of the frontal asymmetry index for the non 360-degree Group and the 360-degree Group. Error bars show SEM.
Regarding emotional expression (Figure 2), the percentage of time expressing joy was significantly higher in the 360-degree Group than in the non 360-degree Group (U = 355.00, z = -2.38, p < .05). There was no difference between the groups in the time expressing surprise (U = 499.50, z = -5.22, ns).
Figure 2. Box plots of the percentage of time expressing joy and surprise for the non 360-degree Group and the 360-degree Group. White lines depict the median of each group, boxes represent the interquartile range and whiskers represent the range of the data.
Regarding emotional activation or arousal, the ISCR did not differ between the 360-degree Group (0.26 ± 0.16) and the non 360-degree Group (0.21 ± 0.06) (t(24.738) = 0.827, ns).
5 Discussion and Implications
In digital advertising, it seems clear that advances in technology determine its expansion. In our study, we wanted to evaluate a new video advertisement format, specifically the 360-degree video. This format allows consumers to experience the content of the ad according to their interests, leading to a more immersive and emotional experience. Our study is one of the first in its field to analyze the attitudinal and emotional impact of a 360-degree video advertisement compared to a traditional one. The evaluation was carried out
through self-report measures and psychophysiological techniques. The self-reported measures showed differences only in the attitude towards the advertisement technology. The psychophysiological techniques showed differences between the groups in valence but not in arousal. The immersive and interactive features of a 360-degree advertisement enhanced participants' emotional responses of valence. The higher frontal asymmetry index and the longer time expressing joy while watching a 360-degree ad, compared to a traditional one, are in line with previous findings in advertising research.
Nevertheless, there were no differences between the two videos in the surprise expression or in the measure of arousal. Given the duration of the videos, the expression of surprise could be circumscribed to the beginning of the videos, and averaging the continuous measure could hide it. The similarity of the arousal levels could be explained by the kind of task participants had to perform in both video formats, because the shell game is virtually impossible to win.
The main managerial implications of our work are twofold. First, the 360-degree video ad seems to produce more positive emotion than the traditional one; therefore, it could be a good solution for digital advertising. Second, non-conscious, continuous measures such as EEG and FC can help both brands and digital advertisers evaluate consumers' emotional responses during video ad viewing.
6 Limitations and Future Research
Our research presents some limitations. First, the self-report measures did not include an emotional evaluation. It would be interesting to compare the results of subjective, conscious measures on the one hand and more objective, non-conscious measures on the other. Second, we did not measure the subjective experience of interaction, navigation and immersion for the two formats. We assumed that the 360-degree video ad is more interactive and immersive, but a measure of how participants evaluate these features seems necessary. Third, we presented the 360-degree ad on a PC, and the change of viewpoint had to be made with the mouse. Other devices, such as mobile devices and virtual reality headsets, could produce a more "real" experience in which the perspective change is made by head movement, for example, and the emotional effect could be larger.
7 Acknowledgements
This work was supported by the Ministry of Economy and Competitiveness (Spain) under Grant ECO2014-53837R.
8 References
Aaker, D. A.; Stayman, D. M. and Hagerty, M. R. (1986), “Warmth in advertis-
ing: Measurement, impact, and sequence effects,” in Journal of Consumer
Research, Vol. 12(4), 365-381.
Allen, J.J.; Coan, J.A., and Nazarian, M., (2004), “Issues and assumptions on the
road from raw signals to metrics of frontal EEG asymmetry in emotion,” in:
Biological Psychology, Vol. 67, 183–218.
Argyriou, L.; Economou, D., Bouki, V., and Doumanis, I. (2016), “Engaging
immersive video consumers: Challenges regarding 360-degree gamified video
applications,” In: Ubiquitous Computing and Communications and 2016 In-
ternational Symposium on Cyberspace and Security (IUCC-CSS), Interna-
tional Conference on (pp. 145-152). IEEE.
Bagozzi, R. P.; Wong, N., and Yi, Y. (1999), “The role of culture and gender in
the relationship between positive and negative affect,” in: Cognition and
Emotion, Vol. 13(6), 641-672.
Berka, C.; Levendowski, D. J., Lumicao, M. N., Yau, A., Davis, G., Zivkovic,
V. T., ... and Craven, P. L. (2007), “EEG correlates of task engagement and
mental workload in vigilance, learning, and memory tasks,” in: Aviation,
space, and environmental medicine, Vol. 78(5), B231-B244.
Bolls, P. D.; Lang, A., and Potter, R. F. (2001), “The effects of message valence
and listener arousal on attention, memory, and facial muscular responses to
radio advertisements,” in: Communication Research, Vol. 28(5), 627-651.
Chambel, T.; Chhaganlal, M. N., and Neng, L. A. (2011), “Towards immersive
interactive video through 360 hypervideo,” In: Proceedings of the 8th Inter-
national Conference on Advances in Computer Entertainment Technology,
(p. 78), ACM.
Cohn, J. F., Ambadar, Z., & Ekman, P. (2007). “Observer-based measurement of
facial expression with the Facial Action Coding System” in: The handbook of
emotion elicitation and assessment, 203-221.
Davidson, R. J. (2004), “What Does the Prefrontal Cortex ‘Do' in Affect: Per-
spectives on Frontal EEG Asymmetry Research," Biological Psychology, 67,
219-33.
eMarketer (2016), “Digital Video Advertising Continues to Expand,” accessed January 10. Retrieved from https://www.emarketer.com/Article/Digital-Video-Advertising-Continues-Expand/1013722
Gudacker, J. (2016), “360 Degree Videos - An Online Marketing Revolution? A
Critical Review and Outlook for Social Media Marketers,” accessed January
10. Retrieved from http://www.brandba.se/blog/2016/8/23/360-degree-
videos-an-online-marketing-revolution-a-critical-review-and-outlook-for-
social-media-marketers
Habig, J. (2016), “Is 360 video advertising worth it?” Think with Google. Retrieved from https://www.thinkwithgoogle.com/advertising-channels/video/360-video-advertising/
iab (2016), “2016 IAB Video Ad Spend Study ,“ accessed January 10. Retrieved
from https://www.iab.com/wp-content/uploads/2016/04/2016-IAB-Video-Ad-
Spend-Study.pdf
iMotions Biometric Research Platform 6.0, iMotions A/S, Copenhagen, Den-
mark, 2016.
Johar, G. V., Maheswaran, D., and Peracchio, L. A. (2006), “Mapping the fron-
tiers: Theoretical advances in consumer research on memory, affect, and per-
suasion,” in: Journal of Consumer Research, Vol. 33(1), 139-149.
Johnson, R. R.; Popovic, D., Olmstead, R. E., Stikic, M., Levendowski, D. J.,
and Berka, C. (2011), “Drowsiness determination through EEG: develop-
ment and validation,” in: Biological Psychology , Vol. 87(2), 241-250.
Klem, G. H.; Luders, H. O., Jasperf, H. H., and Elger, C. (1999), Section 1.
“EEG (Technical Standards and Glossary)-1.1. The ten-twenty electrode sys-
tem of the International Federation,” in: Electroencephalography and Clini-
cal Neurophysiology, Supplements only, (52), 3-6.
Krawford, K. (2011) “Digital Technologies Effect on Humans and Societies,”
accessed January 10, 2017. Retrieved from http://cyberpsych.tumblr.com
/post/13004936142/increased-use-of-digital-devices
Kuppens, P.; Tuerlinckx, F., Russell, J. A., and Barrett, L. F. (2013), “The rela-
tion between valence and arousal in subjective experience,” in: Psychological
Bulletin, Vol. 139(4), 917.
Lajante, M.; Droulers, O., Dondaine, T., and Amarantini, D. (2012), “Opening
the “black box” of electrodermal activity in consumer neuroscience research,”
in: Journal of Neuroscience, Psychology, and Economics, Vol. 5(4), 238.
Lang, P. J. (1995), “The emotion probe: Studies of motivation and attention,” in
American Psychologist, Vol. 50(5), 372
Luck, S.J. (2010), “Is it legitimate to compare conditions with different numbers
of trials?” accesed May 10, 2016 from http://erpinfo.org/Members/
sjluck/Mean_Peak_Noise.pdf/view
Marks, L. J., and Olson, J. C. (1981), “Toward a cognitive structure conceptual-
ization of product familiarity,” in: “NA - Advances in Consumer Research
Volume 08, eds. Kent B. Monroe, Ann Abor, MI : Association for Consum-
er”, Vol. 8, 145-150.
McDuff, D. (2017), “New Methods for Measuring Advertising Efficacy”. Digi-
tal Advertising: Theory and Research.
Morris, J. D.; Woo, C., Geason, J. A., and Kim, J. (2002), “The power of affect:
Predicting intention,” in: Journal of Advertising Research, Vol. 42(3), 7-17
Ohme, R.; Reykowska, D., Wiener, D., and Choromanska, A. (2009), “Analysis
of neurophysiological reactions to advertising stimuli by means of EEG and
galvanic skin response measures,” in: Journal of Neuroscience, Psychology,
and Economics, Vol. 2(1), 21.
Ohme, R.; Reykowska, D., Wiener, D., and Choromanska, A. (2010), “Applica-
tion of frontal EEG asymmetry to advertising research,” in: Journal of Eco-
nomic Psychology, Vol. 31(5), 785-793.
Peacock, J.; Purvis, S., and Hazlett, R. L. (2011), “Which broadcast medium
better drives engagement? ,” in: Journal of Advertising Research, Vol. 51(4),
578-585.
Pentus, K.; Mehine, T., and Kuusik, A. (2014), “Considering Emotions in Prod-
uct Package Design through Combining Conjoint Analysis with Psycho Phys-
iological Measurements,” Procedia-Social and Behavioral Sciences, Vol.
148, 280-290.
Poels, K., Dewitte, S. (2006), “How to Capture the Heart? Reviewing 20 Years
of Emotion Measurement in Advertising,” in: Journal of Advertising Re-
search, Vol. 46(1), 18–37.
Ramalho, J., and Chambel, T. (2013, October), “Immersive 360 mobile video
with an emotional perspective,” In: Proceedings of the 2013 ACM interna-
tional workshop on Immersive media experiences (pp. 35-40), ACM.
Russell, C. A. (2002), “Investigating the effectiveness of product placements in
television shows: The role of modality and plot connection congruence on
brand memory and attitude,” in: Journal of consumer research, Vol. 29(3),
306-318.
Smith, E. E.; Reznik, S. J., Stewart, J. L., and Allen, J. J. (2017), “Assessing and
conceptualizing frontal EEG asymmetry: An updated primer on recording,
processing, analyzing, and interpreting frontal alpha asymmetry,” in: Interna-
tional Journal of Psychophysiology, Vol. 111, 98-114.
Smolic, A.; Mueller, K., Merkle, P., Fehn, C., Kauff, P., Eisert, P., and Wie-
gand, T. (2006), “3D video and free viewpoint video-technologies, applica-
tions and MPEG standards,” in: Multimedia and Expo, 2006 IEEE Interna-
tional Conference on (pp. 2161-2164), IEEE.
Su, Y. C., and Grauman, K. (2017), “Making 360-degree Video Watchable in
2D: Learning Videography for Click Free Viewing”. arXiv preprint
arXiv:1703.00495.
Teixeira, T., Picard, R., and El Kaliouby, R. (2014), “Why, when, and how
much to entertain consumers in advertisements? A web-based facial tracking
field study,” in: Marketing Science, Vol. 33(6), 809-827.
Teixeira, T., Wedel, M., and Pieters, R. (2012), “Emotion-induced engagement
in internet video advertisements,” in: Journal of Marketing Research, Vol.
49(2), 144-159.
Vecchiato, G.; Toppi, J., Astolfi, L., Fallani, F. D. V., Cincotti, F., Mattia, D.,
and Babiloni, F. (2011), “Spectral EEG frontal asymmetries correlate with the
experienced pleasantness of TV commercial advertisements,” in: Medical
and biological engineering and computing, Vol. 49(5), 579-583.
Visch, V. T.; Tan, E. S., and Molenaar, D. (2010), “The emotional and cogni-
tive effect of immersion in film viewing,” in: Cognition and Emotion, Vol.
24(8), 1439-1445.
Wang, Y. J.; and Minor, M. S. (2008), “Validity, reliability, and applicability of
psychophysiological techniques in marketing research,” in: Psychology and
Marketing, Vol. 25(2), 197-232.
Wise, K., and Reeves, B. (2007), “The effect of user control on the cognitive
and emotional processing of pictures,” in: Media Psychology, Vol. 9(3), 549-
566.
YouTube 2016, “Most popular 360 Ads on YouTube - Cannes 2016” Accessed
January 10. Retrieved from https://www.thinkwithgoogle.com/intl/en-
gb/platforms/video/leaderboards/360-ads-youtube-cannes-2016.html