Project

Neurocinematics - an investigation into EEG, blinks and cuts in film

Goal: This project aims to examine the relationships and differences in EEG signals between cuts in film and the moments where the viewer blinks. It has been theorized that when we blink during film viewing, we cut the sequence of images and store what we have just seen. We hypothesize that the same behavior occurs when there is a physical cut in a film.
This project is part of Javier Sanz's PhD dissertation on semantics in film.


Project log

Andreas Wulff-Abramsson
added 2 research items
Unity is software specifically designed for the development of video games. However, due to its programming possibilities and the polyvalence of its architecture, it can prove to be a versatile tool for stimulus presentation in research experiments. Nevertheless, it also has some limitations and conditions that need to be taken into account to ensure optimal performance in particular experimental situations. Such is the case if we want to use it in an experimental design that includes the acquisition of biometric signals synchronized with the broadcasting of video and audio in real time. In the present paper, we analyse how Unity (version 5.5.1f1) behaves in one such experimental design that requires the execution of audio-visual material. From the analysis of an experimental procedure in which the video was played back following the standard software specifications, we detected the following problems: desynchronization between the emission of the video and the audio; desynchronization between the time counter and the video; a delay in the execution of the screenshot; and, depending on the encoding of the video, poor fluency in video playback, in which, even though the total playback time is maintained, Unity freezes frames and then compensates with small temporal jumps in the video. Finally, having detected these problems, we designed a compensation and verification process that allows audio-visual material to be handled accurately in Unity (version 5.5.1f1). We present a protocol of checks and compensations that solves these problems to ensure the execution of robust, reliable experiments.
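The kind of verification step described in the abstract can be illustrated with a minimal sketch: given a log of frame-presentation timestamps, compare each against the nominal schedule implied by the frame rate and flag frames whose drift exceeds a tolerance. The function names, log format, and the 40 ms threshold are illustrative assumptions, not taken from the paper.

```python
# Hypothetical drift check: compare logged frame-presentation times
# against the nominal schedule and flag desynchronized frames.
# Names, log format, and threshold are illustrative only.

def frame_drift(timestamps, fps):
    """Per-frame drift (seconds) between logged presentation times
    and the nominal schedule implied by the frame rate."""
    period = 1.0 / fps
    t0 = timestamps[0]
    return [t - (t0 + i * period) for i, t in enumerate(timestamps)]

def flag_desync(timestamps, fps, tolerance=0.040):
    """Indices of frames whose drift exceeds the tolerance
    (here 40 ms, i.e. one frame period at 25 fps)."""
    return [i for i, d in enumerate(frame_drift(timestamps, fps))
            if abs(d) > tolerance]

# Example: a 25 fps log where frames lag by 50 ms from frame 3 onward.
log = [0.00, 0.04, 0.08, 0.17, 0.21, 0.25]
print(flag_desync(log, fps=25))  # → [3, 4, 5]
```

In practice such a check would run over timestamps emitted by the playback engine alongside the biometric recording, so that trials with excessive drift can be compensated or discarded.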
Andreas Wulff-Abramsson
added an update
Javier has recently been on a second visit to Aalborg University Copenhagen.
During this visit he developed a novel toolkit to more rigorously analyze ERD and ERS patterns across and within different types of stimuli.
This method revealed clear, significant ERD and ERS patterns across cuts in film, enabling us to start writing the first results paper based on the experiment we ran during spring 2017.
Furthermore, a technical paper on how to create a reliable synchronized system using Unity, which deals with psychophysiological, audio and visual signals, was devised. This manuscript is currently being proofread and should be ready to publish before the end of summer 2018.
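The ERD/ERS analysis mentioned above can be sketched in its standard form (this is not the authors' toolkit, only an illustration of the conventional measure): the percent change in band power within an event window relative to a pre-event baseline, where negative values indicate desynchronization (ERD) and positive values synchronization (ERS).

```python
# Illustrative sketch of the conventional ERD/ERS measure: percent
# change in band power relative to a pre-event baseline. Negative
# values = ERD, positive values = ERS. Names are hypothetical.

def band_power_change(power, baseline, event):
    """power: band-power samples over time (e.g. a trial average).
    baseline, event: (start, stop) index ranges into `power`.
    Returns percent change of mean event power vs. mean baseline power."""
    mean = lambda xs: sum(xs) / len(xs)
    p_base = mean(power[baseline[0]:baseline[1]])
    p_event = mean(power[event[0]:event[1]])
    return (p_event - p_base) / p_base * 100.0

# Example: baseline power 10, post-cut power 6 -> a 40% drop, i.e. ERD.
power = [10.0, 10.0, 10.0, 6.0, 6.0, 6.0]
print(band_power_change(power, baseline=(0, 3), event=(3, 6)))  # → -40.0
```

Applied across trials aligned to film cuts, such a measure lets ERD/ERS patterns be compared across and within stimulus types, as described above.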
 
Andreas Wulff-Abramsson
added an update
After Andreas Wulff-Jensen's visit to Barcelona University, he and Javier Sanz laid out an EEG data analysis procedure.
1. Analyze the EEG data from the film experiment with the aim of replicating previous neurocinematic studies. This gave mixed results: our data support 1.5 out of 3 studies that used simpler but similar versions of the stimuli used in our study. The EEG waves for zooming in and out matched those of a previous study. Furthermore, one of the movies used contained a breach of the 180-degree rule, and by studying the signals around it we could replicate results similar to a previous study that examined a similar sensation.
2. During the visit we also found common EEG ERD/ERS components between movies when all cut types are averaged together.
Next up is to analyse the power bands, find cross-movie components in the different kinds of cuts, and find EEG components common across syntactical changes.
We also plan to look for similar components between syntactical changes and physical cuts.
 
Andreas Wulff-Abramsson
added an update
After J. Sanz's visit to Aalborg University Copenhagen during spring 2017, we successfully ran an EEG experiment.
In it, each subject was presented with 8 film sequences: 4 with a physical cut at every syntactic change, and 4 consisting of continuous shots, i.e. shots with many syntactical changes but without any physical cuts.
While the subjects viewed the films, their EEG, eye tracking, EOG, and face were recorded.
For now, the EEG data are still unprocessed; further analysis will follow.
 
Andreas Wulff-Abramsson
added a project goal
This project aims to examine the relationships and differences in EEG signals between cuts in film and the moments where the viewer blinks. It has been theorized that when we blink during film viewing, we cut the sequence of images and store what we have just seen. We hypothesize that the same behavior occurs when there is a physical cut in a film.
This project is part of Javier Sanz's PhD dissertation on semantics in film.