Evidence for an Event-Integration Window: A Cognitive Temporal Window
Supports Flexible Integration of Multimodal Events
Madison Lee and Daniel T. Levin
Department of Psychology and Human Development, Vanderbilt University
Just as the perception of simple events such as clapping hands requires a linkage of sound with movements that produce the sound, the integration of more complex events such as describing how to give an injection requires a linkage between the instructor's utterances and their actions. However, the mechanism for integrating these complex multimodal events is unclear. For example, it is possible that predictive temporal relationships are important for multimodal event understanding, but it is also possible that this form of understanding arises more from meaningful causal between-event links that are temporally unspecified. This latter approach might be supported by a cognitive temporal window within which multimodal event information integrates flexibly with few default commitments about specific temporal relationships. To test this hypothesis, we assessed the consequences of disrupting temporal relationships between instructors' actions and their speech in both narrated screen-capture instructional videos (Experiment 1) and live-action instructional videos (Experiment 2) by displacing the audio channel forward or backward relative to the video by 0, 1, 3, or 7 s. We assessed learning, event segmentation, disruption awareness, segmentation uncertainty, and perceived workload. Across two experiments, 7-s temporal disruptions consistently increased uncertainty and workload, and decreased learning in Experiment 2. None of these effects appeared for 3-s disruptions, which were barely detectable. One-second disruptions produced no effects and were undetectable, even though much intraevent information falls within this range. Our results suggest the presence of an event-integration window that supports the integration of events independent of constraining temporal relationships between subevents.
Public Significance Statement
To perceive complex events, we must be able to integrate visual information such as people's actions or gestures with corresponding auditory information such as speech. Although these two forms of information are mutually supportive, it is not clear whether the precise temporal relationship between these streams is perceptually and cognitively important. These experiments demonstrate that, within a several-second window, the temporal relationship between these modalities can be disrupted without interfering with effective event perception and understanding. We demonstrate that cognitive integration of multimodal events is temporally flexible, and this may support forms of event understanding that are robust over small variations in event synchronization and temporal attention.
Keywords: event perception, learning, multimodal integration, psychological present
Effective perception and understanding of real-world events often require the integration of auditory and visual information. For simple events, such as clapping hands, visual movements must be tightly linked with the sounds that emanate from them. This form of integration is often referred to as multisensory integration. Research in the field of neuroscience has established that this form of integration is associated with a multisensory temporal binding window of approximately ±250 ms within which multisensory event information (e.g., a beep and a flash) can be asynchronous yet perceptually bound and perceived as occurring simultaneously (Wallace & Stevenson, 2014). Such a window is necessary in part because the relationship between auditory and visual features of multisensory events is incompletely determined by simple timing features. For example, propagation delays both externally (because of differences in the speed of
Madison Lee https://orcid.org/0000-0001-6395-0976
Results from Experiment 1 were presented as a poster at the Psychonomic
Society’s Annual Conference in 2021. The authors would like to thank Eric
Hall at Vanderbilt’s School of Nursing for contributing to the creation of our
live-action instructional videos. These studies were not preregistered. The
data and materials are publicly available (https://osf.io/x2cm3/).
Madison Lee served as lead for data curation, formal analysis, project administration, software, visualization, and writing–original draft, contributed equally
to investigation, and served in a supporting role for resources. Daniel T. Levin
served as lead for resources and supervision and served in a supporting role for
formal analysis and writing–original draft. Madison Lee and Daniel T. Levin
contributed equally to conceptualization, writing–review and editing, and
methodology.
Correspondence concerning this article should be addressed to Madison
Lee, Department of Psychology and Human Development, Vanderbilt
University, 230 Appleton Place, Nashville, TN 37203-5721, United States.
Email: madison.j.lee@vanderbilt.edu
Journal of Experimental Psychology: General, 2024, Vol. 153, No. 6, 1449–1463
© 2024 American Psychological Association. ISSN: 0096-3445. https://doi.org/10.1037/xge0001577
This article was published Online First April 4, 2024.