Auditory temporal edge detection in human auditory cortex

Equipe Audition, Laboratoire de Psychologie de la Perception, CNRS (UMR 8158) Université Paris Descartes and Ecole Normale Supérieure, France.
Brain Research 07/2008; 1213:78-90. DOI: 10.1016/j.brainres.2008.03.050
Source: PubMed

ABSTRACT Auditory objects are detected if they differ acoustically from the ongoing background. In simple cases, the appearance or disappearance of an object involves a transition in the power, or frequency content, of the ongoing sound. More realistically, however, both the background and the object possess substantially non-stationary statistics, and the task is then to detect a transition in the pattern of ongoing statistics. How does the system detect and process such transitions? We use magnetoencephalography (MEG) to measure early auditory cortical responses to transitions between constant tones, regularly alternating, and randomly alternating tone-pip sequences. Such transitions embody key characteristics of natural auditory temporal edges. Our data demonstrate that the temporal dynamics and response polarity of the neural temporal-edge-detection processes depend in specific ways on the generalized nature of the edge (the context preceding and following the transition). They further suggest that distinct neural substrates in core and non-core auditory cortex are recruited depending on the kind of computation required to extract the edge from the ongoing fluctuating input entering a listener's ears: the discovery of a violation of regularity versus the detection of a new regularity.
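The stimulus classes described in the abstract (randomly alternating tone-pip sequences that switch to a regularly repeating pattern) can be sketched in a few lines. This is a minimal illustration, not the study's actual stimulus code: the pip duration, sample rate, frequency pool, and cycle are invented placeholder values.

```python
import math
import random

def tone_pip(freq_hz, dur_s=0.03, sr=8000):
    """One tone pip as a list of samples (no onset/offset ramps, for brevity)."""
    n = int(dur_s * sr)
    return [math.sin(2 * math.pi * freq_hz * i / sr) for i in range(n)]

def rand_to_reg_sequence(pool, cycle, n_rand=20, n_cycles=5, seed=0):
    """Frequency schedule: random draws from `pool`, then repeats of `cycle`.

    The point where the random draws end is the nominal temporal edge."""
    rng = random.Random(seed)
    rand_part = [rng.choice(pool) for _ in range(n_rand)]
    reg_part = cycle * n_cycles
    return rand_part + reg_part

# Illustrative parameters (not those used in the study):
pool = [440, 494, 554, 622, 698]   # frequency pool for the random section (Hz)
cycle = [440, 554, 698]            # repeating cycle for the regular section
freqs = rand_to_reg_sequence(pool, cycle)
signal = [s for f in freqs for s in tone_pip(f)]
```

Reversing the two halves of the schedule gives the opposite transition (regular to random); the abstract's key point is that the two directions recruit different cortical computations.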

    • "Such differences in spatial distribution, which are probably explained by the fast and regular presentation rate, could reflect the distinct neuronal generators involved in the processing of local and global deviations [Lütkenhöner, 2003]. The absence of earlier activity preceding the global MMNm is in line with previous findings showing that sound transitions from a regular sequence to a constant pure-tone elicited a peak at 160 ms after transition but no earlier activity in the P50 time range, suggesting that early mechanisms of deviance detection may be limited by the kinds of regularity they can compute [Chait et al., 2008]. Based on the lack of global deviance-related effects in the time range of the MLR, we suggest that very early deviance detection mechanisms work, at least for frequency [Leung et al., 2012], at the feature level [Alho et al., 2012; Althen et al., 2011; Grimm et al., 2012; Leung et al., 2013], and reflect an early stage prior to feature combination, sequential grouping, or the extraction of regularities based on the interrelationship between sounds."
    ABSTRACT: Our auditory system is able to encode acoustic regularity of growing levels of complexity to model and predict incoming events. Recent evidence suggests that early indices of deviance detection in the time range of the middle-latency responses (MLR) precede the mismatch negativity (MMN), a well-established error response associated with deviance detection. While studies suggest that only the MMN, but not early deviance-related MLR, underlies complex regularity levels, it is not clear whether these two mechanisms interact during scene analysis by encoding nested levels of acoustic regularity, and whether the neuronal sources underlying local and global deviations are hierarchically organized. We registered magnetoencephalographic evoked fields to rapidly presented four-tone local sequences containing a frequency change. Temporally integrated local events, in turn, defined global regularities, which were infrequently violated by a tone repetition. A global magnetic mismatch negativity (MMNm) was obtained at 140-220 ms when breaking the global regularity, but no deviance-related effects were shown at early latencies. Conversely, Nbm (45-55 ms) and Pbm (60-75 ms) deflections of the MLR, and an earlier MMNm response at 120-160 ms, responded to local violations. Distinct neuronal generators in the auditory cortex underlay the processing of local and global regularity violations, suggesting that nested levels of complexity of auditory object representations are represented in separate cortical areas. Our results suggest that the different processing stages and anatomical areas involved in the encoding of auditory representations, and the subsequent detection of their violations, are hierarchically organized in the human auditory cortex.
    Human Brain Mapping 11/2014; 35(11). DOI:10.1002/hbm.22582
    • "Indeed MEG brain imaging experiments with such stimuli (Chait et al., 2008) demonstrate that the auditory cortex detects the emergence of regular patterns automatically, and rapidly, even in the absence of directed attention (when listeners are actively engaged in an unrelated task). The point of detection, measured as the first brain response to the transition, occurs roughly a cycle and a half after the nominal transition time (a similar estimate is also obtained by measuring behavioral detection time; see below). "
    ABSTRACT: We investigated how listeners perceive the temporal relationship of a light flash and a complex acoustic signal. The stimulus mimics ubiquitous events in busy scenes that are manifested as a change in the pattern of on-going fluctuation. Detecting pattern emergence inherently requires integration over time, so such events are detected later than they actually occur. How does delayed detection time affect the perception of such events relative to other events in the scene? To model these situations, we use rapid sequences of tone pips with a time-frequency pattern that changes from random to regular ("RAND-REG") or vice versa ("REG-RAND"). REG-RAND transitions are detected rapidly, but RAND-REG transitions take longer to detect (∼880 ms post nominal transition). Using a Temporal Order Judgment task, we instructed subjects to indicate whether the flash appeared before or after the acoustic transition. The point of subjective simultaneity between the flash and RAND-REG does not occur at the point of detection (∼880 ms post nominal transition) but ∼470 ms closer to the nominal acoustic transition. In a second experiment we halved the tone-pip duration. The resulting pattern of performance was qualitatively similar to that in Experiment 1, but scaled by half. Our results indicate that the brain possesses mechanisms that survey the proximal history of an on-going stimulus and automatically adjust perception so as to compensate for prolonged detection time, thus producing more accurate representations of scene dynamics. However, this readjustment is not complete.
    Frontiers in Psychology 10/2012; 3:396. DOI:10.3389/fpsyg.2012.00396
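The arithmetic implied by the Temporal Order Judgment abstract above can be made explicit. Using only the numbers reported there (∼880 ms detection latency for RAND-REG, a point of subjective simultaneity ∼470 ms earlier, and a halving of all values when pip duration is halved), a sketch of the relationship is:

```python
def pss_from_detection(detection_ms, compensation_ms):
    """Point of subjective simultaneity relative to the nominal transition:
    detection latency minus the brain's backward compensation."""
    return detection_ms - compensation_ms

# Experiment 1 values reported for RAND-REG transitions:
pss_exp1 = pss_from_detection(880, 470)

# Experiment 2 halved the tone-pip duration; the abstract reports the
# performance pattern scaled by half:
pss_exp2 = pss_from_detection(880 / 2, 470 / 2)
```

So perceived simultaneity lands roughly 410 ms after the nominal transition in Experiment 1: compensation pulls perception back toward the true edge, but, as the authors note, not all the way.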
    • "For example, the transition from noise to a regular interval sound with pitch has a different cortical representation than the reverse transition (Krumbholz et al., 2003). Recently, Chait and colleagues (2007; 2008) demonstrated distinct cortical mechanisms for the detection of auditory 'edges' based on statistical properties, where the detection of a statistical regularity (in violation of a previous irregularity) had a different cortical signature than the detection of a violation of statistical regularity. The current results support the existence of such neural and perceptual asymmetries. "
    ABSTRACT: Auditory object analysis requires two fundamental perceptual processes: the definition of the boundaries between objects, and the abstraction and maintenance of an object's characteristic features. Although it is intuitive to assume that the detection of the discontinuities at an object's boundaries precedes the subsequent precise representation of the object, the specific underlying cortical mechanisms for segregating and representing auditory objects within the auditory scene are unknown. We investigated the cortical bases of these two processes for one type of auditory object, an "acoustic texture," composed of multiple frequency-modulated ramps. In these stimuli, we independently manipulated the statistical rules governing (1) the frequency-time space within individual textures (comprising ramps with a given spectrotemporal coherence) and (2) the boundaries between textures (adjacent textures with different spectrotemporal coherences). Using functional magnetic resonance imaging, we show mechanisms defining boundaries between textures with different coherences in primary and association auditory cortices, whereas texture coherence is represented only in association cortex. Furthermore, participants' superior detection of boundaries across which texture coherence increased (as opposed to decreased) was reflected in a greater neural response in auditory association cortex at these boundaries. The results suggest a hierarchical mechanism for processing acoustic textures that is relevant to auditory object analysis: boundaries between objects are first detected as a change in statistical rules over frequency-time space, before a representation that corresponds to the characteristics of the perceived object is formed.
    The Journal of Neuroscience : The Official Journal of the Society for Neuroscience 02/2010; 30(6):2070-6. DOI:10.1523/JNEUROSCI.5378-09.2010