Article

Differential neural coding of acoustic flutter within primate auditory cortex

Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland, United States
Nature Neuroscience (Impact Factor: 14.98). 07/2007; 10(6):763-71. DOI: 10.1038/nn1888
Source: PubMed

ABSTRACT A sequence of acoustic events is perceived either as one continuous sound or as a stream of temporally discrete sounds (acoustic flutter), depending on the rate at which the acoustic events repeat. Acoustic flutter is perceived at repetition rates near or below the lower limit for perceiving pitch, and is akin to the discrete percepts of visual flicker and tactile flutter caused by the slow repetition of sensory stimulation. It has been shown that slowly repeating acoustic events are represented explicitly by stimulus-synchronized neuronal firing patterns in primary auditory cortex (AI). Here we show that a second neural code for acoustic flutter exists in the auditory cortex of marmoset monkeys (Callithrix jacchus), in which the firing rate of a neuron is a monotonic function of an acoustic event's repetition rate. Whereas many neurons in AI encode acoustic flutter using a dual temporal/rate representation, we find that neurons in cortical fields rostral to AI predominantly use a monotonic rate code and lack stimulus-synchronized discharges. These findings indicate that the neural representation of acoustic flutter is transformed along the caudal-to-rostral axis of auditory cortex.
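The abstract above contrasts two representations of an event's repetition rate: stimulus-synchronized firing (a temporal code) and a discharge rate that varies monotonically with repetition rate (a rate code). As a rough illustration of how these two representations are commonly quantified from recorded spike times, the Python sketch below computes vector strength (a standard measure of phase locking to the stimulus period) alongside mean firing rate. The spike trains, parameter values, and function names are invented for illustration and are not taken from the paper.

    import numpy as np

    # Illustrative analysis only; not code or data from the paper.

    def vector_strength(spike_times, period):
        # Phase locking of spikes to a repeating event with the given period:
        # 1.0 = perfectly synchronized, values near 0 = no stimulus-locked timing.
        if len(spike_times) == 0:
            return 0.0
        phases = 2 * np.pi * (np.asarray(spike_times) % period) / period
        return np.abs(np.mean(np.exp(1j * phases)))

    def firing_rate(spike_times, duration):
        # Mean discharge rate (spikes per second) over the stimulus duration.
        return len(spike_times) / duration

    # Hypothetical spike trains for a 500 ms stimulus repeating at 10 Hz (period = 0.1 s).
    duration, period = 0.5, 0.1
    synchronized = np.arange(0.0, duration, period) + 0.005   # one spike locked to each event
    nonsynchronized = np.sort(np.random.default_rng(0).uniform(0.0, duration, 25))

    # A temporal code shows high vector strength; a pure rate code does not, but its
    # firing rate can still change monotonically as the repetition rate changes.
    print(vector_strength(synchronized, period), firing_rate(synchronized, duration))
    print(vector_strength(nonsynchronized, period), firing_rate(nonsynchronized, duration))

Under this kind of analysis, a neuron with a dual temporal/rate representation would show both high vector strength and a repetition-rate-dependent firing rate, whereas the non-synchronized neurons described rostral to AI would show only the latter.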

Full text available from: Daniel Bendor, Sep 03, 2014
Related publications:

  • ABSTRACT: This paper presents iSpike: a C++ library that interfaces between spiking neural network simulators and the iCub humanoid robot. It uses a biologically inspired approach to convert the robot's sensory information into spikes that are passed to the neural network simulator, and it decodes output spikes from the network into motor signals that are sent to control the robot. Applications of iSpike range from embodied models of the brain to the development of intelligent robots using biologically inspired spiking neural networks. iSpike is an open-source library available for free download under the terms of the GPL. (A generic sensor-to-spike encoding and decoding sketch, illustrative only and unrelated to iSpike's actual API, follows this list.)
    Bioinspiration & Biomimetics 06/2012; 7(2):025008. DOI:10.1088/1748-3182/7/2/025008 · 2.53 Impact Factor
  • ABSTRACT: One of the major challenges in human brain science is the functional hemispheric asymmetry of auditory processing. Behavioral and neurophysiological studies have demonstrated that speech processing is dominantly handled in the left hemisphere, whereas music processing dominantly occurs in the right. Using magnetoencephalography, we measured the auditory mismatch negativity elicited by band-pass filtered click-trains, which deviated from frequently presented standard sound signals in the spectral or temporal domain. The results showed that spectral and temporal deviants were dominantly processed in the right and left hemispheres, respectively. Hemispheric asymmetry was not limited to high-level cognitive processes, but also originated from the pre-attentive neural processing stage represented by mismatch negativity.
    Brain Topography 12/2013; 28(3). DOI:10.1007/s10548-013-0347-1 · 2.52 Impact Factor
  • ABSTRACT: In auditory cortex, temporal information within a sound is represented by two complementary neural codes: a temporal representation based on stimulus-locked firing and a rate representation, in which discharge rate co-varies with the timing between acoustic events but lacks a stimulus-synchronized response. Using a computational neuronal model, we find that stimulus-locked responses are generated when sound-evoked excitation is combined with strong, delayed inhibition. In contrast, a non-synchronized rate representation is generated when the net excitation evoked by the sound is weak, which occurs when excitation is coincident and balanced with inhibition. Using single-unit recordings from awake marmosets (Callithrix jacchus), we validate several model predictions, including differences in the temporal fidelity, discharge rates and temporal dynamics of stimulus-evoked responses between neurons with rate and temporal representations. Together, these data suggest that feedforward inhibition provides a parsimonious explanation of the neural coding dichotomy observed in auditory cortex. (A toy simulation of this excitation/inhibition timing idea, not the authors' published model, follows this list.)
    PLoS Computational Biology 04/2015; 11(4):e1004197. DOI:10.1371/journal.pcbi.1004197 · 4.83 Impact Factor
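The iSpike entry above describes converting continuous sensor readings into spike trains and decoding output spikes back into motor signals. iSpike itself is a C++ library with its own interfaces; the Python sketch below only illustrates one generic way such a conversion is often done (Poisson-style rate encoding and spike-count decoding) and makes no reference to iSpike's actual API.

    import numpy as np

    # Generic illustration of rate encoding/decoding; not iSpike's API.
    rng = np.random.default_rng(42)

    def encode_rate(sensor_value, max_rate=100.0, dt=0.001, duration=0.1):
        # Poisson-style rate encoding: a normalized sensor reading in [0, 1] becomes a
        # spike train whose expected firing rate is sensor_value * max_rate (in Hz).
        n_steps = int(duration / dt)
        p_spike = sensor_value * max_rate * dt   # spike probability per time step
        return rng.random(n_steps) < p_spike     # boolean spike train

    def decode_rate(spike_train, max_rate=100.0, dt=0.001):
        # Spike-count decoding: estimate the encoded value from the observed firing
        # rate, clipped to the valid [0, 1] range.
        observed_rate = spike_train.sum() / (len(spike_train) * dt)
        return min(observed_rate / max_rate, 1.0)

    # Hypothetical round trip: a sensor reading of 0.6 becomes spikes, then a motor value.
    spikes = encode_rate(0.6)
    print(decode_rate(spikes))   # close to 0.6, up to Poisson noise

A real interface layer would stream the encoded spikes into the simulator and turn the decoded value into a motor command for the robot; this sketch only shows the encoding and decoding steps themselves.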
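The last entry above attributes synchronized versus non-synchronized responses to the strength and relative timing of sound-evoked excitation and inhibition. The sketch below is a minimal current-based integrate-and-fire caricature of that timing argument, not the authors' published model: each acoustic event injects an excitatory current pulse plus an inhibitory pulse that is either delayed (leaving a brief window of net excitation, so spikes lock to each event) or coincident and balanced (leaving little net drive). All parameter values are illustrative.

    import numpy as np

    # Toy model with illustrative parameters; not the published model or its values.

    def lif_spike_times(event_times, inh_delay, exc_amp, inh_amp, duration=0.5,
                        dt=0.0001, tau_m=0.01, v_thresh=1.0, pulse_dur=0.005):
        # Current-based leaky integrate-and-fire membrane. Each acoustic event injects
        # an excitatory pulse and an inhibitory pulse starting inh_delay seconds later.
        n_steps = int(duration / dt)
        t = np.arange(n_steps) * dt
        drive = np.zeros(n_steps)
        for ev in event_times:
            drive[(t >= ev) & (t < ev + pulse_dur)] += exc_amp
            drive[(t >= ev + inh_delay) & (t < ev + inh_delay + pulse_dur)] -= inh_amp
        v, spikes = 0.0, []
        for i in range(n_steps):
            v += dt * (-v / tau_m + drive[i])
            if v >= v_thresh:              # threshold crossing -> spike, then reset
                spikes.append(round(float(t[i]), 4))
                v = 0.0
        return spikes

    events = np.arange(0.0, 0.5, 0.05)     # acoustic events repeating at 20 Hz

    # Strong but delayed inhibition: each event leaves a brief window of net excitation,
    # so the model fires a spike locked to every event (temporal representation).
    print(lif_spike_times(events, inh_delay=0.005, exc_amp=400.0, inh_amp=400.0))

    # Coincident, balanced inhibition: excitation and inhibition cancel, the net drive
    # is weak, and no stimulus-locked spikes are produced.
    print(lif_spike_times(events, inh_delay=0.0, exc_amp=400.0, inh_amp=400.0))

Note that in this toy version the coincident-inhibition case simply falls silent, whereas the published model produces a sustained, non-synchronized discharge; the sketch only demonstrates how inhibition timing can abolish stimulus-locked firing.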