Article

Coding of Vocalizations by Single Neurons in Ventrolateral Prefrontal Cortex.

Dept. Neurobiology & Anatomy, Univ. of Rochester, Box 603, Rochester, NY 14642.
Hearing Research 07/2013; DOI: 10.1016/j.heares.2013.07.011
Source: PubMed

ABSTRACT: Neuronal activity in single prefrontal neurons has been correlated with behavioral responses, rules, task variables and stimulus features. In the non-human primate, neurons recorded in ventrolateral prefrontal cortex (VLPFC) respond to species-specific vocalizations, and previous studies have found multisensory neurons in this region that respond to simultaneously presented faces and vocalizations. Behavioral data suggest that face and vocal information are inextricably linked in animals and humans, and the two may therefore also be tightly linked in the coding of communication calls by prefrontal neurons. In this study we examined the role of VLPFC in encoding vocalization call-type information. Specifically, we analyzed previously recorded single-unit responses from the VLPFC of awake, behaving rhesus macaques presented with 3 types of species-specific vocalizations made by 3 individual callers. Analysis of responses by call type and caller identity showed that ~19% of cells had a main effect of call type, with fewer cells encoding caller. Classification performance of VLPFC neurons averaged ~42% across the population. When assessed in discrete time bins, classification performance reached 70% for coos in the first 300 ms and remained above chance for the duration of the response period, though performance was lower for other call types. Given the sub-optimal classification performance of most VLPFC neurons when only vocal information is present, and recent evidence that most VLPFC neurons are multisensory, we discuss the potential enhancement of classification by accompanying face information and recommend additional studies. Behavioral and neuronal evidence has shown a considerable benefit in recognition and memory performance when faces and voices are presented simultaneously. In the natural environment, facial and vocal information are present together, and neural systems no doubt evolved to integrate multisensory stimuli during recognition.
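
The abstract does not report the decoding method behind these numbers, so the bin-by-bin classification it describes can only be illustrated schematically. The sketch below decodes call type from simulated spike counts in consecutive 300 ms bins with a cross-validated logistic regression; the data layout, firing rates, bin width, classifier choice, and the call-type names other than "coo" are all assumptions for illustration. Chance for 3 call types is ~33%, the baseline against which the reported ~42% average and 70% early-coo performance stand.

    # Hypothetical sketch of time-binned call-type decoding from a single unit.
    # Simulated data; rates, bin width, and classifier are illustrative choices.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(0)

    N_TRIALS, N_BINS = 90, 5                    # trials; five 300 ms bins
    CALLS = ["coo", "grunt", "scream"]          # "scream" is a placeholder name

    # Simulated spike counts per trial and bin (real data: sorted spike times)
    labels = rng.integers(0, len(CALLS), size=N_TRIALS)
    mean_rates = np.array([4.0, 6.0, 8.0])      # assumed counts/bin per call type
    counts = rng.poisson(mean_rates[labels][:, None], size=(N_TRIALS, N_BINS))

    # Decode call type separately in each 300 ms bin, as in the abstract
    chance = 1.0 / len(CALLS)
    for b in range(N_BINS):
        X = counts[:, b:b + 1].astype(float)    # one bin's counts as the feature
        acc = cross_val_score(LogisticRegression(max_iter=1000),
                              X, labels, cv=LeaveOneOut()).mean()
        print(f"{b*300}-{(b+1)*300} ms: accuracy {acc:.2f} (chance {chance:.2f})")

With real spike data, the same loop over bins would trace the time course of classification performance described above.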

  • ABSTRACT: Music consists of strings of sound that vary over time. Technical devices, such as tape recorders, store musical melodies by transcribing event times of temporal sequences into consecutive locations on the storage medium. Playback occurs by reading out the stored information in the same sequence. However, it is unclear how the brain stores and retrieves auditory sequences. Neurons in the anterior lateral belt of auditory cortex are sensitive to the combination of sound features in time, but the integration time of these neurons is not sufficient to store longer sequences that stretch over several seconds, minutes or more. Functional imaging studies in humans provide evidence that music is stored instead within the auditory dorsal stream, including premotor and prefrontal areas. In monkeys, these areas are the substrate for learning of motor sequences. It appears, therefore, that the auditory dorsal stream transforms musical sequence information into motor sequence information and vice versa, realizing what are known as forward and inverse models. The basal ganglia and the cerebellum are involved in setting up the sensorimotor associations, translating timing information into spatial codes and back again. (A toy sketch of this forward/inverse mapping appears after this list.)
    Frontiers in Systems Neuroscience 08/2014; 8:149. DOI:10.3389/fnsys.2014.00149
  • ABSTRACT: Categorization enables listeners to efficiently encode and respond to auditory stimuli. Behavioral evidence for auditory categorization has been well documented across a broad range of human and non-human animal species. Moreover, neural correlates of auditory categorization have been documented in a variety of brain regions in the ventral auditory pathway, which is thought to underlie auditory-object processing and auditory perception. Here, we review and discuss how neural representations of auditory categories are transformed across different scales of neural organization in the ventral auditory pathway, from across brain areas to within local microcircuits, and propose that distinct neural transformations occur at each scale. Along the ascending auditory system in the ventral pathway, the encoding of categories progresses from simple acoustic categories to categories for abstract information, whereas within local microcircuits, different classes of neurons differentially compute categorical information.
    Frontiers in Neuroscience 06/2014; 8:161. DOI:10.3389/fnins.2014.00161
  • ABSTRACT: A listener's capacity to discriminate between sounds is related to the amount of acoustic variability that exists between those sounds. However, a full understanding of how this natural variability impacts neural activity and behavior is lacking. Here, we tested monkeys' ability to discriminate between different utterances of vocalizations from the same acoustic class (i.e., coos and grunts), while neural activity was simultaneously recorded in the anterolateral belt region (AL) of the auditory cortex, part of a pathway that mediates auditory perception. Monkeys could discriminate between coos better than they could discriminate between grunts. We also found that AL activity was more informative about different coos than about different grunts. This difference could be attributed, in part, to our finding that coos had more acoustic variability than grunts. Thus, intrinsic acoustic variability constrained the discriminability of AL spike trains and the ability of rhesus monkeys to discriminate between vocalizations. (A sketch of this relationship appears after this list.)
    Hearing Research 04/2014; 312. DOI:10.1016/j.heares.2014.03.007
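
The first abstract above casts melody storage as a transformation between auditory and motor sequences via forward and inverse models. A toy sketch, not from that paper: a forward model maps a motor command to the sound it would produce, and an inverse model, fit from samples, maps a desired sound back to a command. The linear command-to-pitch mapping and all values are invented for illustration.

    # Toy forward/inverse model pair for a short melody. The linear
    # "motor command -> pitch" mapping is an invented stand-in.
    import numpy as np

    rng = np.random.default_rng(1)

    def forward_model(command):
        """Predicted pitch (Hz) produced by a motor command (invented mapping)."""
        return 220.0 + 40.0 * command

    # Fit the inverse model from noisy samples of the forward mapping
    commands = rng.uniform(0.0, 10.0, 500)
    pitches = forward_model(commands) + rng.normal(0.0, 1.0, 500)
    slope, intercept = np.polyfit(pitches, commands, 1)   # linear inverse model

    melody = np.array([262.0, 294.0, 330.0, 349.0])       # desired pitches (C D E F)
    motor_seq = slope * melody + intercept                # melody -> motor sequence
    replay = forward_model(motor_seq)                     # motor sequence -> melody

    print("target:", melody)
    print("replay:", np.round(replay, 1))

Storing the motor sequence and replaying it through the forward model recovers the melody, which is the round trip the abstract attributes to the auditory dorsal stream.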
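
The last abstract links within-class acoustic variability to neural and behavioral discriminability. A hedged sketch of that relationship: when two exemplars of a call class evoke more distinct firing rates (coo-like, given the abstract's finding that coos vary more), their spike-count distributions separate more, here measured with a common pooled-variance d' statistic. All firing-rate values are invented.

    # Hypothetical sketch: between-exemplar response differences translate into
    # spike-count discriminability. All firing-rate values are invented.
    import numpy as np

    rng = np.random.default_rng(2)
    N = 200  # simulated trials per exemplar

    def dprime(a, b):
        """Separation of two spike-count distributions (pooled-variance d')."""
        return (a.mean() - b.mean()) / np.sqrt(0.5 * (a.var() + b.var()))

    # Coo-like: assumed larger response difference between the two exemplars
    coo_a, coo_b = rng.poisson(10.0, N), rng.poisson(16.0, N)
    # Grunt-like: assumed smaller between-exemplar difference
    grunt_a, grunt_b = rng.poisson(10.0, N), rng.poisson(11.0, N)

    print(f"coo exemplars:   d' = {dprime(coo_a, coo_b):.2f}")
    print(f"grunt exemplars: d' = {dprime(grunt_a, grunt_b):.2f}")

Under these assumptions the coo-like pair yields a much larger d', mirroring the finding that AL activity was more informative about different coos than about different grunts.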