Low error discrimination using a correlated population code

Department of Molecular Biology and the Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA.
Journal of Neurophysiology. 04/2012; 108(4):1069-88. DOI: 10.1152/jn.00564.2011
Source: PubMed


We explored the manner in which spatial information is encoded by retinal ganglion cell populations. We flashed a set of 36 shape stimuli onto the tiger salamander retina and used different decoding algorithms to read out information from a population of 162 ganglion cells. We compared the discrimination performance of linear decoders, which ignore correlation induced by common stimulation, with nonlinear decoders, which can accurately model these correlations. Consistent with previous studies, decoders that ignored correlation suffered only a modest drop in discrimination performance for groups of up to ∼30 cells. However, for more realistic groups of 100+ cells, we found order-of-magnitude differences in the error rate. We also compared decoders that used only the presence of a single spike from each cell with more complex decoders that included information from multiple spike counts and multiple time bins. More complex decoders substantially outperformed simpler decoders, showing the importance of spike timing information. Particularly effective was the first spike latency representation, which allowed zero discrimination errors for the majority of shape stimuli. Furthermore, the performance of nonlinear decoders showed even greater enhancement compared with linear decoders for these complex representations. Finally, decoders that approximated the correlation structure in the population by matching all pairwise correlations with a maximum entropy model fit to all 162 neurons were quite successful, especially for the spike latency representation. Together, these results suggest a picture in which linear decoders allow a coarse categorization of shape stimuli, whereas nonlinear decoders, which take advantage of both correlation and spike timing, are needed to achieve high-fidelity discrimination.
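The gap between correlation-blind and correlation-aware readouts can be illustrated with a toy example (entirely synthetic data, not the recordings analyzed here): two model cells whose single-cell firing rates are identical under two stimuli, so that a product-of-marginals decoder is nearly blind while a decoder modeling the joint pattern distribution discriminates well. All names and parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_responses(n_trials, p_common):
    """Two binary 'cells' firing with probability 0.5 each; with probability
    p_common they copy a shared latent bit, which induces correlation
    without changing the single-cell rates."""
    shared = rng.random(n_trials) < p_common
    common_bit = (rng.random(n_trials) < 0.5)[:, None].repeat(2, axis=1)
    indep_bits = rng.random((n_trials, 2)) < 0.5
    return np.where(shared[:, None], common_bit, indep_bits).astype(int)

# Two hypothetical "stimuli": identical rates, different correlation.
train_A, train_B = sample_responses(5000, 0.9), sample_responses(5000, 0.0)
test_A, test_B = sample_responses(2000, 0.9), sample_responses(2000, 0.0)

def pattern_index(X):
    return X[:, 0] * 2 + X[:, 1]        # patterns 00, 01, 10, 11 -> 0..3

def fit_joint(X):
    """Full joint distribution over the 4 patterns (models correlation)."""
    p = np.bincount(pattern_index(X), minlength=4) + 1.0  # add-one smoothing
    return p / p.sum()

def fit_independent(X):
    """Product of marginals (ignores correlation), as a linear-style decoder."""
    m = X.mean(axis=0)
    return np.array([(m[0]**a) * ((1 - m[0])**(1 - a)) *
                     (m[1]**b) * ((1 - m[1])**(1 - b))
                     for a in (0, 1) for b in (0, 1)])

def balanced_accuracy(fit):
    pA, pB = fit(train_A), fit(train_B)
    choose_A = pA > pB                  # maximum-likelihood decision per pattern
    acc_A = choose_A[pattern_index(test_A)].mean()
    acc_B = (~choose_A)[pattern_index(test_B)].mean()
    return (acc_A + acc_B) / 2

acc_indep = balanced_accuracy(fit_independent)
acc_joint = balanced_accuracy(fit_joint)
```

Because both stimuli drive each cell at rate 0.5, the independent model assigns nearly identical likelihoods to the two stimuli and stays near chance, while the joint model exploits the correlation structure.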

  • ABSTRACT: The resting-state human brain networks underlie fundamental cognitive functions and consist of complex interactions among brain regions. However, the level of complexity of the resting-state networks has not been quantified, which has prevented comprehensive descriptions of brain activity as an integrative system. Here, we address this issue by demonstrating that a pairwise maximum entropy model, which takes into account region-specific activity rates and pairwise interactions, can be robustly and accurately fitted to resting-state human brain activity obtained by functional magnetic resonance imaging. Furthermore, to validate the approximation of the resting-state networks by the pairwise maximum entropy model, we show that the functional interactions estimated by the pairwise maximum entropy model reflect anatomical connections more accurately than the conventional functional connectivity method. These findings indicate that a relatively simple statistical model not only captures the structure of the resting-state networks but also provides a possible method to derive physiological information about various large-scale brain networks.
    Full-text · Article · Jan 2013 · Nature Communications
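A minimal sketch of the pairwise maximum entropy (Ising-type) fitting idea, on synthetic binarized activity rather than real fMRI data: gradient ascent on the log-likelihood adjusts biases h to match the empirical means and couplings J to match the empirical pairwise correlations. A tiny three-unit system is assumed so that all 2^3 patterns can be enumerated exactly; real applications need approximate inference.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)

# Synthetic binarized activity of 3 "regions", with a shared latent driver
# that induces pairwise correlations (purely illustrative data).
T, n = 4000, 3
up = rng.random(T) < 0.4
X = np.where(up[:, None], rng.random((T, n)) < 0.8,
                          rng.random((T, n)) < 0.2).astype(float)

emp_mean = X.mean(axis=0)        # empirical <x_i>
emp_corr = X.T @ X / T           # empirical <x_i x_j>

patterns = np.array(list(itertools.product((0.0, 1.0), repeat=n)))

def model_moments(h, J):
    """Exact moments of P(x) ∝ exp(h·x + ½ xᵀJx) over all 2^n patterns."""
    energy = patterns @ h + 0.5 * np.einsum('pi,ij,pj->p', patterns, J, patterns)
    p = np.exp(energy)
    p /= p.sum()
    return p @ patterns, np.einsum('p,pi,pj->ij', p, patterns, patterns)

# Gradient ascent on the log-likelihood: h tracks the means, the symmetric
# zero-diagonal coupling matrix J tracks the pairwise correlations.
h, J = np.zeros(n), np.zeros((n, n))
lr = 0.2
for _ in range(5000):
    m, c = model_moments(h, J)
    h += lr * (emp_mean - m)
    dJ = lr * (emp_corr - c)
    np.fill_diagonal(dJ, 0.0)    # diagonal is absorbed into h (x_i² = x_i)
    J += dJ
```

At convergence the model reproduces all means and pairwise correlations while otherwise being maximally unstructured, which is the defining property of the pairwise maximum entropy fit.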
  • ABSTRACT: The number of possible activity patterns in a population of neurons grows exponentially with the size of the population. Typical experiments explore only a tiny fraction of the large space of possible activity patterns in the case of populations with more than 10 or 20 neurons. It is thus impossible, in this undersampled regime, to estimate the probabilities with which most of the activity patterns occur. As a result, the corresponding entropy—which is a measure of the computational power of the neural population—cannot be estimated directly. We propose a simple scheme for estimating the entropy in the undersampled regime, which bounds its value from below and above. The lower bound is the usual 'naive' entropy of the experimental frequencies. The upper bound results from a hybrid approximation of the entropy which makes use of the naive estimate, a maximum entropy fit, and a coverage adjustment. We apply our simple scheme to artificial data in order to check its accuracy; we also compare its performance to that of several previously defined entropy estimators. We then apply it to actual measurements of neural activity in populations with up to 100 cells. Finally, we discuss the similarities and differences between the proposed simple estimation scheme and various earlier methods.
    Full-text · Article · Mar 2013 · Journal of Statistical Mechanics Theory and Experiment
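The lower-bound half of that scheme — the naive plug-in entropy of the empirical frequencies — is easy to illustrate on synthetic undersampled data. The Miller–Madow correction below is shown only as a simple stand-in for a less biased estimate; it is not the article's hybrid upper bound, which additionally involves a maximum entropy fit and a coverage adjustment.

```python
import numpy as np

rng = np.random.default_rng(2)

def naive_entropy_bits(counts):
    """Plug-in ('naive') entropy of the empirical frequencies, in bits."""
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

K = 1024                                   # size of the pattern space
true_H = np.log2(K)                        # uniform truth: 10 bits
samples = rng.integers(0, K, size=200)     # heavily undersampled: 200 << 1024
counts = np.bincount(samples, minlength=K)

# The naive estimate cannot exceed log2 of the number of distinct observed
# patterns, so in this regime it necessarily underestimates the true entropy.
H_naive = naive_entropy_bits(counts)

# Miller–Madow bias correction (illustrative stand-in, not the paper's bound).
K_observed = np.count_nonzero(counts)
H_mm = H_naive + (K_observed - 1) / (2 * len(samples) * np.log(2))
```

With only 200 samples from 1024 equiprobable patterns, H_naive falls well below the true 10 bits, and the corrected estimate moves back toward it.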
  • ABSTRACT: Neuronal responses to prolonged stimulation attenuate over time. Here, we ask a fundamental question: is adaptation a simple process for the neural system during which sustained input is ignored, or is it actually part of a strategy for the neural system to adjust its encoding properties dynamically? After simultaneously recording the activities of a group of bullfrog retinal ganglion cells (dimming detectors) in response to sustained dimming stimulation, we applied a combination of information analysis approaches to explore the time-dependent nature of information encoding during the adaptation. We found that at the early stage of the adaptation, the stimulus information was mainly encoded in firing rates, whereas at the late stage of the adaptation, it was more encoded in neural correlations. Such a transition in encoding properties is not a simple consequence of the attenuation of neuronal firing rates, but rather involves an active change in the neural correlation strengths, suggesting that it is a strategy adopted by the neural system for functional purposes. Our results reveal that in encoding a prolonged stimulation, the neural system may utilize concerted, but less active, firing of neurons to encode information.
    No preview · Article · Jul 2013 · Journal of Neurophysiology
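That stimulus information can reside in correlations rather than firing rates can be made concrete with a toy two-cell example (illustrative, unrelated to the bullfrog recordings): two conditions with identical single-cell rates, so all the information is carried by the pairwise correlation and vanishes when responses are shuffled across trials.

```python
import numpy as np

def mutual_info_bits(joint):
    """Mutual information (bits) from a joint table P(stimulus, response)."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

# Two equiprobable conditions for a pair of binary cells, built so the
# single-cell firing rates are identical (0.5 each) in both conditions and
# only the pairwise correlation differs. Patterns ordered 00, 01, 10, 11.
P_cond_A = np.array([0.0, 0.5, 0.5, 0.0])   # anticorrelated responses
P_cond_B = np.array([0.5, 0.0, 0.0, 0.5])   # correlated responses
joint = 0.5 * np.vstack([P_cond_A, P_cond_B])

# Shuffling the cells independently across trials destroys the correlation,
# replacing each condition by the product of its marginals — here uniform.
joint_shuffled = 0.5 * np.vstack([np.full(4, 0.25), np.full(4, 0.25)])

info_full = mutual_info_bits(joint)              # 1 bit, all in correlations
info_shuffled = mutual_info_bits(joint_shuffled)  # 0 bits, rates say nothing
```

Comparing intact and trial-shuffled responses in this way is a standard device for isolating the contribution of correlations to the encoded information.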