Optimal information storage in noisy synapses under resource constraints.

Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139, USA.
Neuron (Impact Factor: 15.98). 12/2006; 52(3):409-23. DOI: 10.1016/j.neuron.2006.10.017
Source: PubMed

ABSTRACT: Experimental investigations have revealed that synapses possess interesting and, in some cases, unexpected properties. We propose a theoretical framework that accounts for three of these properties: typical central synapses are noisy, the distribution of synaptic weights among central synapses is wide, and synaptic connectivity between neurons is sparse. We also comment on the possibility that synaptic weights may vary in discrete steps. Our approach is based on maximizing information storage capacity of neural tissue under resource constraints. Based on previous experimental and theoretical work, we use volume as a limited resource and utilize the empirical relationship between volume and synaptic weight. Solutions of our constrained optimization problems are not only consistent with existing experimental measurements but also make nontrivial predictions.
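The optimization principle in the abstract can be illustrated with a toy calculation (the weight levels and cost function below are assumed for illustration, not taken from the paper): among distributions over discrete synaptic weights with a volume cost per weight level, the maximum-entropy distribution under a mean-volume constraint is Boltzmann-like, and tightening the constraint pushes probability mass toward the zero-weight (unconnected) state, i.e. sparser connectivity.

```python
import numpy as np

# Toy setup (assumed numbers): discrete synaptic weight levels with a
# volume cost that grows with weight, following the empirical
# volume-weight relationship discussed in the abstract.
weights = np.arange(5.0)     # level 0 = no synapse
volume_cost = weights        # assumed linear volume cost per level

def entropy_bits(p):
    """Information stored per synapse, in bits."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def maxent(beta):
    """Maximum-entropy weight distribution under a mean-volume
    constraint; beta is the Lagrange multiplier on volume."""
    p = np.exp(-beta * volume_cost)
    return p / p.sum()

p_loose = maxent(beta=0.5)   # weak volume constraint
p_tight = maxent(beta=3.0)   # strong volume constraint
# Tightening the constraint concentrates mass on the zero-weight
# (unconnected) state: sparser connectivity, fewer bits per synapse.
```

The trade-off is visible directly: the tighter budget yields a sparser, lower-entropy weight distribution, consistent with the abstract's link between limited volume and sparse connectivity.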

    ABSTRACT: Many functional descriptions of spiking neurons assume a cascade structure in which inputs pass through an initial linear filtering stage that produces a low-dimensional signal driving subsequent nonlinear stages. This paper presents a novel and systematic parameter estimation procedure for such models and applies it to two neural estimation problems: (i) compressed-sensing-based neural mapping from multi-neuron excitation, and (ii) estimation of neural receptive fields in sensory neurons. The proposed algorithm represents the neurons via a graphical model and estimates the parameters using a recently developed generalized approximate message passing (GAMP) method, which is based on Gaussian approximations of loopy belief propagation. In the neural connectivity problem, the GAMP-based method is shown to be computationally efficient, to model the sparsity more exactly, to incorporate nonlinearities in the output, and to significantly outperform previous compressed-sensing methods. For receptive field estimation, the GAMP method can also exploit inherent structured sparsity in the linear weights. The method is validated on estimation of linear-nonlinear-Poisson (LNP) cascade models for receptive fields of salamander retinal ganglion cells.
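The LNP cascade structure described above can be sketched as follows. The dimensions, filter, and nonlinearity are all assumed for illustration, and the estimator shown is a simple spike-triggered average baseline, not the GAMP method itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical LNP cascade: stimulus -> linear filter -> nonlinearity -> Poisson.
T, D = 2000, 20
stimulus = rng.standard_normal((T, D))      # white Gaussian stimulus
true_filter = np.zeros(D)
true_filter[[2, 7, 11]] = [1.0, -0.8, 0.6]  # assumed sparse receptive field

drive = stimulus @ true_filter              # linear filtering stage
rate = np.log1p(np.exp(drive))              # softplus nonlinearity (assumed)
spikes = rng.poisson(rate)                  # Poisson spike generation

# Simple baseline estimator (not GAMP): the spike-triggered average,
# which for Gaussian stimuli recovers the filter up to scale.
sta = spikes @ stimulus / spikes.sum()
```

A GAMP-based estimator would additionally exploit the sparsity of `true_filter`; the point here is only to make the cascade structure concrete.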
    ABSTRACT: The brain communicates via electrical signaling, and the molecular mechanisms that drive this communication are quite complex. In this paper, we compare intra-neuron signaling properties with communications networks and find many similarities. We review existing neuroscientific findings that exemplify this neuron-communication system link, as well as existing mathematical models and theories of neural functionality and communication. In particular, we examine long-term synaptic plasticity mechanisms, which amplify important synaptic inputs and attenuate noisy ones over time. We prepare a basic model based on communication-theoretic principles and derive a mathematical approximation to it. We find that our model is largely based on correlation of input patterns, which matches well with existing neuroscientific theories as well as neural network principles. In simulation, we find that the model predicts some of the basic effects of synaptic plasticity.
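The correlation-based plasticity idea above can be sketched with a minimal Hebbian update with weight decay (an illustrative stand-in, not the paper's actual model): an input channel correlated with the postsynaptic output is potentiated, while an uncorrelated (noisy) channel decays toward zero.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy inputs: channel 0 carries a signal correlated with the
# output; channel 1 is uncorrelated noise.
T = 2000
signal = rng.standard_normal(T)
noise = rng.standard_normal(T)
inputs = np.stack([signal, noise], axis=1)
output = signal                             # postsynaptic activity

w = np.zeros(2)
eta, decay = 0.01, 0.01
for x, y in zip(inputs, output):
    w += eta * y * x - decay * w            # Hebbian term minus decay

# Correlated input is amplified; the noisy input stays near zero,
# mirroring the amplify/attenuate behavior described in the abstract.
```

At steady state the weight of channel 0 approaches eta * E[y * x0] / decay, while channel 1 only fluctuates around zero.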
