Optimal Information Storage in Noisy Synapses under Resource Constraints

Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139, USA.
Neuron, 52(3):409-423, December 2006. DOI: 10.1016/j.neuron.2006.10.017


Experimental investigations have revealed that synapses possess interesting and, in some cases, unexpected properties. We propose a theoretical framework that accounts for three of these properties: typical central synapses are noisy, the distribution of synaptic weights among central synapses is wide, and synaptic connectivity between neurons is sparse. We also comment on the possibility that synaptic weights may vary in discrete steps. Our approach is based on maximizing information storage capacity of neural tissue under resource constraints. Based on previous experimental and theoretical work, we use volume as a limited resource and utilize the empirical relationship between volume and synaptic weight. Solutions of our constrained optimization problems are not only consistent with existing experimental measurements but also make nontrivial predictions.
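To make the flavor of the optimization argument concrete, here is a toy calculation (an illustrative Python sketch, not the paper's actual formulation): assume each synapse stores roughly 0.5*log2(1 + (w/sigma)^2) bits across Gaussian noise of level sigma, and occupies volume proportional to its weight w, in line with the empirical volume-weight relationship. Splitting a fixed volume budget across n synapses then yields total storage that peaks at a finite n, i.e., sparse connectivity can be optimal. The parameter values below are arbitrary assumptions.

```python
# Hypothetical toy model (not the paper's exact formulation): each synapse
# stores I(w) = 0.5*log2(1 + (w/sigma)^2) bits over a Gaussian noise channel
# and costs volume proportional to its weight w (empirical volume-weight law).
# With a fixed total volume budget split equally over n synapses, total
# storage peaks at a finite n -- an optimization argument for sparsity.
import numpy as np

sigma = 1.0          # synaptic noise level (assumed)
budget = 50.0        # total volume budget, in units where volume == weight
n_values = np.arange(1, 201)

w = budget / n_values                                   # weight per synapse
bits = n_values * 0.5 * np.log2(1.0 + (w / sigma) ** 2)

n_opt = n_values[np.argmax(bits)]
print(f"optimal number of synapses: {n_opt}, "
      f"total storage: {bits.max():.2f} bits")
```

The total peaks at an intermediate n: a single large synapse wastes volume on diminishing logarithmic returns, while very many tiny synapses drown in noise.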

  • Source
    • "As a consequence of soft thresholding derived from l 1 -norm regularization the distribution of synaptic weights must be sparse and heavy-tailed. Ineed, both physiologically [20] and anatomically [26] [27] measured synaptic weights follow a sparse heavy-tailed distribution, Fig. 4B. 8) Learning Gabor features from the natural scene ensemble. "
    ABSTRACT: A neuron is a basic physiological and computational unit of the brain. While much is known about the physiological properties of a neuron, its computational role is poorly understood. Here we propose to view a neuron as a signal processing device that represents the incoming streaming data matrix as a sparse vector of synaptic weights scaled by an outgoing sparse activity vector. Formally, a neuron minimizes a cost function comprising a cumulative squared representation error and regularization terms. We derive an online algorithm that minimizes this cost function by alternating between minimization with respect to activity and with respect to synaptic weights. The steps of this algorithm reproduce well-known physiological properties of a neuron, such as weighted summation and leaky integration of synaptic inputs, as well as an Oja-like, but parameter-free, synaptic learning rule. Our theoretical framework makes several predictions, some of which can be verified with existing data, while others require further experiments. Such a framework should allow modeling the function of neuronal circuits without necessarily measuring all the microscopic biophysical parameters, and should facilitate the design of neuromorphic electronics.
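A minimal sketch of the alternating online scheme that abstract describes, under simplifying assumptions: a scalar activity is obtained by soft thresholding the weighted sum of inputs (the l1-regularization effect mentioned in the excerpt above), and the weights then take an Oja-like step whose learning rate is the inverse cumulative squared activity, making the rule parameter-free apart from the illustrative threshold lam. The i.i.d. Gaussian input below is a stand-in for the streaming data (e.g., the natural scene ensemble).

```python
# Sketch of online alternating minimization for a single model neuron
# (simplified; parameter values are assumptions, not taken from the paper).
import numpy as np

rng = np.random.default_rng(0)
dim, T, lam = 20, 5000, 0.5     # input dimension, steps, soft threshold

w = rng.standard_normal(dim)    # synaptic weight vector
w /= np.linalg.norm(w)
a_sq_sum = 1e-6                 # cumulative squared activity -> step size

for _ in range(T):
    x = rng.standard_normal(dim)                # stand-in streaming input
    proj = w @ x                                # weighted summation of inputs
    a = np.sign(proj) * max(abs(proj) - lam, 0.0)   # soft-thresholded activity
    if a != 0.0:
        a_sq_sum += a * a
        w += a * (x - a * w) / a_sq_sum         # Oja-like, parameter-free step

print("final weight norm:", np.linalg.norm(w))
```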
  • Source
    • "Information-theoretic and associative memory models of storage have been used to predict experimentally measurable properties of synapses in the mammalian brain [10] [11]. But contrary to the fact that noise is present in computational operations of the brain [12] [13], associative memory models with exponential capacities have assumed no internal noise in the computational nodes. "
    ABSTRACT: Recent advances in associative memory design through structured pattern sets and graph-based inference algorithms allow reliable learning and recall of exponential numbers of patterns. Though these designs correct external errors in recall, they assume neurons compute noiselessly, in contrast to highly variable neurons in hippocampus and olfactory cortex. Here we consider associative memories with noisy internal computations and analytically characterize performance. As long as internal noise is less than a specified threshold, error probability in the recall phase can be made exceedingly small. More surprisingly, we show internal noise actually improves performance of the recall phase. Computational experiments lend additional support to our theoretical analysis. This work suggests a functional benefit to noisy neurons in biological neuronal networks.
    Advances in Neural Information Processing Systems (NIPS); 12/2013
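A toy illustration of recall with noisy internal computation, using a plain Hopfield network rather than the paper's graph-based construction with exponential capacity; the network size, pattern count, and noise levels are arbitrary assumptions, and the sketch only shows how recall accuracy can be probed as internal noise varies.

```python
# Hopfield recall with Gaussian "internal" noise added to the local field
# before thresholding (a simplified stand-in for the paper's noisy nodes).
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                                  # neurons, stored patterns
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N                 # Hebbian weight matrix
np.fill_diagonal(W, 0.0)

def recall(cue, noise_std, steps=20):
    s = cue.copy()
    for _ in range(steps):
        field = W @ s + noise_std * rng.standard_normal(N)  # noisy computation
        s = np.where(field >= 0, 1, -1)         # synchronous sign update
    return s

cue = patterns[0].copy()
cue[: N // 10] *= -1                            # corrupt 10% of the bits
for noise_std in [0.0, 0.2, 0.5, 1.0]:
    overlap = np.mean(recall(cue, noise_std) == patterns[0])
    print(f"internal noise {noise_std:.1f}: overlap {overlap:.2f}")
```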
  • Source
    • "We simulate the proposed reconstruction method in silico by generating a neural network, simulating experimental measurements, and recovering synaptic weights. We draw unitless synaptic weights from a distribution derived from electrophysiological measurements [4] [5] [43] [44] containing a delta-function at zero and an exponential distribution with a unit mean (Figure 2a). We generate an M-by-N stimulation matrix A by setting F randomly chosen entries in each row to one, and the rest to zero. "
    ABSTRACT: One of the central problems in neuroscience is reconstructing synaptic connectivity in neural circuits. Synapses onto a neuron can be probed by sequentially stimulating potentially pre-synaptic neurons while monitoring the membrane voltage of the post-synaptic neuron. Reconstructing a large neural circuit using such a "brute force" approach is rather time-consuming and inefficient because the connectivity in neural circuits is sparse. Instead, we propose to measure a post-synaptic neuron's voltage while stimulating sequentially random subsets of multiple potentially pre-synaptic neurons. To reconstruct these synaptic connections from the recorded voltage we apply a decoding algorithm recently developed for compressive sensing. Compared to the brute force approach, our method promises significant time savings that grow with the size of the circuit. We use computer simulations to find optimal stimulation parameters and explore the feasibility of our reconstruction method under realistic experimental conditions including noise and non-linear synaptic integration. Multineuronal stimulation allows reconstructing synaptic connectivity just from the spiking activity of post-synaptic neurons, even when sub-threshold voltage is unavailable. By using calcium indicators, voltage-sensitive dyes, or multi-electrode arrays, one could monitor the activity of multiple post-synaptic neurons simultaneously, thus mapping their synaptic inputs in parallel and potentially reconstructing a complete neural circuit.
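The in-silico recipe quoted above is straightforward to sketch; here a generic ISTA solver stands in for the compressive-sensing decoder used in the paper, and the dimensions, sparsity, and noise level are illustrative assumptions.

```python
# Sketch of the quoted simulation: sparse synaptic weights (delta at zero
# plus unit-mean exponential), a binary stimulation matrix with F ones per
# row, noisy voltage measurements, and l1 recovery via basic ISTA.
import numpy as np

rng = np.random.default_rng(1)
N, M, F = 500, 150, 50          # neurons, measurements, stimulated per trial
sparsity = 0.05                 # fraction of connected synapses (assumed)

# Weights: delta-function at zero plus exponential (unit mean) if connected.
w = np.where(rng.random(N) < sparsity, rng.exponential(1.0, N), 0.0)

# Stimulation matrix: F randomly chosen ones per row, rest zero.
A = np.zeros((M, N))
for row in A:
    row[rng.choice(N, size=F, replace=False)] = 1.0

y = A @ w + 0.1 * rng.standard_normal(M)        # noisy voltage measurements

# ISTA: gradient step on 0.5*||y - Aw||^2, then soft thresholding (l1 term).
lam = 0.5
L = np.linalg.norm(A, 2) ** 2                   # Lipschitz constant of gradient
w_hat = np.zeros(N)
for _ in range(500):
    grad = A.T @ (A @ w_hat - y)
    z = w_hat - grad / L
    w_hat = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

err = np.linalg.norm(w_hat - w) / np.linalg.norm(w)
print(f"relative reconstruction error: {err:.2f}")
```

Recovery quality in this sketch depends on the number of measurements M, the row weight F, and the sparsity, mirroring the stimulation-parameter tradeoffs the abstract mentions.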