Article

Optimal information storage in noisy synapses under resource constraints.

Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139, USA.
Neuron (Impact Factor: 15.98). 12/2006; 52(3):409-23. DOI: 10.1016/j.neuron.2006.10.017
Source: PubMed

ABSTRACT: Experimental investigations have revealed that synapses possess interesting and, in some cases, unexpected properties. We propose a theoretical framework that accounts for three of these properties: typical central synapses are noisy, the distribution of synaptic weights among central synapses is wide, and synaptic connectivity between neurons is sparse. We also comment on the possibility that synaptic weights may vary in discrete steps. Our approach is based on maximizing information storage capacity of neural tissue under resource constraints. Based on previous experimental and theoretical work, we use volume as a limited resource and utilize the empirical relationship between volume and synaptic weight. Solutions of our constrained optimization problems are not only consistent with existing experimental measurements but also make nontrivial predictions.
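
The framework described in the abstract can be illustrated with a deliberately simplified numerical sketch. The toy model below is not the paper's actual formulation: it assumes, purely for illustration, that each synapse behaves like a Gaussian channel whose signal-to-noise ratio grows with synaptic weight, and that synaptic volume grows linearly with weight on top of a fixed per-synapse overhead. Under those assumptions it asks which weight maximizes the number of bits stored per unit volume.

    # Toy sketch only -- not the authors' model. Assumptions (mine): Gaussian-
    # channel capacity per synapse, 0.5*log2(1 + SNR) bits, and a synaptic
    # volume that grows linearly with weight plus a fixed overhead.
    import numpy as np

    def bits_per_volume(w, noise_sd=1.0, v0=0.1, slope=1.0):
        """Bits stored per unit volume for a synapse of weight w.

        noise_sd -- standard deviation of the synaptic noise
        v0       -- fixed overhead volume per synapse
        slope    -- volume added per unit of synaptic weight
        """
        snr = (w / noise_sd) ** 2
        capacity = 0.5 * np.log2(1.0 + snr)   # bits per synapse
        volume = v0 + slope * w               # volume per synapse
        return capacity / volume

    weights = np.linspace(0.01, 10.0, 1000)
    density = bits_per_volume(weights)
    best = weights[np.argmax(density)]
    print(f"weight maximizing bits per unit volume (toy model): {best:.2f}")

In this toy model the optimum is finite because capacity grows only logarithmically with weight while volume grows linearly, so beyond some point a unit of volume stores more information as part of an additional weak, noisy synapse than as extra weight on an existing one. The sketch is meant only to convey the flavor of a volume-constrained storage optimization, not the paper's quantitative results.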

  • ABSTRACT: Recent advances in associative memory design through structured pattern sets and graph-based inference algorithms allow reliable learning and recall of exponential numbers of patterns. Though these designs correct external errors in recall, they assume neurons compute noiselessly, in contrast to highly variable neurons in hippocampus and olfactory cortex. Here we consider associative memories with noisy internal computations and analytically characterize performance. As long as internal noise is less than a specified threshold, error probability in the recall phase can be made exceedingly small. More surprisingly, we show internal noise actually improves performance of the recall phase. Computational experiments lend additional support to our theoretical analysis. This work suggests a functional benefit to noisy neurons in biological neuronal networks.
    Advances in Neural Information Processing Systems (NIPS); 12/2013
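    A rough, self-contained illustration of recall under internal noise follows. It substitutes a small Hopfield-style memory with Hebbian weights for the structured, graph-based construction the abstract refers to, and simply adds Gaussian noise to each unit's local field during recall; network size, pattern count, and noise levels are arbitrary choices, so the sketch conveys the phenomenon rather than reproducing the cited analysis.

        # Illustrative stand-in, not the cited construction: a Hopfield-style
        # associative memory whose recall dynamics are corrupted by internal
        # noise, asked to recall a pattern corrupted by external errors.
        import numpy as np

        rng = np.random.default_rng(0)
        n, n_patterns = 200, 10
        patterns = rng.choice([-1, 1], size=(n_patterns, n))

        # Hebbian outer-product weights with zero diagonal
        W = (patterns.T @ patterns) / n
        np.fill_diagonal(W, 0.0)

        def recall(cue, internal_noise_sd, steps=20):
            """Synchronous recall; Gaussian noise is added to every local field."""
            x = cue.copy()
            for _ in range(steps):
                field = W @ x + rng.normal(0.0, internal_noise_sd, size=n)
                x = np.where(field >= 0, 1, -1)
            return x

        # External errors: flip 10% of the bits of a stored pattern, then recall.
        target = patterns[0]
        cue = target.copy()
        flip = rng.choice(n, size=n // 10, replace=False)
        cue[flip] *= -1

        for sigma in (0.0, 0.1, 0.5):
            out = recall(cue, internal_noise_sd=sigma)
            print(f"internal noise sd={sigma}: "
                  f"fraction of bits recovered = {np.mean(out == target):.2f}")

    Whether moderate internal noise actually improves recall in this toy network depends on the chosen parameters; the analytical result described above concerns the authors' structured memories, so treat the printed numbers as illustrative only.
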
  • ABSTRACT: One of the central problems in neuroscience is reconstructing synaptic connectivity in neural circuits. Synapses onto a neuron can be probed by sequentially stimulating potentially pre-synaptic neurons while monitoring the membrane voltage of the post-synaptic neuron. Reconstructing a large neural circuit using such a "brute force" approach is rather time-consuming and inefficient because the connectivity in neural circuits is sparse. Instead, we propose to measure a post-synaptic neuron's voltage while stimulating sequentially random subsets of multiple potentially pre-synaptic neurons. To reconstruct these synaptic connections from the recorded voltage we apply a decoding algorithm recently developed for compressive sensing. Compared to the brute force approach, our method promises significant time savings that grow with the size of the circuit. We use computer simulations to find optimal stimulation parameters and explore the feasibility of our reconstruction method under realistic experimental conditions including noise and non-linear synaptic integration. Multineuronal stimulation allows reconstructing synaptic connectivity just from the spiking activity of post-synaptic neurons, even when sub-threshold voltage is unavailable. By using calcium indicators, voltage-sensitive dyes, or multi-electrode arrays one could monitor activity of multiple postsynaptic neurons simultaneously, thus mapping their synaptic inputs in parallel, potentially reconstructing a complete neural circuit.
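    The sketch below illustrates the measurement-and-decoding idea with assumptions of my own rather than the study's exact protocol: connections are sparse and, for simplicity, nonnegative; each trial stimulates a random ~10% subset of candidate presynaptic neurons; the recorded deflection is a noisy linear sum of the activated weights; and the decoder is a plain nonnegative ISTA solver for the LASSO, standing in for whichever compressive-sensing algorithm the authors apply.

        # Toy compressive-sensing reconstruction of sparse synaptic weights
        # from multineuronal stimulation. All parameters are illustrative.
        import numpy as np

        rng = np.random.default_rng(1)
        n_candidates, n_trials, n_connected = 200, 80, 10

        # Ground-truth sparse, nonnegative connectivity onto one neuron
        w_true = np.zeros(n_candidates)
        connected = rng.choice(n_candidates, n_connected, replace=False)
        w_true[connected] = rng.uniform(0.2, 1.0, n_connected)

        # Each trial stimulates a random ~10% subset of candidates; the
        # measured response is a noisy sum of the activated weights.
        A = (rng.random((n_trials, n_candidates)) < 0.1).astype(float)
        y = A @ w_true + rng.normal(0.0, 0.05, n_trials)

        def nonneg_ista(A, y, lam=0.1, iters=3000):
            """Nonnegative ISTA for min 0.5*||A w - y||^2 + lam*||w||_1, w >= 0."""
            L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
            w = np.zeros(A.shape[1])
            for _ in range(iters):
                grad = A.T @ (A @ w - y)
                w = np.maximum(w - (grad + lam) / L, 0.0)
            return w

        w_hat = nonneg_ista(A, y)
        found = set(np.flatnonzero(w_hat > 0.1)) & set(connected)
        print(f"recovered {len(found)} of {n_connected} true connections "
              f"from {n_trials} trials")

    The point of the toy setup is that the number of trials (80) is well below the number of candidate neurons (200), which is where the time savings over one-neuron-at-a-time stimulation come from; real recordings would add the noise sources and nonlinear synaptic integration the abstract mentions.
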
  • ABSTRACT: Neurons encode information by firing spikes in isolation or bursts and propagate information by spike-triggered neurotransmitter release that initiates synaptic transmission. Isolated spikes trigger neurotransmitter release unreliably but with high temporal precision. In contrast, bursts of spikes trigger neurotransmission reliably (i.e., boost transmission fidelity), but the resulting synaptic responses are temporally imprecise. However, the relative physiological importance of different spike-firing modes remains unclear. Here, we show that knockdown of synaptotagmin-1, the major Ca(2+) sensor for neurotransmitter release, abrogated neurotransmission evoked by isolated spikes but only delayed, without abolishing, neurotransmission evoked by bursts of spikes. Nevertheless, knockdown of synaptotagmin-1 in the hippocampal CA1 region did not impede acquisition of recent contextual fear memories, although it did impair the precision of such memories. In contrast, knockdown of synaptotagmin-1 in the prefrontal cortex impaired all remote fear memories. These results indicate that different brain circuits and types of memory employ distinct spike-coding schemes to encode and transmit information.
    Neuron (Impact Factor: 15.98). 03/2012; 73(5):990-1001. DOI: 10.1016/j.neuron.2011.12.036