Synaptic Economics: Competition and Cooperation in Correlation-Based Synaptic Plasticity.

Department of Physiology, W. M. Keck Center for Integrative Neuroscience, Sloan Center for Theoretical Neurobiology, University of California, San Francisco, California 94143-0444, USA.
Neuron (Impact Factor: 15.05). 10/1996; 17(3):371-4. DOI: 10.1016/S0896-6273(00)80169-5
Source: PubMed
11 Reads
    • ", 2001). Finally, dictating that no plasticity occurs when both pre- and post-synaptic firing rates are below their mean removes the un-biological potentiation of synapses when both neurons have negative deviations, and hence a positive covariance (Miller, 1996). It is noteworthy that the geometry of synaptic change in Hebbian covariance plasticity is similar to the correlation-based learning rule for sequentially active binary units used for the analytic calculations (compare Figure 2B and Figure 3B)."
    ABSTRACT: The majority of distinct sensory and motor events occur as temporally ordered sequences with rich probabilistic structure. Sequences can be characterized by the probability of transitioning from the current state to upcoming states (forward probability), as well as the probability of having transitioned to the current state from previous states (backward probability). Despite the prevalence of probabilistic sequencing of both sensory and motor events, the Hebbian mechanisms that mold synapses to reflect the statistics of experienced probabilistic sequences are not well understood. Here, we show through analytic calculations and numerical simulations that Hebbian plasticity (correlation, covariance, and STDP) with pre-synaptic competition can develop synaptic weights equal to the conditional forward transition probabilities present in the input sequence. In contrast, post-synaptic competition can develop synaptic weights proportional to the conditional backward probabilities of the same input sequence. We demonstrate that to stably reflect the conditional probability of a neuron's inputs and outputs, local Hebbian plasticity requires balance between competitive learning forces that promote synaptic differentiation and homogenizing learning forces that promote synaptic stabilization. The balance between these forces dictates a prior over the distribution of learned synaptic weights, strongly influencing both the rate at which structure emerges and the entropy of the final distribution of synaptic weights. Together, these results demonstrate a simple correspondence between the biophysical organization of neurons, the site of synaptic competition, and the temporal flow of information encoded in synaptic weights by Hebbian plasticity while highlighting the utility of balancing learning forces to accurately encode probability distributions, and prior expectations over such probability distributions.
    Frontiers in Computational Neuroscience 08/2015; 9:92. DOI:10.3389/fncom.2015.00092 · 2.20 Impact Factor
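The mechanism described in the abstract above can be illustrated with a minimal sketch (not the authors' code; the 3-state chain and its transition matrix are illustrative assumptions): accumulating Hebbian co-activations between sequentially active one-hot units, then applying pre-synaptic competition as a normalization of the weights leaving each pre-synaptic unit, recovers the empirical forward transition probabilities of the input sequence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-state Markov chain generating the input sequence
P = np.array([[0.1, 0.6, 0.3],
              [0.5, 0.2, 0.3],
              [0.2, 0.2, 0.6]])  # true forward transition probabilities
K, T = 3, 20000

# Generate a state sequence; each state activates one binary (one-hot) unit
states = np.zeros(T, dtype=int)
for t in range(1, T):
    states[t] = rng.choice(K, p=P[states[t - 1]])

# Hebbian correlation: accumulate pre(t) * post(t+1) co-activations
W = np.zeros((K, K))
for t in range(T - 1):
    W[states[t], states[t + 1]] += 1.0  # one-hot pre/post product

# Pre-synaptic competition: the weights leaving each pre-synaptic unit
# compete for a fixed total, i.e. each row is normalized to sum to 1
W /= W.sum(axis=1, keepdims=True)

print(np.round(W, 2))  # approximates P, the conditional forward probabilities
```

Normalizing columns instead of rows (post-synaptic competition over the weights converging onto each unit) would instead yield weights proportional to the backward conditional probabilities, as the abstract states.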
    • "While the role of homeostasis in information processing and computation has only been discussed more recently, its function as a stabilization process of neural dynamics has acquired earlier attention (von der Malsburg, 1973; Bienenstock, Cooper, & Munro, 1982). From the perspective of the nervous system, pure Hebbian potentiation or anti-Hebbian depression would lead to destabilization of synaptic efficacies by generating amplifying feedback loops (Miller, 1996; Song, Miller, & Abbott, 2000), necessitating a homeostatic mechanism for stabilization (Davis & Goodman, 1998; Zhang & Linden, 2003; Turrigiano & Nelson, 2004). Similarly, as suggested by the effects of the plasticity mechanism (see equation 5.6) on the virtual network topology (see section 5.2), the facilitating sensitivity term −αςθ is counteracted by the depressive entropy term +αςρ, which prevents synaptic efficacies from overpotentiating or collapsing. "
    ABSTRACT: Supplementing a differential equation with delays results in an infinite-dimensional dynamical system. This property provides the basis for a reservoir computing architecture, where the recurrent neural network is replaced by a single nonlinear node, delay-coupled to itself. Instead of the spatial topology of a network, subunits in the delay-coupled reservoir are multiplexed in time along one delay span of the system. The computational power of the reservoir is contingent on this temporal multiplexing. Here, we learn optimal temporal multiplexing by means of a biologically inspired homeostatic plasticity mechanism. Plasticity acts locally and changes the distances between the subunits along the delay, depending on how responsive these subunits are to the input. After analytically deriving the learning mechanism, we illustrate its role in improving the reservoir's computational power. To this end, we investigate, first, the increase of the reservoir's memory capacity. Second, we predict a NARMA-10 time series, showing that plasticity reduces the normalized root-mean-square error by more than 20%. Third, we discuss plasticity's influence on the reservoir's input-information capacity, the coupling strength between subunits, and the distribution of the readout coefficients.
    Neural Computation 03/2015; DOI:10.1162/NECO_a_00737 · 2.21 Impact Factor
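The reservoir architecture in the abstract above can be sketched in a few lines (a simplified discrete-time construction, not the authors' code): a single tanh nonlinearity with delayed self-feedback, with virtual nodes multiplexed along one delay span and coupled as a delay line. The node count, feedback and input gains, and input mask are illustrative assumptions, and the paper's homeostatic plasticity is not reproduced here — virtual nodes are equally spaced.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 40                 # virtual nodes along one delay span
eta, gamma = 0.5, 0.4  # feedback and input scaling (illustrative)
mask = rng.uniform(-1.0, 1.0, size=N)  # fixed input mask per virtual node

T = 500
u = rng.uniform(-1, 1, size=T)  # scalar input stream

# Each step, the held input is multiplexed across the N virtual nodes;
# np.roll couples adjacent virtual nodes, so the single delayed feedback
# loop acts as a delay line threading through all of them.
X = np.zeros((T, N))
for t in range(1, T):
    X[t] = np.tanh(eta * np.roll(X[t - 1], 1) + gamma * mask * u[t])

# Linear readout trained by least squares to recall the input one step
# back -- a single term of the reservoir's memory capacity
target = u[:-1]
coef, *_ = np.linalg.lstsq(X[1:], target, rcond=None)
pred = X[1:] @ coef
nrmse = np.sqrt(np.mean((pred - target) ** 2)) / np.std(target)
print(round(nrmse, 3))
```

In the paper, plasticity would adapt the (here uniform) spacing of the virtual nodes along the delay depending on how responsive each node is to the input; this sketch only shows the fixed-spacing baseline that such a mechanism would improve on.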
    • "Several sources have argued that competition for limited structural resources could drive heterosynaptic adjustments in synaptic weights (Fonseca et al., 2004; Miller, 1996) and parallel changes in synaptic morphology (Ramiro-Cortés et al., 2014)."
    ABSTRACT: Plasticity of neuronal structure, such as the growth and retraction of individual dendritic spines, is thought to support experience-dependent neural circuit remodeling (Bosch and Hayashi, 2012; Holtmaat and Svoboda, 2009). Indeed, as neural circuits are modified during learning, their optimization and fine-tuning involves the weakening and loss of superfluous synaptic connections. Manipulations leading to experience-dependent plasticity of neuronal circuits also increase the rate of spine shrinkage and elimination (Holtmaat et al., 2006; Tschida and Mooney, 2012; Xu et al., 2009; Yang et al., 2009). Yet it remains unclear how neural activity drives the selective shrinkage and loss of individual dendritic spines in response to sensory experience.
    Cell Reports 12/2014; 305(2). DOI:10.1016/j.celrep.2014.12.016 · 8.36 Impact Factor

