Synaptic Economics: Competition and Cooperation in Correlation-Based Synaptic Plasticity.

Department of Physiology, W. M. Keck Center for Integrative Neuroscience, and Sloan Center for Theoretical Neurobiology, University of California, San Francisco, California 94143-0444, USA.
Neuron 10/1996; 17(3):371-374. DOI: 10.1016/S0896-6273(00)80169-5
Source: PubMed
    • ", 2001 ) . Finally dictating that no plasticity occurs when both pre and post - synaptic firing rates are below their mean removes the un - biological potentiation of synapses when both neurons have negative deviations , and hence a positive covariance ( Miller , 1996 ) . It is noteworthy that the geometry of synaptic change in Hebbian covariance plasticity is similar to the correlation based learning rule for sequentially active binary units used for the analytic calculations ( compare Figure 2B and Figure 3B ) . "
    ABSTRACT: The majority of distinct sensory and motor events occur as temporally ordered sequences with rich probabilistic structure. Sequences can be characterized by the probability of transitioning from the current state to upcoming states (forward probability), as well as the probability of having transitioned to the current state from previous states (backward probability). Despite the prevalence of probabilistic sequencing of both sensory and motor events, the Hebbian mechanisms that mold synapses to reflect the statistics of experienced probabilistic sequences are not well understood. Here, we show through analytic calculations and numerical simulations that Hebbian plasticity (correlation, covariance, and STDP) with pre-synaptic competition can develop synaptic weights equal to the conditional forward transition probabilities present in the input sequence. In contrast, post-synaptic competition can develop synaptic weights proportional to the conditional backward probabilities of the same input sequence. We demonstrate that to stably reflect the conditional probability of a neuron's inputs and outputs, local Hebbian plasticity requires balance between competitive learning forces that promote synaptic differentiation and homogenizing learning forces that promote synaptic stabilization. The balance between these forces dictates a prior over the distribution of learned synaptic weights, strongly influencing both the rate at which structure emerges and the entropy of the final distribution of synaptic weights. Together, these results demonstrate a simple correspondence between the biophysical organization of neurons, the site of synaptic competition, and the temporal flow of information encoded in synaptic weights by Hebbian plasticity while highlighting the utility of balancing learning forces to accurately encode probability distributions, and prior expectations over such probability distributions.
    Frontiers in Computational Neuroscience 08/2015; 9:92. DOI: 10.3389/fncom.2015.00092
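    The central claim of this abstract, that the site of synaptic competition selects between forward and backward conditional probabilities, can be illustrated with a toy simulation. The following is a minimal sketch under assumed settings (one binary unit per state, competition implemented as divisive weight normalization, an arbitrary learning rate), not the authors' implementation:

    ```python
    # Hebbian potentiation on a sampled Markov sequence, with competition
    # implemented as weight normalization. Normalizing each presynaptic
    # unit's outgoing weights (presynaptic competition) should recover the
    # forward transition probabilities P(next | current).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 4

    # Ground-truth Markov chain generating the input sequence.
    P = rng.random((n, n))
    P /= P.sum(axis=1, keepdims=True)

    seq = [0]
    for _ in range(50_000):
        seq.append(rng.choice(n, p=P[seq[-1]]))

    w = np.full((n, n), 1.0 / n)   # synaptic weight from unit i to unit j
    eta = 0.01                     # learning rate (assumed)
    for pre, post in zip(seq[:-1], seq[1:]):
        w[pre, post] += eta        # Hebbian: pre active at t, post active at t+1
        w[pre] /= w[pre].sum()     # presynaptic competition: outgoing weights sum to 1

    print(np.round(w - P, 2))      # deviations from forward probabilities should be small
    ```

    Normalizing rows (each presynaptic unit's outgoing weights) implements presynaptic competition; switching the normalization to columns (each postsynaptic unit's incoming weights) would instead drive the weights toward the backward probabilities, as the abstract describes.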
    • ", 2014 ) . However , Hebbian - type learning rules introduce only a weak , if any , degree of competition ( Miller , 1996 ) , restricted to the synapses receiving distinct input patterns ( Zhang et al . , 2011 ; see below for further discussion ) . "
    ABSTRACT: Homosynaptic Hebbian-type plasticity provides a cellular mechanism of learning and refinement of connectivity during development in a variety of biological systems. In this review we argue that a complementary form of plasticity, heterosynaptic plasticity, represents a necessary cellular component for homeostatic regulation of synaptic weights and neuronal activity. The required properties of a homeostatic mechanism which acutely constrains the runaway dynamics imposed by Hebbian associative plasticity have been well articulated by theoretical and modeling studies. Such mechanism(s) should robustly support the stability of operation of neuronal networks and synaptic competition, include changes at non-active synapses, and operate on a similar time scale to Hebbian-type plasticity. The experimentally observed properties of heterosynaptic plasticity make it a strong candidate to fulfill this homeostatic role. Subsequent modeling studies which incorporate heterosynaptic plasticity into model neurons with Hebbian synapses (utilizing an STDP learning rule) have confirmed its ability to robustly provide stability and competition. In contrast, the properties of homeostatic synaptic scaling, which is triggered by extreme and long-lasting (hours and days) changes of neuronal activity, do not fit two crucial requirements for a hypothetical homeostatic mechanism needed to provide stability of operation in the face of ongoing synaptic changes driven by Hebbian-type learning rules: both the trigger and the time scale of homeostatic synaptic scaling are fundamentally different from those of Hebbian-type plasticity. We conclude that heterosynaptic plasticity, which is triggered by the same episodes of strong postsynaptic activity and operates on the same time scale as Hebbian-type associative plasticity, is ideally suited to serve a homeostatic role during ongoing synaptic plasticity.
    Frontiers in Computational Neuroscience 07/2015; 9:89. DOI: 10.3389/fncom.2015.00089
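    The stabilization argument, that heterosynaptic plasticity triggered by the same episodes of strong postsynaptic activity can contain Hebbian runaway dynamics, can be sketched in a toy rate model. Everything below (linear unit, trigger threshold, weight set point) is an illustrative assumption rather than any of the models cited in the review:

    ```python
    # A pure Hebbian rule produces runaway weight growth, while an added
    # heterosynaptic term, triggered by strong postsynaptic activity and
    # acting on all synapses, pulls weights toward a set point and bounds
    # the total. All parameters are assumed for illustration.
    import numpy as np

    rng = np.random.default_rng(1)
    n_syn, eta, steps = 20, 0.01, 200
    theta, w_set = 1.0, 0.05           # trigger threshold and assumed weight set point

    w_hebb = rng.random(n_syn) * 0.1   # Hebbian-only synapses
    w_het = w_hebb.copy()              # Hebbian + heterosynaptic synapses

    for _ in range(steps):
        x = rng.random(n_syn)                   # presynaptic rates
        for w, hetero in ((w_hebb, False), (w_het, True)):
            y = w @ x                           # postsynaptic rate (linear unit)
            w += eta * y * x                    # homosynaptic Hebbian update
            if hetero and y > theta:            # strong postsynaptic activity
                w += eta * y * (w_set - w)      # weight-dependent change at ALL synapses

    print(f"Hebbian only: {w_hebb.sum():.1f}   with heterosynaptic: {w_het.sum():.1f}")
    ```

    With the Hebbian term alone, the positive feedback loop (larger weights, larger postsynaptic rate, larger potentiation) drives the summed weight toward runaway growth; the weight-dependent heterosynaptic term acts on active and non-active synapses alike and bounds it.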
    • "While the role of homeostasis in information processing and computation has only been discussed more recently, its function as a stabilization process of neural dynamics has acquired earlier attention (von der Malsburg, 1973; Bienenstock, Cooper, & Munro, 1982). From the perspective of the nervous system, pure Hebbian potentiation or anti-Hebbian depression would lead to destabilization of synaptic efficacies by generating amplifying feedback loops (Miller, 1996; Song, Miller, & Abbott, 2000), necessitating a homeostatic mechanism for stabilization (Davis & Goodman, 1998; Zhang & Linden, 2003; Turrigiano & Nelson, 2004). Similarly, as suggested by the effects of the plasticity mechanism (see equation 5.6) on the virtual network topology (see section 5.2), the facilitating sensitivity term −αςθ is counteracted by the depressive entropy term +αςρ, which prevents synaptic efficacies from overpotentiating or collapsing. "
    ABSTRACT: Supplementing a differential equation with delays results in an infinite-dimensional dynamical system. This property provides the basis for a reservoir computing architecture, where the recurrent neural network is replaced by a single nonlinear node, delay-coupled to itself. Instead of the spatial topology of a network, subunits in the delay-coupled reservoir are multiplexed in time along one delay span of the system. The computational power of the reservoir is contingent on this temporal multiplexing. Here, we learn optimal temporal multiplexing by means of a biologically inspired homeostatic plasticity mechanism. Plasticity acts locally and changes the distances between the subunits along the delay, depending on how responsive these subunits are to the input. After analytically deriving the learning mechanism, we illustrate its role in improving the reservoir's computational power. To this end, we investigate, first, the increase of the reservoir's memory capacity. Second, we predict a NARMA-10 time series, showing that plasticity reduces the normalized root-mean-square error by more than 20%. Third, we discuss plasticity's influence on the reservoir's input-information capacity, the coupling strength between subunits, and the distribution of the readout coefficients.
    Neural Computation 03/2015; DOI: 10.1162/NECO_a_00737
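    The architecture described here, a single nonlinear node whose virtual subunits are multiplexed in time along one delay span, can be sketched compactly. The code below is an assumed illustration (mask values, coupling strength, and a plain ridge readout are placeholders, and the paper's homeostatic plasticity rule for the subunit spacings is not reproduced); it only shows temporal multiplexing on the NARMA-10 task mentioned in the abstract:

    ```python
    # Minimal delay-coupled reservoir sketch: N virtual subunits along one
    # delay span of a single tanh node, with a linear readout trained by
    # ridge regression to predict the NARMA-10 series.
    import numpy as np

    rng = np.random.default_rng(3)
    T, N = 4000, 50                    # time steps and virtual subunits per delay span

    # NARMA-10 task: input u drives the 10th-order target series y.
    u = rng.uniform(0.0, 0.5, size=T)
    y = np.zeros(T)
    for t in range(9, T - 1):
        y[t+1] = 0.3*y[t] + 0.05*y[t]*y[t-9:t+1].sum() + 1.5*u[t-9]*u[t] + 0.1

    mask = rng.choice([-0.1, 0.1], size=N)   # fixed input mask along the delay span
    state = np.zeros(N)                      # virtual-subunit states on the delay line
    X = np.zeros((T, N))
    for t in range(T):
        for i in range(N):                   # sequential update = temporal multiplexing
            prev = state[i-1] if i > 0 else state[N-1]   # coupling through the delay
            state[i] = np.tanh(0.9*prev + mask[i]*u[t])
        X[t] = state

    # Ridge-regression readout trained on the first half, tested on the second.
    tr, te = slice(10, T//2), slice(T//2, T)
    W = np.linalg.solve(X[tr].T @ X[tr] + 1e-6*np.eye(N), X[tr].T @ y[tr])
    nrmse = np.sqrt(np.mean((X[te] @ W - y[te])**2) / np.var(y[te]))
    print(f"NARMA-10 test NRMSE: {nrmse:.2f}")
    ```

    Each virtual subunit is updated from its neighbor along the delay line, and the first subunit wraps around to the last state of the previous span; this wrap-around is what gives the single node its memory of past inputs.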