Article

Synaptic Economics: Competition and Cooperation in Correlation-Based Synaptic Plasticity.

Department of Physiology, W. M. Keck Center for Integrative Neuroscience, and Sloan Center for Theoretical Neurobiology, University of California, San Francisco, California 94143-0444, USA.
Neuron (Impact Factor: 15.05). 10/1996; 17(3):371-4. DOI: 10.1016/S0896-6273(00)80169-5
Source: PubMed

  • Source
    • "The entrainment between synapses often force all to grow boundlessly or to a maximal set value; in other cases, they may all become silent. To circumvent these issues of traditional rate-based Hebbian learning, weight normalization can be introduced to prevent the runaway of synapses (Oja, 1982; Miller, 1996). In the context of spiking activity, STDP has been termed " temporally Hebbian " when it promotes synchronous firing. "
    ABSTRACT: Part of hippocampal and cortical plasticity is characterized by synaptic modifications that depend on the joint activity of the pre- and post-synaptic neurons. To what extent those changes are determined by exact spike timing versus average firing rates is still a matter of debate; this may vary from brain area to brain area, as well as across neuron types. However, it has been robustly observed both in vitro and in vivo that plasticity itself slowly adapts as a function of the dynamical context, a phenomenon commonly referred to as metaplasticity. An alternative concept considers the regulation of groups of synapses with an objective at the neuronal level, for example, maintaining a given average firing rate. In that case, the change in the strength of a particular synapse of the group (e.g., due to Hebbian learning) affects the strengths of the others, a phenomenon coined heterosynaptic plasticity. Classically, Hebbian synaptic plasticity is paired in neural network models with such mechanisms in order to stabilize the activity and/or the weight structure. Here, we present an oriented review that brings together various concepts, from heterosynaptic plasticity to metaplasticity, and show how they interact with Hebbian-type learning. We focus on approaches that are nowadays used to incorporate those mechanisms into state-of-the-art models of spiking plasticity inspired by experimental observations in the hippocampus and cortex. Making the point that metaplasticity is a ubiquitous mechanism acting on top of classical Hebbian learning and promoting the stability of neural function over multiple timescales, we stress the need to incorporate it as a key element in the framework of plasticity models. Bridging theoretical and experimental results suggests a more functional role for metaplasticity mechanisms than simply stabilizing neural activity.
    Full-text · Article · Nov 2015 · Frontiers in Computational Neuroscience
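The runaway described in the excerpt above is easy to reproduce numerically. Below is a minimal sketch, not taken from the cited paper, contrasting a plain rate-based Hebbian update, whose weight norm grows without bound under positively correlated inputs, with Oja's (1982) rule, whose implicit normalization keeps the norm bounded near one. The correlation structure, learning rate, and step count are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean pre-synaptic rate deviations with uniform positive correlation.
n_inputs, n_steps = 50, 5000
C = 0.2 * np.ones((n_inputs, n_inputs)) + 0.8 * np.eye(n_inputs)
x = rng.multivariate_normal(np.zeros(n_inputs), C, size=n_steps)

eta = 1e-3
w_hebb = rng.normal(0.0, 0.1, n_inputs)  # plain Hebbian weights
w_oja = w_hebb.copy()                    # weights under Oja's (1982) rule

for t in range(n_steps):
    y_hebb = w_hebb @ x[t]
    y_oja = w_oja @ x[t]
    w_hebb += eta * y_hebb * x[t]                  # unconstrained: runaway
    w_oja += eta * y_oja * (x[t] - y_oja * w_oja)  # implicit normalization

print("plain Hebb |w| =", np.linalg.norm(w_hebb))  # grows enormously
print("Oja rule   |w| =", np.linalg.norm(w_oja))   # settles near 1
```

Oja's rule is only the simplest multiplicative example; Miller (1996) discusses how different constraint schemes (e.g., subtractive versus multiplicative normalization) shape the resulting synaptic competition.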
  • Source
    • ", 2001 ) . Finally dictating that no plasticity occurs when both pre and post - synaptic firing rates are below their mean removes the un - biological potentiation of synapses when both neurons have negative deviations , and hence a positive covariance ( Miller , 1996 ) . It is noteworthy that the geometry of synaptic change in Hebbian covariance plasticity is similar to the correlation based learning rule for sequentially active binary units used for the analytic calculations ( compare Figure 2B and Figure 3B ) . "
    ABSTRACT: The majority of distinct sensory and motor events occur as temporally ordered sequences with rich probabilistic structure. Sequences can be characterized by the probability of transitioning from the current state to upcoming states (forward probability), as well as the probability of having transitioned to the current state from previous states (backward probability). Despite the prevalence of probabilistic sequencing of both sensory and motor events, the Hebbian mechanisms that mold synapses to reflect the statistics of experienced probabilistic sequences are not well understood. Here, we show through analytic calculations and numerical simulations that Hebbian plasticity (correlation, covariance, and STDP) with pre-synaptic competition can develop synaptic weights equal to the conditional forward transition probabilities present in the input sequence. In contrast, post-synaptic competition can develop synaptic weights proportional to the conditional backward probabilities of the same input sequence. We demonstrate that to stably reflect the conditional probability of a neuron's inputs and outputs, local Hebbian plasticity requires balance between competitive learning forces that promote synaptic differentiation and homogenizing learning forces that promote synaptic stabilization. The balance between these forces dictates a prior over the distribution of learned synaptic weights, strongly influencing both the rate at which structure emerges and the entropy of the final distribution of synaptic weights. Together, these results demonstrate a simple correspondence between the biophysical organization of neurons, the site of synaptic competition, and the temporal flow of information encoded in synaptic weights by Hebbian plasticity, while highlighting the utility of balancing learning forces to accurately encode both probability distributions and prior expectations over such distributions.
    Full-text · Article · Aug 2015 · Frontiers in Computational Neuroscience
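The gating described in the excerpt above, where plasticity is disabled when both pre- and post-synaptic rates sit below their means, amounts to a one-line modification of the standard covariance rule. A minimal sketch, with illustrative names and values not taken from the cited paper:

```python
def covariance_update(w, x, y, x_mean, y_mean, eta=1e-3, gated=True):
    """One covariance-rule update for a single pre/post rate pair.

    Standard rule: dw = eta * (x - x_mean) * (y - y_mean). Note that it
    potentiates when BOTH deviations are negative (their product, and
    hence the covariance contribution, is positive). With gated=True
    that branch is suppressed, as described in the excerpt above
    (cf. Miller, 1996).
    """
    dx, dy = x - x_mean, y - y_mean
    if gated and dx < 0 and dy < 0:
        return w  # both rates below their means: no change
    return w + eta * dx * dy

# Both rates below their means: the standard rule potentiates,
# the gated rule leaves the weight untouched.
print(covariance_update(0.5, x=2.0, y=3.0, x_mean=5.0, y_mean=5.0, gated=False))  # 0.506
print(covariance_update(0.5, x=2.0, y=3.0, x_mean=5.0, y_mean=5.0, gated=True))   # 0.5
```

With gated=False the rule changes the weight whenever the two deviations have the same sign; the gate removes only the branch where both are negative.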
  • Source
    • "While the role of homeostasis in information processing and computation has only been discussed more recently, its function as a stabilization process of neural dynamics has acquired earlier attention (von der Malsburg, 1973; Bienenstock, Cooper, & Munro, 1982). From the perspective of the nervous system, pure Hebbian potentiation or anti-Hebbian depression would lead to destabilization of synaptic efficacies by generating amplifying feedback loops (Miller, 1996; Song, Miller, & Abbott, 2000), necessitating a homeostatic mechanism for stabilization (Davis & Goodman, 1998; Zhang & Linden, 2003; Turrigiano & Nelson, 2004). Similarly, as suggested by the effects of the plasticity mechanism (see equation 5.6) on the virtual network topology (see section 5.2), the facilitating sensitivity term −αςθ is counteracted by the depressive entropy term +αςρ, which prevents synaptic efficacies from overpotentiating or collapsing. "
    ABSTRACT: Supplementing a differential equation with delays results in an infinite-dimensional dynamical system. This property provides the basis for a reservoir computing architecture, where the recurrent neural network is replaced by a single nonlinear node, delay-coupled to itself. Instead of the spatial topology of a network, subunits in the delay-coupled reservoir are multiplexed in time along one delay span of the system. The computational power of the reservoir is contingent on this temporal multiplexing. Here, we learn optimal temporal multiplexing by means of a biologically inspired homeostatic plasticity mechanism. Plasticity acts locally and changes the distances between the subunits along the delay, depending on how responsive these subunits are to the input. After analytically deriving the learning mechanism, we illustrate its role in improving the reservoir's computational power. To this end, we investigate, first, the increase in the reservoir's memory capacity. Second, we predict a NARMA-10 time series, showing that plasticity reduces the normalized root-mean-square error by more than 20%. Third, we discuss plasticity's influence on the reservoir's input-information capacity, the coupling strength between subunits, and the distribution of the readout coefficients. (A minimal generator for the NARMA-10 benchmark is sketched after this entry.)
    Full-text · Article · Mar 2015 · Neural Computation
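The NARMA-10 task mentioned in this abstract is a standard reservoir-computing benchmark defined by a fixed tenth-order recurrence. Below is a minimal generator together with the normalized root-mean-square error used to score predictions; the coefficients follow the common formulation of the benchmark and may differ in detail from the cited paper's exact setup.

```python
import numpy as np

def narma10(n_steps, seed=0):
    """Generate the standard tenth-order NARMA benchmark series.

    y[t+1] = 0.3*y[t] + 0.05*y[t]*sum(y[t-9..t]) + 1.5*u[t-9]*u[t] + 0.1,
    with inputs u drawn i.i.d. from Uniform[0, 0.5].
    """
    rng = np.random.default_rng(seed)
    u = rng.uniform(0.0, 0.5, n_steps)
    y = np.zeros(n_steps)
    for t in range(9, n_steps - 1):
        y[t + 1] = (0.3 * y[t]
                    + 0.05 * y[t] * y[t - 9 : t + 1].sum()
                    + 1.5 * u[t - 9] * u[t]
                    + 0.1)
    return u, y

def nrmse(y_true, y_pred):
    """Normalized root-mean-square error, the metric quoted in the abstract."""
    return np.sqrt(np.mean((y_pred - y_true) ** 2)) / np.std(y_true)

u, y = narma10(2000)  # input/target pair for training a reservoir readout
```

Any readout trained to predict y[t+1] from the input history can be scored with nrmse; the abstract reports that homeostatic plasticity in the delay-coupled reservoir reduces this error by more than 20%.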